ConversationBufferMemory


Overview

This tutorial introduces ConversationBufferMemory, a memory class that stores conversation history in a buffer.

Typically, no additional processing of the stored messages is required. Additional handling may become necessary, however, once the conversation history exceeds the model's context window.

In this tutorial, we will learn how to use ConversationBufferMemory to store and retrieve conversation history.

Table of Contents

  • Overview

  • Environment Setup

  • Extracting messages as strings

  • Extracting messages as HumanMessage and AIMessage objects

  • Applying to a Chain


Environment Setup

Set up the environment. You may refer to Environment Setup for more details.

[Note]

  • langchain-opentutorial is a package that provides easy-to-use environment setup, along with useful functions and utilities for these tutorials.

  • You can check out the langchain-opentutorial repository for more details.

Alternatively, you can set OPENAI_API_KEY in a .env file and load it.

[Note]

  • This is not necessary if you've already set OPENAI_API_KEY in previous steps.

Extracting messages as strings

After storing conversation messages, this memory allows you to extract messages into a variable.

You can use the save_context(inputs, outputs) method to save conversation records.

  • This method accepts two arguments, inputs and outputs.

  • inputs stores the user's question, and outputs stores the AI's answer.

  • The conversation record is stored internally under the history key.

  • You can then use the load_memory_variables method to retrieve and inspect the saved conversation history.

The load_memory_variables({}) method of the memory object returns the complete message history as a string.

Extracting messages as HumanMessage and AIMessage objects

Setting return_messages=True makes load_memory_variables return a list of HumanMessage and AIMessage objects instead of a single formatted string.

Applying to a Chain

Let's apply ConversationBufferMemory to a ConversationChain.

Proceed with the conversation using the ConversationChain.

Verify that the model remembers the previous conversation.
