ConversationTokenBufferMemory
Author: Kenny Jung
Design: Kenny Jung
Peer Review: Wooseok Jeong, JeongGi Park
Proofread: Juni Lee
This is part of the LangChain Open Tutorial
Overview
ConversationTokenBufferMemory stores recent conversation history in a buffer and decides when to flush older content based on the total token length of the buffer, rather than the number of conversation turns.
Key parameters:
- max_token_limit: sets the maximum token length for storing conversation content
- return_messages: when True, returns the messages in chat format; when False, returns a string
- human_prefix: prefix added before human messages (default: "Human")
- ai_prefix: prefix added before AI messages (default: "AI")
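To make the flushing behavior concrete, here is a minimal, self-contained sketch of the mechanism (illustrative only, not LangChain code): messages accumulate in a buffer, and once the total token count exceeds the limit, the oldest messages are dropped first. The word-count "tokenizer" is a stand-in assumption; real implementations count tokens with the model's tokenizer.

```python
from collections import deque

def count_tokens(text: str) -> int:
    # Stand-in tokenizer: counts whitespace-separated words.
    # A real implementation would use the LLM's own tokenizer.
    return len(text.split())

class TokenBuffer:
    """Illustrative token-bounded message buffer (not LangChain's class)."""

    def __init__(self, max_token_limit: int):
        self.max_token_limit = max_token_limit
        self.messages = deque()

    def add(self, message: str) -> None:
        self.messages.append(message)
        # Flush from the oldest end until the buffer is back under the limit.
        while sum(count_tokens(m) for m in self.messages) > self.max_token_limit:
            self.messages.popleft()

buf = TokenBuffer(max_token_limit=8)
buf.add("Human: hello there")        # 3 tokens, fits
buf.add("AI: hi how can I help")     # total would be 9 > 8, oldest is flushed
```

After the second `add`, only the AI message remains in the buffer, since keeping both would exceed the 8-token limit.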
Environment Setup
Set up the environment. You may refer to Environment Setup for more details.
[Note]
langchain-opentutorial is a package that provides a set of easy-to-use environment setup helpers, useful functions, and utilities for tutorials. You can check out the langchain-opentutorial package for more details.
You can alternatively set OPENAI_API_KEY in a .env file and load it.
[Note] This is not necessary if you've already set OPENAI_API_KEY in previous steps.
Limiting Maximum Token Length to 50
This section demonstrates how to limit the conversation memory to 50 tokens.
Setting Maximum Token Length to 150
Let's check how the conversation is stored when we set the maximum token length to 150.