RunnableWithMessageHistory
Author: Secludor
Peer Review:
Proofread : Chaeyoon Kim
This is a part of LangChain Open Tutorial
Overview
RunnableWithMessageHistory is a wrapper in LangChain Expression Language (LCEL) for managing conversation history in chatbots, virtual assistants, and other conversational AI applications. It integrates seamlessly with existing LangChain components to handle message history management and updates automatically.
Key Features
Message History Management
Maintains conversation context across multiple interactions.
Automatically tracks and stores chat messages.
Enables contextual responses based on previous conversations.
Flexible Input/Output Support
Handles both message objects and Python dictionaries.
Supports various input formats, including:
Single messages
Message sequences
Dictionary inputs with custom keys
Provides consistent output handling regardless of input format.
Session Management
Manages conversations through unique identifiers, such as:
Simple session IDs
Combined user and conversation IDs
Maintains separate conversation threads for different users or contexts.
Ensures conversation continuity within the same session.
Storage Options
Offers in-memory storage for development and testing.
Supports persistent storage (e.g., Redis, files) for production environments.
Provides easy integration with various storage backends.
Advantages Over Legacy Approaches
More flexible than the older ConversationChain.
Offers better state management.
Provides improved integration with modern LangChain components.
Summary
RunnableWithMessageHistory is the recommended standard for conversation management in LangChain, offering:
Simplified conversation state management.
An enhanced user experience through context preservation.
Flexible configuration options for diverse use cases.
Table of Contents
References
Environment Setup
Setting up your environment is the first step. See the Environment Setup guide for more details.
[Note]
The langchain-opentutorial is a bundle of easy-to-use environment setup guidance, useful functions, and utilities for tutorials. Check out langchain-opentutorial for more details.
Alternatively, you can set and load OPENAI_API_KEY from a .env file.
[Note] This is only necessary if you haven't already set OPENAI_API_KEY in previous steps.
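A minimal sketch of loading the key from a .env file, assuming the python-dotenv package is installed:

```python
import os

from dotenv import load_dotenv

# Read OPENAI_API_KEY (and any other variables) from a local .env file into the environment.
load_dotenv()

assert os.environ.get("OPENAI_API_KEY"), "OPENAI_API_KEY is not set"
```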
Getting Started with RunnableWithMessageHistory
Managing conversation history is crucial for conversational applications and complex data processing tasks. RunnableWithMessageHistory simplifies the message history implementation. To use it effectively, you need these two key components:
Runnable objects
Runnable objects, such as a retriever or chain, are the primary components that interact with BaseChatMessageHistory.
Message History Manager (callable)
This is a callable that returns an instance of BaseChatMessageHistory. It handles message storage, retrieval, and updates, and maintains conversation context for contextual responses.
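For illustration, a minimal in-memory version of this callable could look like the sketch below; the store dictionary and function name are assumptions made for this tutorial, not fixed API names.

```python
from langchain_community.chat_message_histories import ChatMessageHistory
from langchain_core.chat_history import BaseChatMessageHistory

# Hypothetical in-memory store: maps a session ID to its message history.
store = {}


def get_session_history(session_id: str) -> BaseChatMessageHistory:
    """Return the message history for a session, creating it on first use."""
    if session_id not in store:
        store[session_id] = ChatMessageHistory()
    return store[session_id]
```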
Implementation Options
LangChain offers several implementations for managing message history. You can explore various memory integrations for persistent storage, as documented on LangChain's message histories (memory integrations) page.
This tutorial covers two primary implementation approaches:
In-Memory (ChatMessageHistory)
Manages message history in memory, making it ideal for development and simple applications.
Provides fast access speeds.
Message history is lost on application restart.
Persistent Storage with RedisChatMessageHistory
Enables permanent message storage using Remote Dictionary Server (Redis), a high-performance, open-source, in-memory data structure store.
Suitable for distributed environments.
Ideal for complex applications and long-running services.
Consider these factors when selecting a message history management approach:
Application requirements
Expected traffic volume
Message data importance
Retention period requirements
While in-memory implementation offers simplicity and speed, persistent storage solutions like Redis are more appropriate when data durability is a concern.
Understanding In-Memory Conversation History
In-memory conversation history provides a simple and fast way to manage chat message history during development and testing. This approach stores conversation data in memory, offering quick access but without persistence across application restarts.
Core Configuration Parameters
Required Components
runnable: The chain or model (e.g., ChatOpenAI) to execute.
get_session_history: A function returning a BaseChatMessageHistory instance.
input_messages_key: Specifies the key for user input in invoke() calls.
history_messages_key: Defines the key for accessing conversation history.
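A minimal sketch wiring these parameters together; the prompt text, model name, and key names are illustrative assumptions.

```python
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_openai import ChatOpenAI

# Prompt with a placeholder where past messages will be injected.
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant."),
        MessagesPlaceholder(variable_name="history"),
        ("human", "{input}"),
    ]
)
runnable = prompt | ChatOpenAI(model="gpt-4o-mini")

chain_with_history = RunnableWithMessageHistory(
    runnable,
    get_session_history,             # the in-memory callable defined earlier
    input_messages_key="input",      # key holding the new user message
    history_messages_key="history",  # key the prompt uses for prior messages
)
```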
Default Session Implementation
RunnableWithMessageHistory uses session_id as its default identifier for managing conversation threads, as shown in its core implementation:
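The defaults behave roughly like the paraphrased sketch below (not the library's exact source): a single session_id field is exposed as a configurable value.

```python
from langchain_core.runnables import ConfigurableFieldSpec

# Approximate shape of the default history_factory_config used when none is supplied.
default_history_factory_config = [
    ConfigurableFieldSpec(
        id="session_id",
        annotation=str,
        name="Session ID",
        description="Unique identifier for a session.",
        default="",
        is_shared=True,
    ),
]
```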
Using Session Management
To utilize session management, specify a session ID in your invoke call:
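For example (the session ID and message text here are assumptions):

```python
# First call under the hypothetical session ID "abc123".
chain_with_history.invoke(
    {"input": "Hi, my name is Teddy."},
    config={"configurable": {"session_id": "abc123"}},
)
```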
Using the same session_id continues the conversation by retrieving the previous thread's content (this continuous conversation is called a session):
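Continuing the same hypothetical session:

```python
# Second call with the same session ID: the earlier turn is available as context,
# so the model can answer from the stored history.
chain_with_history.invoke(
    {"input": "What is my name?"},
    config={"configurable": {"session_id": "abc123"}},
)
```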
However, using a different session_id will result in an inaccurate response because there is no corresponding history.
For example, if session_id is def234 and no history exists for that ID, you'll see an irrelevant response (see the following code snippet).
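For instance (reusing the question from the earlier hypothetical session):

```python
# Asking the same question under session ID "def234", which has no stored history,
# so the model cannot recall the name given earlier.
chain_with_history.invoke(
    {"input": "What is my name?"},
    config={"configurable": {"session_id": "def234"}},
)
```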
You can customize the configuration parameters for tracking message history by passing a list of ConfigurableFieldSpec objects through the history_factory_config parameter.
Setting a new history_factory_config overrides the existing session_id configuration.
The following example demonstrates using two parameters: user_id and conversation_id.
Let's try a custom configuration.
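The sketch below follows that pattern; the store layout, prompt reuse, and example IDs are assumptions made for illustration.

```python
from langchain_community.chat_message_histories import ChatMessageHistory
from langchain_core.chat_history import BaseChatMessageHistory
from langchain_core.runnables import ConfigurableFieldSpec
from langchain_core.runnables.history import RunnableWithMessageHistory

# Hypothetical store keyed by (user_id, conversation_id) instead of a single session_id.
store = {}


def get_session_history(user_id: str, conversation_id: str) -> BaseChatMessageHistory:
    key = (user_id, conversation_id)
    if key not in store:
        store[key] = ChatMessageHistory()
    return store[key]


chain_with_history = RunnableWithMessageHistory(
    runnable,                        # the prompt | model chain defined earlier
    get_session_history,
    input_messages_key="input",
    history_messages_key="history",
    history_factory_config=[
        ConfigurableFieldSpec(
            id="user_id",
            annotation=str,
            name="User ID",
            description="Unique identifier for the user.",
            default="",
            is_shared=True,
        ),
        ConfigurableFieldSpec(
            id="conversation_id",
            annotation=str,
            name="Conversation ID",
            description="Unique identifier for the conversation.",
            default="",
            is_shared=True,
        ),
    ],
)

# Both identifiers are now passed through the configurable section of the config.
chain_with_history.invoke(
    {"input": "Hi, my name is Teddy."},
    config={"configurable": {"user_id": "user_1", "conversation_id": "conv_1"}},
)
```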
Example of Runnables Using Different Keys
This example demonstrates how to handle input and output messages with RunnableWithMessageHistory.
Messages Input with Dictionary Output
Direct Message Object Handling
Omitting input_messages_key="input" configures the system to accept Message objects as input.
This configuration provides:
Direct handling of the input Message object.
Outputting data in a dictionary format.
Maintaining conversation history across sessions.
Continuing conversations seamlessly using session IDs.
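A hedged sketch of this configuration, where the model choice, output key wiring, and session ID are assumptions:

```python
from langchain_community.chat_message_histories import ChatMessageHistory
from langchain_core.messages import HumanMessage
from langchain_core.runnables import RunnableParallel
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_openai import ChatOpenAI

store = {}


def get_session_history(session_id: str):
    return store.setdefault(session_id, ChatMessageHistory())


# The chain returns a dictionary whose "output_message" key holds the model's reply.
chain = RunnableParallel({"output_message": ChatOpenAI(model="gpt-4o-mini")})

history_chain = RunnableWithMessageHistory(
    chain,
    get_session_history,
    output_messages_key="output_message",
    # No input_messages_key: the input itself is a Message (or a list of Messages).
)

history_chain.invoke(
    [HumanMessage(content="Hi, my name is Teddy.")],
    config={"configurable": {"session_id": "msg-dict-1"}},
)
```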
Message Objects for both Input and Output
Continuing from the previous example, you can also configure RunnableWithMessageHistory to handle Message objects directly for both input and output.
Direct Message Object Handling
Omitting output_messages_key="output_message" configures the system to return Message objects as output.
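A sketch of this setup, wrapping the chat model directly; the model choice and session ID are assumptions.

```python
from langchain_community.chat_message_histories import ChatMessageHistory
from langchain_core.messages import HumanMessage
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_openai import ChatOpenAI

store = {}


def get_session_history(session_id: str):
    return store.setdefault(session_id, ChatMessageHistory())


# No input or output keys: Messages go in, an AIMessage comes back out.
history_chain = RunnableWithMessageHistory(
    ChatOpenAI(model="gpt-4o-mini"),
    get_session_history,
)

history_chain.invoke(
    HumanMessage(content="Hi, my name is Teddy."),
    config={"configurable": {"session_id": "msg-msg-1"}},
)
```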
Dictionary with Single Key for All Messages
Using a Single Key for Input/Output
This approach uses one key for both input and output messages.
It utilizes itemgetter("input_messages") to extract input messages from the dictionary.
This configuration enables:
Direct handling of Message objects.
Simplified input/output processing.
Flexible conversion between different message formats.
Consistent session management.
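A sketch of the single-key pattern; the key name follows the text above, while the model and session ID are assumptions.

```python
from operator import itemgetter

from langchain_community.chat_message_histories import ChatMessageHistory
from langchain_core.messages import HumanMessage
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_openai import ChatOpenAI

store = {}


def get_session_history(session_id: str):
    return store.setdefault(session_id, ChatMessageHistory())


# itemgetter pulls the message list out of the input dictionary before it reaches the model.
chain = itemgetter("input_messages") | ChatOpenAI(model="gpt-4o-mini")

history_chain = RunnableWithMessageHistory(
    chain,
    get_session_history,
    input_messages_key="input_messages",  # the same key carries history and new input
)

history_chain.invoke(
    {"input_messages": [HumanMessage(content="Hi, my name is Teddy.")]},
    config={"configurable": {"session_id": "single-key-1"}},
)
```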
Understanding Persistent Storage
Persistent storage ensures that data is retained even after a program terminates or the system restarts. This is typically achieved using databases, file systems, or other non-volatile storage devices.
Persistent storage is essential for long-term data preservation in applications. It enables:
State preservation across sessions.
User preference retention.
Continuous operation without data loss.
Recovery from previous execution points.
Implementation Options
RunnableWithMessageHistory offers flexible storage options that are independent of how get_session_history retrieves the chat message history.
It supports the local file system (see an example here)
It integrates with various storage providers (see LangChain's message histories: memory integrations)
Using Redis for Persistence
This section demonstrates how to use Redis for persistent message history storage.
Installation
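For example (the package names assume the Redis client plus the langchain-community integration; adjust to your setup):

```python
%pip install -qU redis langchain-community
```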
Redis Server Setup
Launch a local Redis Stack server using Docker:
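For a local setup, the command looks like this (default ports assumed):

```python
# Start Redis Stack in the background; the port mappings match the options described below.
!docker run -d -p 6379:6379 -p 8001:8001 redis/redis-stack:latest
```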
Configuration options:
-d: Run in daemon mode (background).
-p {port}:6379: Redis server port mapping.
-p 8001:8001: RedisInsight UI port mapping.
redis/redis-stack:latest: Latest Redis Stack image.
Tips for Troubleshooting
Verify Docker is running.
Check port availability (terminate any processes using the port or use different ports).
Redis Connection
Set up the Redis connection URL:
"redis://localhost:{port}/0"
Implementing Redis Message History
To use Redis for message history, define a new callable that returns an instance of RedisChatMessageHistory:
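A sketch of that callable and the wrapped chain; the import path assumes the langchain-community integration, and the chain reuses the runnable and REDIS_URL defined earlier.

```python
from langchain_community.chat_message_histories import RedisChatMessageHistory
from langchain_core.chat_history import BaseChatMessageHistory
from langchain_core.runnables.history import RunnableWithMessageHistory


def get_message_history(session_id: str) -> BaseChatMessageHistory:
    # Each session ID maps to its own key in Redis, so histories survive restarts.
    return RedisChatMessageHistory(session_id, url=REDIS_URL)


chain_with_redis_history = RunnableWithMessageHistory(
    runnable,                        # the prompt | model chain defined earlier
    get_message_history,
    input_messages_key="input",
    history_messages_key="history",
)
```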
Testing Conversation Continuity
First Interaction
You can invoke the chain as before.
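For example, under the hypothetical session ID redis123:

```python
chain_with_redis_history.invoke(
    {"input": "Hi, my name is Teddy."},
    config={"configurable": {"session_id": "redis123"}},
)
```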
Continuing the Conversation
Make the second call using the same session_id.
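Continuing the same hypothetical session:

```python
# Same session ID: the history loaded from Redis supplies the context.
chain_with_redis_history.invoke(
    {"input": "What is my name?"},
    config={"configurable": {"session_id": "redis123"}},
)
```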
Testing with Different Session
We will ask the question using a different session_id this time.
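For example, with the session ID redis456 (reusing the earlier question):

```python
# No history exists under "redis456", so the model cannot recall the name.
chain_with_redis_history.invoke(
    {"input": "What is my name?"},
    config={"configurable": {"session_id": "redis456"}},
)
```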
[Note] The last response will be inaccurate because there's no conversation history associated with that session ID redis456.