Conversation-With-History
Author: Sunworl Kim
Design:
Peer Review: Yun Eun
Proofread: Yun Eun
This is a part of the LangChain Open Tutorial
Overview
This tutorial provides a comprehensive guide to implementing conversational AI systems with memory capabilities using LangChain, covering two main approaches.
1. Creating a chain to record conversations
- Creates a simple question-answering chatbot using ChatOpenAI.
- Implements a system to store and retrieve conversation history based on session IDs.
- Uses RunnableWithMessageHistory to incorporate chat history into the chain.
2. Creating a RAG chain that retrieves information from documents and records conversations
Builds a more complex system that combines document retrieval with conversational AI.
Processes a PDF document, creates embeddings, and sets up a vector store for efficient retrieval.
Implements a RAG chain that can answer questions based on the document content and previous conversation history.
Environment Setup
Set up the environment. You may refer to Environment Setup for more details.
[Note]
langchain-opentutorial is a package that provides easy-to-use environment setup, useful functions, and utilities for tutorials. You can check out langchain-opentutorial for more details.
You can alternatively set API keys such as OPENAI_API_KEY in a .env file and load them.
[Note] This is not necessary if you've already set the required API keys in previous steps.
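For reference, a minimal sketch of loading keys from a .env file, assuming python-dotenv is installed and the file sits in the working directory:

```python
# Minimal sketch: load API keys such as OPENAI_API_KEY from a .env file
# (assumes python-dotenv is installed; the file path is an assumption).
from dotenv import load_dotenv

load_dotenv(override=True)  # reads .env and populates environment variables
```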
Creating a Chain that remembers previous conversations
Background knowledge needed to understand this content: RunnableWithMessageHistory
1. Adding Chat History to the Core Chain
- Implement MessagesPlaceholder to incorporate conversation history
- Define a prompt template that handles user input queries
- Initialize a ChatOpenAI instance configured to use the ChatGPT model
- Construct a chain by connecting the prompt template, language model, and output parser
- Implement StrOutputParser to format the model's response as a string
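A minimal sketch of this core chain follows. The system prompt text, the chat_history and question variable names, and the gpt-4o model choice are illustrative assumptions, not fixed by the tutorial:

```python
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_openai import ChatOpenAI

# Prompt template with a placeholder for prior conversation turns
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful question-answering assistant."),
        MessagesPlaceholder(variable_name="chat_history"),  # injected history
        ("human", "{question}"),                            # current user query
    ]
)

llm = ChatOpenAI(model="gpt-4o")

# prompt -> model -> string output
chain = prompt | llm | StrOutputParser()
```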
Creating a Chain with Conversation History (chain_with_history)
- Initialize a dictionary to store conversation session records
- Create the get_session_history function, which retrieves chat history by session ID and creates a new ChatMessageHistory instance if none exists
- Instantiate a RunnableWithMessageHistory object to handle persistent conversation history
- Process the initial input.
- Handle the subsequent query.
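A sketch of the history wrapper, continuing from the chain built in the previous sketch (the session ID, question texts, and key names are assumptions for illustration):

```python
from langchain_community.chat_message_histories import ChatMessageHistory
from langchain_core.runnables.history import RunnableWithMessageHistory

# Session store keyed by session ID
store = {}

def get_session_history(session_id: str) -> ChatMessageHistory:
    # Create a new ChatMessageHistory the first time a session ID appears
    if session_id not in store:
        store[session_id] = ChatMessageHistory()
    return store[session_id]

chain_with_history = RunnableWithMessageHistory(
    chain,                                # core chain from the previous sketch
    get_session_history,
    input_messages_key="question",        # key holding the user's input
    history_messages_key="chat_history",  # key the MessagesPlaceholder fills
)

# Process the initial input
chain_with_history.invoke(
    {"question": "My name is Jack. Nice to meet you."},
    config={"configurable": {"session_id": "abc123"}},
)

# Handle a subsequent query in the same session
chain_with_history.invoke(
    {"question": "What is my name?"},
    config={"configurable": {"session_id": "abc123"}},
)
```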
2. Implementing RAG with Conversation History Management
Build a PDF-based Question Answering system that incorporates conversational context.
Create a standard RAG chain, making sure to include {chat_history} in the prompt template at step 6.
(step 1) Load PDF documents using PDFPlumberLoader
(step 2) Segment documents into manageable chunks with RecursiveCharacterTextSplitter
(step 3) Create vector embeddings of the text chunks using OpenAIEmbeddings
(step 4) Index and store the embeddings in a FAISS vector database
(step 5) Implement a retriever to query relevant information from the vector database
(step 6) Design a QA prompt template that incorporates conversation history, user queries, and retrieved context with response instructions
(step 7) Initialize a ChatOpenAI instance configured to use the GPT-4o model
(step 8) Build the complete chain by connecting the retriever, prompt template, and language model
The system retrieves relevant document context for user queries and generates contextually informed responses.
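A condensed sketch of steps 1 through 8 is shown below. The PDF path, chunk sizes, prompt wording, and the question / chat_history / context variable names are assumptions for illustration:

```python
from operator import itemgetter

from langchain_community.document_loaders import PDFPlumberLoader
from langchain_community.vectorstores import FAISS
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import PromptTemplate
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from langchain_text_splitters import RecursiveCharacterTextSplitter

# (step 1) Load the PDF document (hypothetical file path)
docs = PDFPlumberLoader("data/sample.pdf").load()

# (step 2) Split the document into manageable chunks
splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=50)
splits = splitter.split_documents(docs)

# (steps 3-4) Embed the chunks and index them in a FAISS vector store
vectorstore = FAISS.from_documents(splits, OpenAIEmbeddings())

# (step 5) Expose the vector store as a retriever
retriever = vectorstore.as_retriever()

# (step 6) QA prompt that includes chat history, the question, and retrieved context
prompt = PromptTemplate.from_template(
    """You are an assistant for question-answering tasks.
Use the retrieved context to answer the question. If you don't know, say so.

# Previous conversation:
{chat_history}

# Question:
{question}

# Context:
{context}

# Answer:"""
)

# (step 7) Language model
llm = ChatOpenAI(model="gpt-4o", temperature=0)

# (step 8) Assemble the RAG chain: route the question to the retriever,
# pass the question and chat history through, then prompt the model
rag_chain = (
    {
        "context": itemgetter("question") | retriever,
        "question": itemgetter("question"),
        "chat_history": itemgetter("chat_history"),
    }
    | prompt
    | llm
    | StrOutputParser()
)
```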
Implementing Conversation History Management
- Initialize the store dictionary to maintain conversation histories indexed by session IDs, and create the get_session_history function to retrieve or create session records
- Create a RunnableWithMessageHistory instance to enhance the RAG chain with conversation tracking capabilities, handling both user queries and historical context
- Process the first user input.
- Execute the subsequent question.
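A sketch of wrapping the RAG chain with history, reusing the get_session_history pattern from earlier (the session ID and question texts are illustrative):

```python
from langchain_community.chat_message_histories import ChatMessageHistory
from langchain_core.runnables.history import RunnableWithMessageHistory

store = {}  # conversation histories indexed by session ID

def get_session_history(session_id: str) -> ChatMessageHistory:
    # Retrieve an existing session record or create a new one
    if session_id not in store:
        store[session_id] = ChatMessageHistory()
    return store[session_id]

rag_with_history = RunnableWithMessageHistory(
    rag_chain,                            # RAG chain from the previous sketch
    get_session_history,
    input_messages_key="question",
    history_messages_key="chat_history",  # injected into the {chat_history} slot
)

# Process the first user input
rag_with_history.invoke(
    {"question": "What is the main topic of the document?"},
    config={"configurable": {"session_id": "rag123"}},
)

# Execute the subsequent question in the same session
rag_with_history.invoke(
    {"question": "Can you elaborate on your previous answer?"},
    config={"configurable": {"session_id": "rag123"}},
)
```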