ConversationEntityMemory


Overview

ConversationEntityMemory allows the conversation system to retain facts about specific entities mentioned during the dialogue.

It extracts information about entities from the conversation (using an LLM) and accumulates knowledge about those entities over time (also using an LLM).

Table of Contents

  • Overview

  • Environment Setup

  • Entity Memory Conversation Example

  • Retrieving Entity Memory


Environment Setup

Set up the environment. You may refer to Environment Setup for more details.

[Note]

  • langchain-opentutorial is a package that provides easy-to-use environment setup, along with useful functions and utilities for these tutorials.

  • Check out langchain-opentutorial for more details.
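A minimal setup sketch is shown below. It assumes the langchain-opentutorial package exposes a set_env helper (an assumption based on this tutorial series); if it does not, you can set the variables directly with os.environ.

```python
# %pip install -qU langchain-opentutorial langchain langchain-openai  # run in a notebook cell

# The set_env helper is assumed from the langchain-opentutorial package;
# fall back to os.environ["OPENAI_API_KEY"] = "..." if it is unavailable.
from langchain_opentutorial import set_env

set_env(
    {
        "OPENAI_API_KEY": "",
        "LANGCHAIN_PROJECT": "ConversationEntityMemory",
    }
)
```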

You can alternatively set OPENAI_API_KEY in a .env file and load it.

[Note] This is not necessary if you've already set OPENAI_API_KEY in previous steps.
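For example, with the python-dotenv package:

```python
from dotenv import load_dotenv

# Reads OPENAI_API_KEY (and any other variables) from a local .env file
# and places them into the process environment.
load_dotenv(override=True)
```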

Entity Memory Conversation Example

This example demonstrates how to use ConversationEntityMemory to store and manage information about entities mentioned during a conversation. The conversation accumulates ongoing knowledge about these entities while maintaining a natural flow.
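A minimal sketch is shown below. The entity descriptions passed to the chain are illustrative placeholders, and exact import paths may vary with your LangChain version.

```python
from langchain_openai import ChatOpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationEntityMemory
from langchain.memory.prompt import ENTITY_MEMORY_CONVERSATION_TEMPLATE

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

# ConversationEntityMemory uses the LLM both to extract entity names
# from each turn and to summarize what is known about each entity.
conversation = ConversationChain(
    llm=llm,
    prompt=ENTITY_MEMORY_CONVERSATION_TEMPLATE,
    memory=ConversationEntityMemory(llm=llm),
)

# A turn that introduces two entities (hypothetical example input).
response = conversation.predict(
    input=(
        "Amelia is an award-winning landscape photographer who has traveled the globe. "
        "David is a wildlife conservationist dedicated to protecting endangered species. "
        "They are planning to open a nature-inspired photography gallery together."
    )
)
print(response)
```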

Retrieving Entity Memory

Let's examine the entity information stored in memory by inspecting memory.entity_store.store to verify that knowledge about each entity has been retained.
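Assuming the conversation chain from the previous example:

```python
# entity_store.store is a plain dict mapping each entity name
# to the summary the LLM has accumulated about it so far.
conversation.memory.entity_store.store
```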
