Building a Simple Chatbot with LangGraph


Overview

This tutorial covers how to create a simple chatbot using LangGraph.

LangGraph is an open-source framework for building and managing AI agents and multi-agent systems. It simplifies state management, agent interactions, and error handling, enabling developers to create robust applications powered by Large Language Models (LLMs).

LangGraph enables agentic applications by defining three core components:

  • Nodes: Individual computation steps or functions.

  • States: Context or memory maintained during computations.

  • Edges: Connections between nodes, guiding the flow of computation.

In this tutorial, we’ll build a simple chatbot using LangGraph. The chatbot will respond directly to user messages. To start, we’ll create a StateGraph, which defines the chatbot’s structure as a state machine.


Environment Setup

Setting up your environment is the first step. See the Environment Setup guide for more details.

[Note]

The langchain-opentutorial package provides easy-to-use environment setup guidance, plus useful functions and utilities for these tutorials. Check out langchain-opentutorial for more details.

You can set API keys in a .env file or set them manually.

[Note] If you’re not using the .env file, no worries! Just enter the keys directly in the cell below, and you’re good to go.
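The two options above can be sketched as follows. This is a minimal example, assuming the python-dotenv package for the .env route; the "your-groq-api-key" placeholder is ours and must be replaced with a real key:

```python
import os

# Option 1: load keys from a .env file, if python-dotenv is installed
try:
    from dotenv import load_dotenv
    load_dotenv()
except ImportError:
    pass

# Option 2: set the key manually (replace the placeholder with your key)
if not os.environ.get("GROQ_API_KEY"):
    os.environ["GROQ_API_KEY"] = "your-groq-api-key"
```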

If you want to use the Groq API, you can get a GROQ_API_KEY here; alternatively, you can use the OpenAI API.

Let's set up ChatGroq with the Gemma2-9b-It model.

You can also use ChatOpenAI with the gpt-4o model: comment out the last cell and uncomment the cell below.

Let's Create a Graph

Key Components:

  1. State: A TypedDict that holds the chatbot's state, specifically a list of messages.

  2. StateGraph: The core structure that defines the chatbot's behavior as a state machine.

Code Implementation:

Explanation:

  • State: The State class defines the chatbot's memory as a list of messages (messages). The Annotated type hint specifies that add_messages will handle message updates.

  • StateGraph: The StateGraph object (graph_builder) will manage the chatbot's state transitions and logic.

Implementing the Chatbot Logic

The chatbot function defines the core behavior of the chatbot. It takes the current state (which contains the conversation history) and generates a response using a Large Language Model (LLM).

Key Points:

  • Input: The state object contains the conversation history (messages).

  • Output: The function returns an updated state with the chatbot's response appended to the messages list.

  • LLM Integration: The llm.invoke() method is used to generate a response based on the conversation history.

Code Implementation:

Explanation:

  • state: State: The function accepts the current state, which includes the list of messages.

  • llm.invoke(state['messages']): The LLM processes the conversation history and generates a response.

  • Return Value: The function returns a dictionary with the updated messages list, including the chatbot's response.

Building the Chatbot State Machine

Now that we've defined the chatbot function, we'll integrate it into the StateGraph by adding nodes and edges. This defines the flow of the chatbot's state machine.

Key Steps:

  1. Add Node: Register the chatbot function as a node in the graph.

  2. Add Edges: Define the flow of the state machine:

    • From START to the chatbot node.

    • From the chatbot node to END.

  3. Compile the Graph: Finalize the graph structure so it can be executed.

Code Implementation:

Explanation:

  • add_node("chatbot", chatbot): Registers the chatbot function as a node named "chatbot".

  • add_edge(START, "chatbot"): Specifies that the conversation starts with the chatbot node.

  • add_edge("chatbot", END): Specifies that the conversation ends after the chatbot responds.

  • compile(): Finalizes the graph structure, making it ready for execution.

Visualizing the Chatbot State Machine

To better understand the structure of the chatbot's state machine, we can visualize the graph using Mermaid diagrams. This step is optional but highly useful for debugging and understanding the flow of the application.

Key Points:

  • Mermaid Diagram: The graph.get_graph().draw_mermaid_png() method generates a visual representation of the graph.

  • IPython Display: The IPython.display module is used to render the diagram directly in the notebook.

Code Implementation:
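A sketch of the visualization step, with everything wrapped in try-except so missing dependencies or an unsupported environment don't break the notebook:

```python
try:
    from IPython.display import Image, display

    # Generate a Mermaid diagram of the compiled graph as a PNG and render it
    display(Image(graph.get_graph().draw_mermaid_png()))
except Exception:
    # Optional step: needs IPython plus Mermaid rendering dependencies
    pass
```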


Explanation:

  • graph.get_graph().draw_mermaid_png(): Generates a Mermaid diagram of the graph and converts it to a PNG image.

  • display(Image(...)): Renders the PNG image in the notebook.

  • try-except Block: Ensures the code doesn't break if visualization fails (e.g., due to missing dependencies or unsupported environments).

Running the Chatbot

This section implements the chatbot's interaction loop, where the user can input messages, and the chatbot responds. The loop continues until the user types quit or q.

Key Features:

  • User Input: The chatbot listens for user input and processes it using the compiled graph.

  • Streaming Responses: The graph.stream() method processes the input and generates responses in real-time.

  • Exit Condition: The loop exits when the user types quit or q.

Code Implementation:
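A sketch of the loop, assuming the compiled graph from the previous step; the run_chatbot wrapper name is ours, added so the loop can be started on demand:

```python
def run_chatbot():
    """Interactive loop: read user input, stream graph updates, print replies."""
    while True:
        user_input = input("User: ")
        if user_input.lower() in ("quit", "q"):
            print("Goodbye!")
            break
        # Each event maps node names to the state updates they produced
        for event in graph.stream({"messages": [("user", user_input)]}):
            for value in event.values():
                # The last message in the update is the chatbot's reply
                print("Assistant:", value["messages"][-1].content)
```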

Explanation:

  • while True: Creates an infinite loop for continuous interaction.

  • input("User: "): Prompts the user for input.

  • graph.stream({'messages': [("user", user_input)]}): Processes the user input through the graph and streams the results.

  • event.values(): Extracts the chatbot's response from the event.

  • value['messages'][-1].content: Prints the chatbot's latest response.
