LangGraph Studio - MultiAgent
Author: Taylor (Jihyun Kim)
Peer Review:
Proofread: Q0211
This is a part of LangChain Open Tutorial
Overview
This notebook demonstrates how to build a Multi-agent workflow by integrating LangChain with LangGraph Studio, allowing you to orchestrate multiple specialized agents for gathering, analyzing, and synthesizing information. In this tutorial, we focus on researching a specific person, their professional background, and the company they work for, as well as generating relevant follow-up questions or interview prompts.
By visualizing this agent workflow in LangGraph Studio, you can easily debug, modify, and extend the pipeline. Each agent's output can be inspected step by step, making it straightforward to add new components or adjust the process flow.

Table of Contents
Overview
Environment Setup
What is LangGraph Studio
Building a Multi-Agent Workflow
Jupyter Notebook Code Cell Extractor
How to connect a local agent to LangGraph Studio
Demo
References
Environment Setup
Set up the environment. You may refer to Environment Setup for more details.
[Note]
langchain-opentutorial is a package that provides a set of easy-to-use environment setup helpers, useful functions, and utilities for these tutorials. You can check out
langchain-opentutorial for more details.
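A minimal installation cell is shown below (a sketch; the tutorial may also install additional packages such as langgraph and langchain):

```python
# Install the helper package used throughout this tutorial series
%pip install -U langchain-opentutorial
```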
You can alternatively set API keys such as ANTHROPIC_API_KEY in a .env file and load them.
[Note] This is not necessary if you've already set the required API keys in previous steps.
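For example, a .env file containing ANTHROPIC_API_KEY can be loaded with python-dotenv (a minimal sketch; any equivalent helper works just as well):

```python
from dotenv import load_dotenv

# Load API keys (e.g. ANTHROPIC_API_KEY) from a .env file in the working directory
load_dotenv(override=True)
```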
What is LangGraph Studio
LangGraph Studio offers a new way to develop LLM applications by providing a specialized agent IDE that enables visualization, interaction, and debugging of complex agentic applications.
With visual graphs and the ability to edit the state, you can better understand agent workflows and iterate faster. LangGraph Studio integrates with LangSmith so you can collaborate with teammates to debug failure modes.
To use LangGraph Studio, make sure you have a project with a LangGraph app set up.
The desktop application only supports macOS. Other users can run a local LangGraph server and use the web studio.
LangGraph Studio also depends on Docker Engine being running; currently, Docker Desktop and Orbstack are the only supported runtimes.
It also requires Docker Compose version 2.22.0 or higher.
Please make sure you have Docker Desktop or Orbstack installed and running before continuing.
In this tutorial, we have installed and are using Docker Desktop as our container runtime environment.
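Before building, you can quickly verify from a terminal that the Docker engine and a new enough Compose version (2.22.0+) are available:

```bash
docker --version
docker compose version
```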

Building a Multi-Agent Workflow
Our system implements a sophisticated multi-agent workflow, organized into four main categories:
1. Personal Information Research
Query Generator (generate_queries)
Role: Generates search queries based on personal information (name, email, company)
Output: A set of optimized search queries
Personal Researcher (research_person)
Role: Performs web searches using the generated queries
Output: A summary of key information about the target person
2. Project Analysis
Project Query Generator (extract_project_queries)
Role: Analyzes the personal research notes to identify project-related queries
Output: Project-focused search queries
Project Researcher (research_projects)
Role: Collects and analyzes project information
Output: Detailed project information and insights
3. Company Research
Company Query Generator (generate_queries_for_company)
Role: Creates customized search queries for gathering company information
Output: Optimized company-related search queries
Company Researcher (research_company)
Role: Gathers company background and context information
Output: A comprehensive company profile
4. Integration & Analysis
Information Integrator (combine_notes)
Role: Integrates all research results (personal, projects, company)
Output: A consolidated comprehensive report
Question Generator (generate_questions)
Role: Generates interview questions based on the integrated data
Output: A set of customized interview questions
Quality Controller (reflection)
Role: Reviews data completeness and identifies areas for improvement
Output: A quality report and additional research needs
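To make the structure concrete, here is a minimal sketch of how these nodes might be wired together with LangGraph's StateGraph. The state schema and the edge ordering are assumptions for illustration; the actual tutorial graph may branch differently (for example, looping back from reflection for additional research).

```python
from typing import TypedDict
from langgraph.graph import StateGraph, START, END

# Hypothetical state schema; the real graph may track more fields.
class ResearchState(TypedDict, total=False):
    name: str
    email: str
    company: str
    search_queries: list[str]
    person_notes: str
    project_notes: str
    company_notes: str
    combined_report: str
    questions: str

def generate_queries(state: ResearchState) -> dict:
    # Call an LLM to turn the person's name/email/company into search queries.
    return {"search_queries": ["..."]}

# The other nodes (research_person, extract_project_queries, research_projects,
# generate_queries_for_company, research_company, combine_notes,
# generate_questions, reflection) follow the same signature.

builder = StateGraph(ResearchState)
builder.add_node("generate_queries", generate_queries)
# builder.add_node("research_person", research_person)
# ... add the remaining nodes here ...

builder.add_edge(START, "generate_queries")
# builder.add_edge("generate_queries", "research_person")
# ... wire the remaining edges, ending with an edge into END ...

graph = builder.compile()  # LangGraph Studio loads this compiled graph
```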

Jupyter Notebook Code Cell Extractor
This script converts Jupyter Notebook cells into a Python script with the following features:
Converts pip install magic commands into executable Python code
Removes or comments out visualization-related code
Handles cell deduplication
Processes cells up to the graph compilation
Maintains code organization and readability
This conversion is necessary because LangGraph Studio requires Python (.py) files for execution. This script helps transform our tutorial notebook into the correct format while maintaining all functionality.
Key Features:
Automatic package installation code generation
Cell content deduplication
Selective cell processing
Magic command handling
Proper formatting for LangGraph Studio compatibility
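The tutorial's own script handles all of these steps; a simplified sketch of the same idea using nbformat might look like the following (the stop_marker heuristic and function name are assumptions, and visualization-stripping is omitted for brevity):

```python
import nbformat

def notebook_to_script(nb_path: str, py_path: str, stop_marker: str = ".compile(") -> None:
    """Collect code cells into a .py file, stopping after the graph-compilation cell."""
    nb = nbformat.read(nb_path, as_version=4)
    seen: set[str] = set()
    chunks: list[str] = []
    for cell in nb.cells:
        if cell.cell_type != "code" or cell.source in seen:
            continue  # skip non-code cells and duplicated cells
        seen.add(cell.source)
        lines = []
        for line in cell.source.splitlines():
            stripped = line.lstrip()
            if stripped.startswith(("%pip install", "!pip install")):
                # Turn install magics into plain pip invocations via subprocess.
                pkgs = stripped.split("install", 1)[1].split()
                lines.append(
                    "import subprocess, sys; "
                    f"subprocess.run([sys.executable, '-m', 'pip', 'install', *{pkgs!r}])"
                )
            elif stripped.startswith(("%", "!")):
                lines.append(f"# {line}")  # comment out other magics / shell commands
            else:
                lines.append(line)
        chunks.append("\n".join(lines))
        if stop_marker in cell.source:
            break  # stop once the graph has been compiled
    with open(py_path, "w", encoding="utf-8") as f:
        f.write("\n\n".join(chunks))
```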
How to connect a local agent to LangGraph Studio
Connection Options
There are two ways to connect your local agent to LangGraph Studio:
Development Server: Python package, all platforms, no Docker required
LangGraph Desktop: Desktop application, macOS only, requires Docker
In this guide we will cover how to use the development server as that is generally an easier and better experience.
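As a quick reference, the development server is typically started with the LangGraph CLI from the folder that contains langgraph.json (package extras and flags may vary by version):

```bash
pip install -U "langgraph-cli[inmem]"   # CLI with the in-memory dev server
langgraph dev                            # starts the local server and prints a Studio URL
```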
LangGraph Studio Desktop (Beta)
Currently, the desktop application supports only macOS; other users can run a local LangGraph server and use the web studio. It also depends on Docker Engine being running, and currently only Docker Desktop and Orbstack are supported as runtimes.
LangGraph Studio Download for macOS
Set up your application
First, you will need to set up your application in the proper format. This means defining a langgraph.json file that contains paths to your agent(s). See this guide for information on how to do so.
Please make sure that all the required files for running LangGraph Studio are located in the langgraph_studio folder.
For this example, we will use this example repository here which uses a requirements.txt file for dependencies:
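For reference, a minimal langgraph.json for a project like this might look as follows (the module path, graph variable name, and .env entry are assumptions; point them at your own files):

```json
{
  "dependencies": ["."],
  "graphs": {
    "multi_agent": "./multi_agent.py:graph"
  },
  "env": ".env"
}
```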
As previously mentioned, we are using Docker Desktop, so please download it, launch the app, and make sure the Docker engine is running. Then, in LangGraph Studio, open the langgraph_studio folder.

After a short while, once the build completes successfully, you will see a screen similar to the one below.
Now, let's run a test. I'll enter my actual company email address.
Demo
Here is a demo video demonstrating how it works in practice.
LangGraph Studio Demo Video Link
Output
Here are relevant interview questions based on the provided notes:
Technical Experience & Skills:
Q1: Could you describe your transition from data analysis at Tiffany & Co. to AI engineering, and how your previous experience informs your current work with LLMs and RAG systems?
Q2: What specific NLP challenges have you encountered while developing the "Ticki tacka" project control system, and how have you addressed them?
Q3: How do you balance your current studies in Statistics and Data Science with your role as an AI Engineer? What aspects of your coursework directly apply to your work?
Project & Product Specific:
Q4: Could you walk us through the core AI components of the "Ticki tacka" system and your role in its development?
Q5: What metrics or KPIs have you established to measure the effectiveness of the AI-powered workplace optimization solutions you're developing?
Language & Communication:
Q6: Given your Chinese and English language proficiencies, how do you leverage these skills in your current role, particularly in technical documentation or team collaboration?
Company Growth & Vision:
Q7: How has the recent seed funding and TIPS grant influenced your team's approach to AI development and project priorities?
Q8: What role do you see AI playing in workplace happiness and employee engagement, and how does this align with Reversemountain's mission?
Technical Implementation:
Q9: Could you describe your experience implementing RAG systems, and what challenges have you encountered in enterprise applications?
Q10: How do you approach the balance between model performance and practical business requirements in your AI solutions?
Wow, these interview questions are really well-tailored based on my past and current companies!
I'd better make sure I don't get caught off guard if they actually come up in an interview.