Basic Example: Prompt+Model+OutputParser
Author: ChangJun Lee
Peer Review: Erika Park, Wooseok Jeong
Proofread: Q0211
This is part of the LangChain Open Tutorial.
Overview
The most fundamental and commonly used case involves linking a prompt template with a model. To illustrate how this works, let us create a chain that asks for the capital cities of various countries.
Environment Setup
Set up the environment. You may refer to Environment Setup for more details.
[Note]
langchain-opentutorial is a package that provides easy-to-use environment setup, useful functions, and utilities for tutorials. You can check out langchain-opentutorial for more details.
You can alternatively set OPENAI_API_KEY in a .env file and load it.
[Note] This is not necessary if you've already set OPENAI_API_KEY in previous steps.
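A minimal sketch of loading the key from a .env file with the python-dotenv package (the file location and the variable name OPENAI_API_KEY follow the convention above):

```python
import os

from dotenv import load_dotenv

# Read variables from a .env file in the current directory into the process environment.
load_dotenv(override=True)

# Confirm the key is now available (do not print the key itself).
print("OPENAI_API_KEY set:", bool(os.environ.get("OPENAI_API_KEY")))
```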
Utilizing Prompt Templates
PromptTemplate
A prompt template is used to create a complete prompt string by incorporating the user's input variables.
Usage
template: a template string is a predefined format in which curly braces {} represent variables.
input_variables: a list that defines the names of the variables to be inserted into the curly braces of the PromptTemplate.
The from_template() method is used to create a PromptTemplate object.
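A short sketch based on the capital-city example from the overview (the exact template wording is illustrative):

```python
from langchain_core.prompts import PromptTemplate

# Define the template; {country} is the input variable.
template = "What is the capital of {country}?"

# Create a PromptTemplate object from the template string.
prompt = PromptTemplate.from_template(template)

# input_variables is inferred from the curly-brace placeholders.
print(prompt.input_variables)  # ['country']

# format() fills in the variables to produce the final prompt string.
print(prompt.format(country="France"))  # What is the capital of France?
```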
Chain Creation
LCEL (LangChain Expression Language)
Here, we use LCEL to combine various components into a single chain.

The | symbol works similarly to the Unix pipe operator, linking different components and passing the output of one component as the input to the next.
In this chain, user input is passed to the prompt template, and the output from the prompt template is then forwarded to the model. By examining each component individually, you can understand what happens at each step.
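A minimal sketch of chaining the prompt with a chat model; the model name gpt-4o-mini is an assumption, so substitute whichever OpenAI model you are using:

```python
from langchain_core.prompts import PromptTemplate
from langchain_openai import ChatOpenAI

prompt = PromptTemplate.from_template("What is the capital of {country}?")
model = ChatOpenAI(model="gpt-4o-mini", temperature=0)

# The | operator pipes the prompt's output into the model as its input.
chain = prompt | model
```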
Calling invoke()
Input values are provided in the form of a Python dictionary (key-value pairs). When calling the invoke() function, these input values are passed as arguments.
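For example, using the chain built above (the printed answer is illustrative):

```python
# The dictionary key must match the prompt's input variable name.
response = chain.invoke({"country": "South Korea"})

# Without an output parser, the chat model returns an AIMessage; the text is in .content.
print(response.content)  # e.g., "The capital of South Korea is Seoul."
```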
Below is an example of outputting a streaming response:
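A sketch using the same chain; each streamed chunk is a message chunk, so the text is read from .content:

```python
# stream() yields partial responses (chunks) as the model generates them.
for chunk in chain.stream({"country": "Canada"}):
    print(chunk.content, end="", flush=True)
```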
Output Parser
An Output Parser is a tool designed to transform or process the responses from an AI model into a specific format. Since the model's output is typically provided as free-form text, an Output Parser is essential to convert it into a structured format or extract the required data.
An output parser is added to the chain.
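For example, appending StrOutputParser so the chain returns a plain string instead of a message object (a minimal sketch continuing from the chain above):

```python
from langchain_core.output_parsers import StrOutputParser

output_parser = StrOutputParser()

# The parser extracts the text content from the model's message.
chain = prompt | model | output_parser

result = chain.invoke({"country": "Japan"})
print(type(result), result)  # <class 'str'> e.g., "The capital of Japan is Tokyo."
```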
Applying and Modifying Templates
The prompt content below can be modified as needed for testing purposes.
The model_name can also be adjusted for testing.
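A sketch of one possible variation; the template wording and the model name gpt-4o are assumptions for illustration:

```python
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import PromptTemplate
from langchain_openai import ChatOpenAI

# A modified template: ask for the capital plus one landmark, in a fixed format.
template = (
    "You are a geography assistant.\n"
    "Name the capital of {country} and one famous landmark there.\n"
    "Answer in the format: Capital - Landmark"
)
prompt = PromptTemplate.from_template(template)

# Swap in a different model name to compare results.
model = ChatOpenAI(model="gpt-4o", temperature=0)

chain = prompt | model | StrOutputParser()
print(chain.invoke({"country": "Italy"}))
```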