This tutorial covers a scenario where you need to pass constant arguments (not included in the output of the previous Runnable or the user input) when calling a Runnable inside a Runnable sequence. In such cases, Runnable.bind() is a convenient way to pass these arguments.
Table of Contents
Environment Setup
Runtime Arguments Binding
Connecting OpenAI Functions
Connecting OpenAI Tools
References
Environment Setup
[Note]
langchain-opentutorial is a package that provides a set of easy-to-use environment setup tools, useful functions and utilities for tutorials.
# Set environment variables
from langchain_opentutorial import set_env
set_env(
{
"OPENAI_API_KEY": "",
"LANGCHAIN_API_KEY": "",
"LANGCHAIN_TRACING_V2": "true",
"LANGCHAIN_ENDPOINT": "https://api.smith.langchain.com",
"LANGCHAIN_PROJECT": "Binding", # title
}
)
Environment variables have been set successfully.
You can alternatively set OPENAI_API_KEY in a .env file and load it.
[Note] This is not necessary if you've already set OPENAI_API_KEY in the previous step.
# Configuration File for Managing API Keys as Environment Variables
from dotenv import load_dotenv
# Load API Key Information
load_dotenv(override=True)
True
Runtime Arguments Binding
This section explains how to use Runnable.bind() to pass constant arguments to a Runnable within a sequence, especially when those arguments aren't part of the previous Runnable's output or user input.
Passing variables to prompts:
Use RunnablePassthrough to pass the {equation_statement} variable to the prompt.
Use StrOutputParser to parse the model's output into a string, creating a runnable object.
Call the runnable.invoke() method to pass the equation statement (e.g., "x raised to the third plus seven equals 12") and get the result.
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnablePassthrough
from langchain_openai import ChatOpenAI
prompt = ChatPromptTemplate.from_messages(
[
(
"system",
# Write the following equation using algebraic symbols and then solve it.
"Write out the following equation using algebraic symbols then solve it. "
"Please avoid LaTeX-style formatting and use plain symbols."
"Use the format:\n\nEQUATION:...\nSOLUTION:...\n",
),
(
"human",
"{equation_statement}", # Accepts the equation statement from the user as a variable.
),
]
)
# Initialize the ChatOpenAI model and set temperature to 0.
model = ChatOpenAI(model="gpt-4o", temperature=0)
# Pass the equation statement to the prompt and parse the model's output as a string.
runnable = (
{"equation_statement": RunnablePassthrough()} | prompt | model | StrOutputParser()
)
# Input an example equation statement and print the result.
result = runnable.invoke("x raised to the third plus seven equals 12")
print(result)
EQUATION: x^3 + 7 = 12
SOLUTION:
1. Subtract 7 from both sides of the equation to isolate the x^3 term:
x^3 + 7 - 7 = 12 - 7
x^3 = 5
2. Take the cube root of both sides to solve for x:
x = 5^(1/3)
Therefore, the solution is:
x ≈ 1.71 (rounded to two decimal places)
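As a quick sanity check on the model's arithmetic, the cube root of 5 can be computed directly:

```python
x = 5 ** (1 / 3)  # cube root of 5
print(round(x, 2))  # 1.71
assert abs(x ** 3 + 7 - 12) < 1e-9  # x satisfies x^3 + 7 = 12
```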
Using the bind() method with stop words
To stop the model's output at a specific point, you can use model.bind() to pass a stop word; generation halts as soon as the model encounters that token (here, SOLUTION).
runnable = (
# Create a runnable passthrough object and assign it to the "equation_statement" key.
{"equation_statement": RunnablePassthrough()}
| prompt # Add the prompt to the pipeline.
| model.bind(
stop="SOLUTION"
) # Bind the model and set it to stop generating at the "SOLUTION" token.
| StrOutputParser() # Add the string output parser to the pipeline.
)
# Execute the pipeline with the input "x raised to the third plus seven equals 12" and print the result.
print(runnable.invoke("x raised to the third plus seven equals 12"))
EQUATION: x^3 + 7 = 12
Connecting OpenAI Functions
bind() is particularly useful for attaching OpenAI function definitions to a compatible OpenAI model.
Let's define an openai_function dictionary that follows the OpenAI function schema.
openai_function = {
"name": "solver", # Function name
# Function description: Formulate and solve an equation.
"description": "Formulates and solves an equation",
"parameters": { # Function parameters
"type": "object", # Parameter type: object
"properties": { # Parameter properties
"equation": { # Equation property
"type": "string", # Type: string
"description": "The algebraic expression of the equation", # Description
},
"solution": { # Solution property
"type": "string", # Type: string
"description": "The solution to the equation", # Description
},
},
"required": [
"equation",
"solution",
], # Required parameters: equation and solution
},
}
Binding the solver function.
We can then use the bind() method to associate a function call (like solver) with the language model.
# Write the following equation using algebraic symbols and then solve it
prompt = ChatPromptTemplate.from_messages(
[
(
"system",
"Write out the following equation using algebraic symbols then solve it.",
),
("human", "{equation_statement}"),
]
)
model = ChatOpenAI(model="gpt-4o", temperature=0).bind(
function_call={"name": "solver"}, # Bind the OpenAI function schema
functions=[openai_function],
)
runnable = {"equation_statement": RunnablePassthrough()} | prompt | model
# Equation: x raised to the third plus seven equals 12
runnable.invoke("x raised to the third plus seven equals 12")
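The invocation above returns an AIMessage whose payload sits in additional_kwargs["function_call"] rather than in content. A minimal sketch of extracting the arguments — the sample payload below is illustrative, not actual model output:

```python
import json

# Illustrative payload in the shape OpenAI function calls take;
# in practice you would read response.additional_kwargs after invoke().
additional_kwargs = {
    "function_call": {
        "name": "solver",
        "arguments": '{"equation": "x^3 + 7 = 12", "solution": "x = 5^(1/3)"}',
    }
}

# The arguments field is a JSON string, so parse it before use.
args = json.loads(additional_kwargs["function_call"]["arguments"])
print(args["equation"])  # x^3 + 7 = 12
print(args["solution"])  # x = 5^(1/3)
```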
Connecting OpenAI Tools
This section explains how to connect and use OpenAI tools within your LangChain applications.
A tools list defined in OpenAI's tool-calling schema can be bound to the model; given a natural language query, the model then decides which tool to call and with what arguments.
tools = [
{
"type": "function",
"function": {
"name": "get_current_weather", # Function name to get current weather
"description": "Fetches the current weather for a given location", # Description
"parameters": {
"type": "object",
"properties": {
"location": {
"type": "string",
"description": "City and state, e.g.: San Francisco, CA", # Location description
},
# Temperature unit
"unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
},
"required": ["location"], # Required parameter: location
},
},
}
]
Binding tools and invoking the model:
Use bind() to associate the tools with the language model.
Call the invoke() method on the bound model with a natural language question as input.
# Initialize the ChatOpenAI model and bind the tools.
model = ChatOpenAI(model="gpt-4o").bind(tools=tools)
# Invoke the model to ask about the weather in San Francisco, New York, and Los Angeles.
model.invoke(
"Can you tell me the current weather in San Francisco, New York, and Los Angeles?"
)
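The call returns an AIMessage whose tool_calls attribute holds one entry per requested call — here, one per city. A sketch of iterating over them (the sample data below is illustrative, not actual model output):

```python
# Illustrative tool_calls in the shape langchain-openai returns;
# in practice, read response.tool_calls after model.invoke(...).
tool_calls = [
    {"name": "get_current_weather", "args": {"location": "San Francisco, CA"}},
    {"name": "get_current_weather", "args": {"location": "New York, NY"}},
    {"name": "get_current_weather", "args": {"location": "Los Angeles, CA"}},
]

# Each entry names the function to run and the arguments the model chose.
for call in tool_calls:
    print(call["name"], "->", call["args"]["location"])
```

Note that the model does not execute the tool itself; your application is responsible for running the named function with these arguments.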