
Bind Tools


  • Author: Jaemin Hong

  • Peer Review: Hye-yoon Jeong, JoonHo Kim

  • Proofread: Chaeyoon Kim

  • This is a part of LangChain Open Tutorial

Overview

bind_tools is a powerful function in LangChain for integrating custom tools with LLMs, enabling enriched AI workflows.

This tutorial shows you how to create tools, bind them to an LLM, parse and execute the resulting tool calls, and integrate the whole flow into an AgentExecutor.
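
As a quick preview, the core pattern looks like this. This is a minimal sketch, not code from this tutorial; the multiply tool and the gpt-4o-mini model are illustrative placeholders.

from langchain_core.tools import tool
from langchain_openai import ChatOpenAI


@tool
def multiply(a: int, b: int) -> int:
    """Multiply two numbers."""
    return a * b


# Bind the tool to the model so it can emit structured tool calls
llm_with_tools = ChatOpenAI(model="gpt-4o-mini", temperature=0).bind_tools([multiply])
print(llm_with_tools.invoke("What is 3 times 4?").tool_calls)
# Expected shape: [{'name': 'multiply', 'args': {'a': 3, 'b': 4}, 'id': '...', 'type': 'tool_call'}]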

Table of Contents

  • Overview

  • References

  • Environment Setup

  • Creating Tools

  • Binding Tools

  • Binding tools with Parser to Execute

  • Binding tools with Agent and AgentExecutor

References

  • Conceptual guide - Tool calling

  • tool_calls

  • AgentExecutor


Environment Setup

Set up the environment. You may refer to Environment Setup for more details.

[Note]

  • langchain-opentutorial is a package that provides a set of easy-to-use environment setup, useful functions, and utilities for tutorials.

  • You can check out langchain-opentutorial for more details.

%%capture --no-stderr
%pip install langchain-opentutorial
# Install required packages
from langchain_opentutorial import package

package.install(
    [
        "langsmith",
        "langchain_community",
        "langchain_core",
        "langchain_openai",
    ],
    verbose=False,
    upgrade=False,
)
# Set environment variables
from langchain_opentutorial import set_env

set_env(
    {
        "OPENAI_API_KEY": "",
        "LANGCHAIN_API_KEY": "",
        "LANGCHAIN_TRACING_V2": "true",
        "LANGCHAIN_ENDPOINT": "https://api.smith.langchain.com",
        "LANGCHAIN_PROJECT": "02-Bind-Tools",
    }
)
Environment variables have been set successfully.

You can alternatively set API keys such as OPENAI_API_KEY in a .env file and load them.

[Note] This is not necessary if you've already set the required API keys in previous steps.

# Load API keys from .env file
from dotenv import load_dotenv

load_dotenv(override=True)
True

Creating Tools

Let's define tools for experimentation:

  • get_word_length: Returns the length of a word.

  • add_function: Adds two numbers.

  • bbc_news_crawl: Crawls a BBC news article and extracts its main content.

[Note]

  • Use the @tool decorator for defining tools, and provide clear docstrings.

import requests
from bs4 import BeautifulSoup
from langchain_core.tools import tool


# Define the tools
@tool
def get_word_length(word: str) -> int:
    """Return the length of the given text"""
    return len(word)


@tool
def add_function(a: float, b: float) -> float:
    """Add two numbers together"""
    return a + b


@tool
def bbc_news_crawl(news_url: str) -> str:
    """Crawl a news article from BBC"""
    title, content = "", ""  # Defaults so a string is still returned if the request or parsing fails
    response = requests.get(news_url)
    if response.status_code == 200:
        soup = BeautifulSoup(response.text, "html.parser")

        # Extract the desired information from the article
        article = soup.find("article")
        if article:
            title = article.find("h1").get_text()  # Extract the title
            content_list = [
                tag.get_text()
                for tag in article.find_all(["h2", "p"])
                if (tag.name == "h2" and "sc-518485e5-0" in tag.get("class", []))
                or (tag.name == "p" and "sc-eb7bd5f6-0" in tag.get("class", []))
            ]  # Extract the content
            content = "\n\n".join(content_list)
    else:
        print(f"HTTP request failed. Response code: {response.status_code}")
    return f"{title}\n\n----------\n\n{content}"


tools = [get_word_length, add_function, bbc_news_crawl]
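
Before binding, it can help to inspect the schema each tool will expose to the model. The snippet below is a quick sanity check (not part of the original flow); name, description, and args are standard attributes of LangChain tools.

# Inspect the schema each tool exposes to the model
for t in tools:
    print(t.name)         # e.g. 'get_word_length'
    print(t.description)  # derived from the docstring
    print(t.args)         # argument schema, e.g. {'word': {'title': 'Word', 'type': 'string'}}
    print("---")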

Binding Tools

Now, let's use the bind_tools function to associate the defined tools with a specific LLM.

from langchain_openai import ChatOpenAI

# Create a model
llm = ChatOpenAI(model="gpt-4o", temperature=0)

# Tool binding
llm_with_tools = llm.bind_tools(tools)

Let's check the results!

The results are stored in tool_calls. Let's print tool_calls.

[Note]

  • name indicates the name of the tool.

  • args contains the arguments that were passed to the tool.

# Execution result
llm_with_tools.invoke(
    "What is the length of the given text 'LangChain OpenTutorial'?"
).tool_calls
[{'name': 'get_word_length',
      'args': {'word': 'LangChain OpenTutorial'},
      'id': 'call_km7ieeNgjOvbPEfPt3bwO4cy',
      'type': 'tool_call'}]
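
If you want to run a single tool call yourself, you can pass the parsed args straight to the tool's invoke method. This is a small illustrative sketch using the result above, not a step from the original tutorial.

# Re-run the request and execute the first tool call manually
ai_msg = llm_with_tools.invoke(
    "What is the length of the given text 'LangChain OpenTutorial'?"
)
first_call = ai_msg.tool_calls[0]
print(first_call["name"], first_call["args"])
print(get_word_length.invoke(first_call["args"]))  # expected: 22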

Next, we will connect llm_with_tools with JsonOutputToolsParser to parse tool_calls and review the results.

[Note]

  • type indicates the name of the tool that was called.

  • args contains the arguments that were passed to the tool.

from langchain_core.output_parsers.openai_tools import JsonOutputToolsParser

# Tool Binding + Tool Parser
chain = llm_with_tools | JsonOutputToolsParser(tools=tools)

# Execution Result
tool_call_results = chain.invoke(
    "What is the length of the given text 'LangChain OpenTutorial'?"
)
print(tool_call_results)
[{'args': {'word': 'LangChain OpenTutorial'}, 'type': 'get_word_length'}]
print(tool_call_results)
print("\n==========\n")

# First tool call result
single_result = tool_call_results[0]

print(single_result["type"])
print(single_result["args"])
[{'args': {'word': 'LangChain OpenTutorial'}, 'type': 'get_word_length'}]
    
    ==========
    
    get_word_length
    {'word': 'LangChain OpenTutorial'}

Now let's execute the corresponding tool. First, confirm that the parsed type matches the name of one of our registered tools.

tool_call_results[0]["type"], tools[0].name
('get_word_length', 'get_word_length')

The execute_tool_calls function identifies the appropriate tool, passes the corresponding args, and then executes the tool.

def execute_tool_calls(tool_call_results):
    """
    Function to execute the tool call results.

    :param tool_call_results: List of the tool call results
    (Available tools are looked up from the global tools list.)
    """

    # Iterate over the list of the tool call results
    for tool_call_result in tool_call_results:
        # Tool name (function name)
        tool_name = tool_call_result["type"]
        # Tool arguments
        tool_args = tool_call_result["args"]

        # Find the tool that matches the name and execute it
        # Use the next() function to find the first matching tool
        matching_tool = next((tool for tool in tools if tool.name == tool_name), None)
        if matching_tool:
            # Execute the tool
            result = matching_tool.invoke(tool_args)
            print(
                f"[Executed Tool] {tool_name} [Args] {tool_args}\n[Execution Result] {result}"
            )
        else:
            print(f"Warning: Unable to find the tool corresponding to {tool_name}.")


# Execute the tool calls
execute_tool_calls(tool_call_results)
[Executed Tool] get_word_length [Args] {'word': 'LangChain OpenTutorial'}
    [Execution Result] 22
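
An equivalent routing approach is to build a name-to-tool dictionary once and return the results instead of only printing them. This is an optional sketch, not the approach used in the rest of this tutorial.

def execute_tool_calls_with_results(tool_call_results):
    """Variant of execute_tool_calls that collects and returns each tool's output."""
    tool_map = {t.name: t for t in tools}  # name -> tool lookup
    results = []
    for tool_call_result in tool_call_results:
        matching_tool = tool_map.get(tool_call_result["type"])
        # Append None when no registered tool matches the requested name
        results.append(
            matching_tool.invoke(tool_call_result["args"]) if matching_tool else None
        )
    return results


print(execute_tool_calls_with_results(tool_call_results))  # expected: [22]

A returning variant like this makes it easier to feed tool outputs back into the model, which is essentially what the AgentExecutor loop later in this tutorial automates.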

Binding tools with Parser to Execute

This time, we will combine the entire process of binding tools, parsing the results, and executing the tool calls into a single step.

  • llm_with_tools: The LLM with the tools bound to it.

  • JsonOutputToolsParser: The parser that processes the results of tool calls.

  • execute_tool_calls: The function that executes the parsed tool calls.

[Flow Summary]

  1. Bind tools to the model.

  2. Parse the results of tool calls.

  3. Execute the results of tool calls.

from langchain_core.output_parsers.openai_tools import JsonOutputToolsParser

# bind_tools + Parser + Execution
chain = llm_with_tools | JsonOutputToolsParser(tools=tools) | execute_tool_calls
# Execution Result 1
chain.invoke("What is the length of the given text 'LangChain OpenTutorial'?")
[Executed Tool] get_word_length [Args] {'word': 'LangChain OpenTutorial'}
    [Execution Result] 22
# Execution Result 2
chain.invoke("114.5 + 121.2")

# Double check
print(114.5 + 121.2)
[Executed Tool] add_function [Args] {'a': 114.5, 'b': 121.2}
    [Execution Result] 235.7
    235.7
# Execution Result 3
chain.invoke("Crawl the news article: https://www.bbc.com/news/articles/cew52g8p2lko")
[Executed Tool] bbc_news_crawl [Args] {'news_url': 'https://www.bbc.com/news/articles/cew52g8p2lko'}
    [Execution Result] New AI hub 'to create 1,000 jobs' on Merseyside
    
    ----------
    
    A new Artificial Intelligence (AI) hub planned for Merseyside is set to create 1,000 jobs over the next three years, the government said.
    
    Prime Minister Sir Keir Starmer said he wanted to make the UK one of the world's AI "super powers" as a way of boosting economic growth and improving public services.
    
    Global IT company Kyndryl announced it was going to create the new tech hub in the Liverpool City Region.
    
    Metro Mayor Steve Rotheram welcomed the investment, saying it would be "hugely beneficial" to the area.
    
    'International investment'
    
    In a speech setting out the government's AI ambitions, Starmer spoke of its "vast potential" for rejuvenating public services.
    
    The government said its AI Opportunities Action Plan was backed by leading tech firms, some of which have committed £14bn towards various projects including growth zones, creating 13,250 jobs.
    
    Mr Rotheram told BBC Radio Merseyside: "I went over last year to speak to [Kyndryl] face-to-face in New York.
    
    "To have that come to fruition so quickly is hugely beneficial to the workforce in the Liverpool City Region." 
    
    He said attracting the world's largest IT infrastructure services provider was "testament to what we can achieve when local ambition is matched by national support to help attract international investment".
    
    The Labour mayor said the Liverpool City Region was "leading the way in the UK's AI revolution". 
    
    He added: "As a region with a proud history of innovation we're ready to seize the opportunities that AI and digital technology can bring; not just to boost our economy but to improve lives, develop skills, tackle inequality, and ensure no-one is left behind."
    
    The BBC has asked the Department for Science, Innovation and Technology for more details about Merseyside's AI hub plans.
    
    Listen to the best of BBC Radio Merseyside on Sounds and follow BBC Merseyside on Facebook, X, and Instagram and watch BBC North West Tonight on BBC iPlayer.

Binding tools with Agent and AgentExecutor

bind_tools provides schemas (tools) that can be used by the model.

AgentExecutor creates an execution loop for tasks such as invoking the LLM, routing to the appropriate tool, executing it, and re-invoking the model.

[Note]

  • Agent and AgentExecutor will be covered in detail in the next chapter.

from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_openai import ChatOpenAI

# Create an Agent prompt
prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You are very powerful assistant, but don't know current events",
        ),
        ("user", "{input}"),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ]
)

# Create a model
llm = ChatOpenAI(model="gpt-4o", temperature=0)
from langchain.agents import AgentExecutor, create_tool_calling_agent

# Use the tools defined previously
tools = [get_word_length, add_function, bbc_news_crawl]

# Create an Agent
agent = create_tool_calling_agent(llm, tools, prompt)

# Create an AgentExecutor
agent_executor = AgentExecutor(
    agent=agent,
    tools=tools,
    verbose=True,
    handle_parsing_errors=True,
)

Let's calculate the length of a word.

# Execute the Agent
result = agent_executor.invoke(
    {"input": "What is the length of the given text 'LangChain OpenTutorial'?"}
)

# Execution Result
print(result["output"])
    
    > Entering new AgentExecutor chain...
    
    Invoking: `get_word_length` with `{'word': 'LangChain OpenTutorial'}`
    
    
    22The length of the text "LangChain OpenTutorial" is 22 characters.
    
    > Finished chain.
    The length of the text "LangChain OpenTutorial" is 22 characters.

Let's calculate the sum of two numbers.

# Execute the Agent
result = agent_executor.invoke({"input": "Calculate the result of 114.5 + 121.2"})

# Execution Result
print(result["output"])
print("\n==========\n")
print(114.5 + 121.2)
    
    > Entering new AgentExecutor chain...
    
    Invoking: `add_function` with `{'a': 114.5, 'b': 121.2}`
    
    
    235.7The result of 114.5 + 121.2 is 235.7.
    
    > Finished chain.
    The result of 114.5 + 121.2 is 235.7.
    
    ==========
    
    235.7

Let's add more than two numbers.

In this scenario, you can observe that the agent is capable of verifying its own intermediate results and repeating the process if necessary to arrive at the correct final answer.

# Execute the Agent
result = agent_executor.invoke(
    {"input": "Calculate the result of 114.5 + 121.2 + 34.2 + 110.1"}
)

# Execution Result
print(result["output"])
print("\n==========\n")
print(114.5 + 121.2 + 34.2 + 110.1)
    
    > Entering new AgentExecutor chain...
    
    Invoking: `add_function` with `{'a': 114.5, 'b': 121.2}`
    
    
    235.7
    Invoking: `add_function` with `{'a': 235.7, 'b': 34.2}`
    
    
    269.9
    Invoking: `add_function` with `{'a': 34.2, 'b': 110.1}`
    
    
    144.3
    Invoking: `add_function` with `{'a': 269.9, 'b': 110.1}`
    
    
    380.0The result of adding 114.5, 121.2, 34.2, and 110.1 is 380.0.
    
    > Finished chain.
    The result of adding 114.5, 121.2, 34.2, and 110.1 is 380.0.
    
    ==========
    
    380.0

Finally, let's try using a tool to summarize a news article.

# Execute the Agent
result = agent_executor.invoke(
    {
        "input": "Summarize the news article: https://www.bbc.com/news/articles/cew52g8p2lko"
    }
)

# Execution Result
print(result["output"])
    
    > Entering new AgentExecutor chain...
    
    Invoking: `bbc_news_crawl` with `{'news_url': 'https://www.bbc.com/news/articles/cew52g8p2lko'}`
    
    
    New AI hub 'to create 1,000 jobs' on Merseyside
    
    ----------
    
    A new Artificial Intelligence (AI) hub planned for Merseyside is set to create 1,000 jobs over the next three years, the government said.
    
    Prime Minister Sir Keir Starmer said he wanted to make the UK one of the world's AI "super powers" as a way of boosting economic growth and improving public services.
    
    Global IT company Kyndryl announced it was going to create the new tech hub in the Liverpool City Region.
    
    Metro Mayor Steve Rotheram welcomed the investment, saying it would be "hugely beneficial" to the area.
    
    'International investment'
    
    In a speech setting out the government's AI ambitions, Starmer spoke of its "vast potential" for rejuvenating public services.
    
    The government said its AI Opportunities Action Plan was backed by leading tech firms, some of which have committed £14bn towards various projects including growth zones, creating 13,250 jobs.
    
    Mr Rotheram told BBC Radio Merseyside: "I went over last year to speak to [Kyndryl] face-to-face in New York.
    
    "To have that come to fruition so quickly is hugely beneficial to the workforce in the Liverpool City Region." 
    
    He said attracting the world's largest IT infrastructure services provider was "testament to what we can achieve when local ambition is matched by national support to help attract international investment".
    
    The Labour mayor said the Liverpool City Region was "leading the way in the UK's AI revolution". 
    
    He added: "As a region with a proud history of innovation we're ready to seize the opportunities that AI and digital technology can bring; not just to boost our economy but to improve lives, develop skills, tackle inequality, and ensure no-one is left behind."
    
    The BBC has asked the Department for Science, Innovation and Technology for more details about Merseyside's AI hub plans.
    
    Listen to the best of BBC Radio Merseyside on Sounds and follow BBC Merseyside on Facebook, X, and Instagram and watch BBC North West Tonight on BBC iPlayer.A new Artificial Intelligence (AI) hub is planned for Merseyside, expected to create 1,000 jobs over the next three years. Prime Minister Sir Keir Starmer aims to position the UK as a global AI "superpower" to boost economic growth and improve public services. The global IT company Kyndryl will establish the tech hub in the Liverpool City Region. Metro Mayor Steve Rotheram praised the investment, highlighting its benefits for the area. The government's AI Opportunities Action Plan, supported by leading tech firms, has secured £14 billion for various projects, including growth zones, creating 13,250 jobs. Rotheram emphasized the region's leadership in the UK's AI revolution and its readiness to leverage AI and digital technology for economic and social benefits.
    
    > Finished chain.
    A new Artificial Intelligence (AI) hub is planned for Merseyside, expected to create 1,000 jobs over the next three years. Prime Minister Sir Keir Starmer aims to position the UK as a global AI "superpower" to boost economic growth and improve public services. The global IT company Kyndryl will establish the tech hub in the Liverpool City Region. Metro Mayor Steve Rotheram praised the investment, highlighting its benefits for the area. The government's AI Opportunities Action Plan, supported by leading tech firms, has secured £14 billion for various projects, including growth zones, creating 13,250 jobs. Rotheram emphasized the region's leadership in the UK's AI revolution and its readiness to leverage AI and digital technology for economic and social benefits.
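
If you want to observe the intermediate tool calls programmatically instead of reading the verbose=True log, AgentExecutor also supports streaming. The sketch below is an assumption-level example; the chunk keys ("actions", "steps", "output") follow AgentExecutor's standard streaming output and may vary across LangChain versions.

# Stream the agent run and print each intermediate step as it happens
for chunk in agent_executor.stream({"input": "Calculate the result of 114.5 + 121.2"}):
    if "actions" in chunk:  # the agent decided to call a tool
        for action in chunk["actions"]:
            print(f"Calling {action.tool} with {action.tool_input}")
    elif "steps" in chunk:  # a tool call finished
        for step in chunk["steps"]:
            print(f"Tool result: {step.observation}")
    elif "output" in chunk:  # final answer
        print(f"Final output: {chunk['output']}")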
