Prompt Template


Overview

This tutorial covers how to create and utilize prompt templates using LangChain.

Prompt templates are essential for generating dynamic and flexible prompts that cater to various use cases, such as conversation history, structured outputs, and specialized queries.

In this tutorial, we will explore methods for creating PromptTemplate objects, applying partial variables, managing templates through YAML files, and leveraging advanced tools like ChatPromptTemplate and MessagePlaceholder for enhanced functionality.



Environment Setup

Set up the environment. You may refer to Environment Setup for more details.

[Note]

  • langchain-opentutorial is a package that provides easy-to-use environment setup, useful functions, and utilities for tutorials.

  • You can check out langchain-opentutorial for more details.

Let's set up ChatOpenAI with the gpt-4o model.

Creating a PromptTemplate Object

There are two ways to create a PromptTemplate object.

    1. Using the from_template() method

    2. Creating a PromptTemplate object and a prompt all at once

Method 1. Using the from_template() method

  • Define the template with variables written as {variable}.

You can complete the prompt by assigning a value to the variable country.

Method 2. Creating a PromptTemplate object and a prompt all at once

Explicitly specify input_variables for additional validation.

Otherwise, a mismatch between these variables and the variables in the template string raises an exception at instantiation.

Using partial_variables

Using partial_variables, you can pre-fill some template variables with fixed values or with functions. This is particularly useful when there are common variables to be shared.

Common examples are date or time.

Suppose you want to include the current date in your prompt. Hardcoding the date or passing it along with the other input variables may not be practical. In this case, using a function that returns the current date to partially fill the prompt is much more convenient.

Load Prompt Templates from YAML Files

You can manage prompt templates in separate YAML files and load them using load_prompt.

ChatPromptTemplate

ChatPromptTemplate can be used to include a conversation history as a prompt.

Messages are structured as tuples in the format (role, message) and are created as a list.

role

  • system: A system setup message, typically used for global settings-related prompts.

  • human: A user input message.

  • ai: An AI response message.

You can directly invoke the LLM using the messages created above.

You can also create a chain to execute.

MessagesPlaceholder

LangChain also provides MessagesPlaceholder, which gives you complete control over rendering messages during formatting.

This can be useful if you’re unsure which roles to use in a message prompt template or if you want to insert a list of messages during formatting.

You can use MessagesPlaceholder to add the conversation message list.
