EnumOutputParser
Author: ranian963
Peer Review: JaeHo Kim
Proofread: Two-Jay
This is a part of LangChain Open Tutorial
Overview
In this tutorial, we introduce how to use EnumOutputParser to extract valid Enum values from the output of an LLM.
EnumOutputParser is a tool that parses the output of a language model into one of the predefined enumeration (Enum) values, offering the following features:
Enumeration Parsing: Converts the string output into a predefined Enum value.
Type Safety: Ensures that the parsed result is always one of the defined Enum values.
Flexibility: Automatically handles spaces and line breaks.
Environment Setup
Set up the environment. You may refer to Environment Setup for more details.
[Note]
langchain-opentutorial is a package that provides a set of easy-to-use environment setup, useful functions, and utilities for tutorials. You can check out langchain-opentutorial for more details.
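A minimal setup sketch is shown below; the exact package list and helper utilities vary across tutorials in this series, so treat the package names as assumptions and adjust them to your environment.

```python
# Minimal setup sketch -- package names are assumptions, adjust as needed.
# %pip install -qU langchain langchain-openai langchain-opentutorial

import os
import getpass

# Prompt for the OpenAI API key if it is not already present in the environment.
if not os.environ.get("OPENAI_API_KEY"):
    os.environ["OPENAI_API_KEY"] = getpass.getpass("OPENAI_API_KEY: ")
```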
You can alternatively set OPENAI_API_KEY in a .env file and load it.
[Note] This is not necessary if you've already set OPENAI_API_KEY in previous steps.
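For the .env route, a minimal sketch using the python-dotenv package (an assumption; any loader that populates os.environ works):

```python
from dotenv import load_dotenv

# Reads key-value pairs such as OPENAI_API_KEY=... from a local .env file into os.environ.
load_dotenv(override=True)
```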
Introduction to EnumOutputParser
EnumOutputParser is a tool that strictly parses an LLM's output into a defined enumeration (Enum).
This ensures that the model output is always one of the enumerated values.
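As a standalone preview of the example built step by step below (a minimal sketch; the Colors enum and import path are shown here only for illustration), the parser maps a raw string onto an Enum member and rejects anything outside the enumeration:

```python
from enum import Enum
from langchain.output_parsers.enum import EnumOutputParser

class Colors(Enum):
    RED = "red"
    GREEN = "green"
    BLUE = "blue"

parser = EnumOutputParser(enum=Colors)

print(parser.parse("blue"))     # Colors.BLUE
print(parser.parse(" blue\n"))  # Colors.BLUE -- surrounding whitespace is stripped
# parser.parse("purple") raises an OutputParserException because it is not a Colors value.
```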
Use cases
When you only want one valid choice from a set of possibilities.
When you want to avoid typos and variations by using a clear Enum value.
In the following example, we define a Colors Enum and have the LLM return one of red/green/blue by parsing its output.
Example: Colors Enum Parser
The code below shows how to define the Colors(Enum) class, wrap it with EnumOutputParser, and integrate it into a prompt chain.
Once the chain is executed, the LLM response is strictly parsed into one of the values in Colors.
Define the Colors enumeration using the Enum class from Python's built-in enum module.
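A sketch of the enumeration, assuming the three lowercase color names used in this example:

```python
from enum import Enum

class Colors(Enum):
    RED = "red"      # lowercase values so the model's answer maps directly onto a member
    GREEN = "green"
    BLUE = "blue"
```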
Now we create an EnumOutputParser object for parsing strings into the Colors enumeration.
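A sketch of creating the parser (the import path below assumes the langchain package); the enum is passed via the enum argument, and get_format_instructions() yields instructions we can embed in the prompt:

```python
from langchain.output_parsers.enum import EnumOutputParser

parser = EnumOutputParser(enum=Colors)

# The format instructions tell the model to answer with exactly one of: red, green, blue.
print(parser.get_format_instructions())
```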
Below is an example that constructs a simple chain using PromptTemplate and ChatOpenAI.
When the LLM answers which color the object "sky" is, the parser converts that string into a Colors value.
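A sketch of such a chain; the model name (gpt-4o-mini) and the exact prompt wording are assumptions, not requirements:

```python
from langchain_core.prompts import PromptTemplate
from langchain_openai import ChatOpenAI

prompt = PromptTemplate.from_template(
    "Which color is this object?\n\nObject: {object}\n\nInstructions: {instructions}"
).partial(instructions=parser.get_format_instructions())

# Prompt -> LLM -> EnumOutputParser
chain = prompt | ChatOpenAI(model="gpt-4o-mini", temperature=0) | parser
```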
Now let's run the chain.
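Invoking the chain with the object "sky" from the prompt above (the commented values are what we would expect, not guaranteed output):

```python
response = chain.invoke({"object": "sky"})

print(response)        # Expected: Colors.BLUE
print(type(response))  # <enum 'Colors'>
print(response.value)  # "blue"
```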