Langchain Applications — Part 4 — PromptValues and PromptTemplates

Shishir Singh
3 min read · Jun 19, 2023


This article is part of a series explaining LangChain applications with simple Python code examples using OpenAI. In Part 3 and earlier, we discussed applications of Models.

Langchain Concepts — Part 4 introduced Prompts and Prompt Templates. In this article, we will look more deeply into PromptValue and PromptTemplate applications.

Prompts form a central part of LangChain’s functionality. They consist of PromptValue, PromptTemplates, Example Selectors, and Output Parsers. The PromptValue represents an input to a model, while PromptTemplates are responsible for constructing this input.

PromptValue: Data Abstraction for Model Inputs

In LangChain’s architecture, the PromptValue class plays a central role as a data abstraction layer that represents the input to a language model. The primary data type for model interactions is text, but ongoing development aims to support a wider array of data types, including images and audio. Because models require diverse input formats, PromptValue provides methods to convert the data to the specific input type each model expects: plain text or ChatMessages in the current implementation.
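The idea behind this abstraction can be shown with a minimal pure-Python sketch. The class below is a hypothetical stand-in that mirrors the shape of LangChain's interface, not the library's actual implementation:

```python
# Minimal sketch of the PromptValue idea: one object, two views.
# Illustrative stand-in only, not LangChain's real class.

class StringPromptValue:
    """Holds prompt text and converts it to whichever form a model expects."""

    def __init__(self, text: str):
        self.text = text

    def to_string(self) -> str:
        # Plain-text view, suitable for completion-style LLMs.
        return self.text

    def to_messages(self) -> list:
        # Chat view: wrap the text as a single human message.
        return [{"role": "human", "content": self.text}]


pv = StringPromptValue("What should I pack for a hike?")
print(pv.to_string())    # plain text for an LLM
print(pv.to_messages())  # message list for a chat model
```

The same prompt object can thus feed either kind of model, which is why chains do not need to care whether they wrap an LLM or a chat model.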

Prompt Templates: Dynamic Prompt Construction

PromptTemplate objects in LangChain are designed to create PromptValues dynamically. The PromptValue passed to the model is not hard-coded; it is generated from user input, variable information drawn from multiple sources, and a fixed template string. The PromptTemplate object exposes a method that accepts input variables and returns a PromptValue, embracing dynamic prompt construction and user-centric interactivity.
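In spirit, a PromptTemplate is just a fixed template string plus a declared list of input variables, filled in at call time. The following is an illustrative sketch of that idea, not LangChain's actual code:

```python
# Sketch of the PromptTemplate idea: a fixed template string plus declared
# input variables, formatted into a concrete prompt at call time.
# Illustrative only; LangChain's real class does more (validation, partials).

class SimplePromptTemplate:
    def __init__(self, template: str, input_variables: list):
        self.template = template
        self.input_variables = input_variables

    def format(self, **kwargs) -> str:
        # Fail loudly if a declared variable was not supplied.
        missing = set(self.input_variables) - set(kwargs)
        if missing:
            raise KeyError(f"Missing input variables: {missing}")
        return self.template.format(**kwargs)


tmpl = SimplePromptTemplate(
    template="Question: {question}\n\nAnswer: Let's think step by step.",
    input_variables=["question"],
)
print(tmpl.format(question="How far is Manhattan from Staten Island?"))
```

Declaring the variables up front is what lets chains validate inputs before any model call is made.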

Trekking Trip to Colorado

Let's look at their usage using our favorite real-world example.

PromptValue and PromptTemplate with an LLM:


from langchain import PromptTemplate, LLMChain
from langchain.llms import OpenAI

# Assumes openai_api_key has been defined earlier.
llm = OpenAI(model_name="gpt-3.5-turbo", temperature=0.7, openai_api_key=openai_api_key)

template = """Question: {question} {dest1} {dest2}

Answer: Let's think step by step."""

# Method 1: declare the input variables explicitly and run through an LLMChain.
prompt = PromptTemplate(template=template, input_variables=["question", "dest1", "dest2"])
print(prompt.input_variables)
llm_chain = LLMChain(prompt=prompt, llm=llm)

question = "what are different ways to travel between these two destinations?"
dest1 = "Staten Island"
dest2 = "Manhattan"
llm_chain.run(question=question, dest1=dest1, dest2=dest2)

# Method 2: infer the input variables with from_template and call the LLM directly.
prompt = PromptTemplate.from_template(template)
llm(prompt.format(question=question, dest1=dest1, dest2=dest2))

We import three classes: PromptTemplate, LLMChain, and OpenAI. We then use two different methods to answer a dynamic question, asking the LLM to think step by step.

In the first method, the template string is fixed and the values of "question", "dest1", and "dest2" are supplied at run time through an LLMChain. In the second method, a PromptTemplate is created with the from_template method, which infers the input variables from the template string, and the formatted prompt is passed to the LLM directly.
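The convenience of from_template is that the input variables are inferred from the brace placeholders in the template string. The standard library's string.Formatter shows how that kind of inference can work (a sketch of the idea, not LangChain's implementation):

```python
from string import Formatter

def infer_input_variables(template: str) -> list:
    # Collect every {name} placeholder that appears in the template, in order.
    return [field for _, field, _, _ in Formatter().parse(template) if field]

template = """Question: {question} {dest1} {dest2}

Answer: Let's think step by step."""

print(infer_input_variables(template))  # ['question', 'dest1', 'dest2']
```

This is why the second method needs no input_variables argument: the template string already carries that information.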

Prompts and Prompt Templates with Chat Models:

Next, we make interesting use of PromptTemplates in a chat model and develop a conversation. We import the ChatOpenAI module and the prompt-related templates, then construct three message templates: System, Human, and AI.

from langchain import LLMChain
from langchain.chat_models import ChatOpenAI
from langchain.prompts import (
    ChatPromptTemplate,
    SystemMessagePromptTemplate,
    AIMessagePromptTemplate,
    HumanMessagePromptTemplate,
)

chat = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0.9, openai_api_key=openai_api_key)

# Three message templates: system sets the persona, AI carries prior responses,
# human carries the new request.
system_template = "You are an expert adviser on Hiking Activities in the {place}"
system_message_prompt = SystemMessagePromptTemplate.from_template(system_template)
human_template = "{request}"
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)
ai_template = "{response}"
ai_message_prompt = AIMessagePromptTemplate.from_template(ai_template)

chat_prompt = ChatPromptTemplate.from_messages([system_message_prompt, ai_message_prompt, human_message_prompt])
llm_chain = LLMChain(prompt=chat_prompt, llm=chat)

# Each turn feeds the accumulated responses back in as the AI message.
request = "I am planning a hiking trip to Colorado. Please generate a bulleted checklist of 3 top items to prepare for this trip"
response1 = llm_chain.run(place="USA", response="", request=request)

request = "Please provide specific additional relevant questions that I should be asking not covered in your previous response, and don't give reasons."
response2 = llm_chain.run(place="USA", response=response1, request=request)

request = "Why should I be asking those questions?"
response3 = llm_chain.run(place="USA", response=response1 + response2, request=request)

request = "Include these questions and regenerate the original checklist"
response4 = llm_chain.run(place="USA", response=response1 + response2 + response3, request=request)
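The pattern in these four calls is a simple fold: each turn passes the concatenation of all previous responses back in through the response variable. A hypothetical stand-in for llm_chain.run (fake_run below is not a LangChain function) makes the accumulation visible without any API calls:

```python
# Sketch of the accumulation pattern above. fake_run is a hypothetical
# stand-in for llm_chain.run that just echoes what it was given.

def fake_run(place: str, response: str, request: str) -> str:
    return f"answer({request!r} given {len(response)} chars of context)"

requests = [
    "Generate a checklist",
    "What else should I ask?",
    "Why those questions?",
    "Regenerate the checklist",
]

context = ""
for request in requests:
    reply = fake_run(place="USA", response=context, request=request)
    context += reply  # fold this turn's reply into the next turn's context
    print(reply)
```

Concatenating raw strings works for a short exchange like this; LangChain's Memory components (covered later in the series) handle the same job more systematically.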

Conclusion

In conclusion, the architecture of LangChain demonstrates a highly dynamic and user-friendly approach to programming language models. Central to this is the PromptValue class, which serves as an abstraction layer representing the input to a language model, and the PromptTemplate, which constructs a PromptValue from variable information drawn from multiple sources and a fixed template string. Together, they allow queries that are rarely hard-coded and instead assembled from several components, reflecting the principles of dynamic prompt construction and user-centric interactivity.

The examples demonstrated highlight the flexibility and utility of these classes in both LLMs and chat models. Whether providing a dynamic approach to answering a step-by-step question or simulating a conversation with an AI on hiking advice, the PromptValue and PromptTemplate classes prove to be versatile tools for interaction. They enable the user to dictate the flow of interaction by providing dynamic inputs and control over the output’s context.

In the next set of articles, we will cover other key concepts, starting with Part 5 on Example Selectors and Output Parsers, followed by Indexes, Memory, Chains, and Agents.

The GitHub Python code will be shared after the conclusion of the series.


Shishir Singh

Digital Assets, Blockchains, DLTs, Tokenization & Protocols & AI Intersection