Introduction
In this article, we will cover Agents and their types in LangChain. Before going deep into Agents, let's understand what exactly LangChain and Agents are.
What is LangChain?
LangChain is a framework for building applications powered by large language models (LLMs). Two of its building blocks are central to this article:
- Agents: An agent uses an LLM to decide which actions to take, letting a program interact with the real world. LangChain provides a number of different agent types.
- Tools: Tools are functions an agent can call to take those actions, such as a calculator or a Wikipedia search. LangChain ships many ready-made tools.
For more detail on LangChain, read the article Getting Started With LangChain.
What are Agents?
Agents in LangChain are built to interact with the real world. Given a user request, an agent uses an LLM to choose which tools to call and in what order, which makes it possible to automate tasks such as answering questions, generating text, translating languages, and summarizing text.
Types of Agents in LangChain
To decide which actions to take and in what order, agents use an LLM.
1. Zero-shot ReAct
The Zero-shot ReAct agent uses the ReAct (Reason + Act) framework to decide which tool to use based solely on each tool's description, with no memory of previous interactions. "Zero-shot" means each request is handled as a single, self-contained interaction: the agent can work with tools it has never seen examples for, as long as they are well described.
Code
from langchain.agents import initialize_agent, load_tools, AgentType
from langchain.llms import OpenAI
llm = OpenAI(openai_api_key="your_api_key")
tools = load_tools(["wikipedia", "llm-math"], llm=llm)
agent = initialize_agent(tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True)
output_1 = agent.run("4 + 5 is ")
output_2 = agent.run("when you add 4 and 5 the result comes 10.")
print(output_1)
print(output_2)
In the above code, the LangChain library is imported and the OpenAI key is passed via openai_api_key. The agent is initialized with the Wikipedia and llm-math tools and the agent type ZERO_SHOT_REACT_DESCRIPTION. Two prompts are then run independently to show that each call is a one-off interaction with no shared memory between them.
Output
2. Conversational ReAct
This agent is designed for conversational settings: its prompt makes it helpful and conversational. It uses the ReAct framework to decide which tool to use, and uses memory to remember previous interactions in the conversation.
Code
from langchain.agents import initialize_agent, load_tools
from langchain.llms import OpenAI
from langchain.memory import ConversationBufferMemory
llm = OpenAI(openai_api_key="...")
tools = load_tools(["llm-math"], llm=llm)
memory = ConversationBufferMemory(memory_key="chat_history")
conversational_agent = initialize_agent(
    agent='conversational-react-description',
    tools=tools,
    llm=llm,
    verbose=True,
    max_iterations=3,
    memory=memory,
)
output_1 = conversational_agent.run("when you add 4 and 5 the result comes 10.")
output_2 = conversational_agent.run("4 + 5 is ")
print(output_1)
print(output_2)
The above code imports the LangChain library, sets the OpenAI key, loads the llm-math tool for mathematical operations, and sets up a ConversationBufferMemory. This memory stores the messages from previous turns, so the second prompt can refer back to what was said in the first.
Output
3. ReAct Docstore
To communicate with a docstore, this agent uses the ReAct framework. It requires exactly two tools, and they must be named Search and Lookup: the Search tool retrieves a document, while the Lookup tool looks up a term in the most recently found document.
Code
from langchain.agents import initialize_agent, Tool
from langchain.llms import OpenAI
from langchain import Wikipedia
from langchain.agents.react.base import DocstoreExplorer
llm = OpenAI(openai_api_key="...")
docstore = DocstoreExplorer(Wikipedia())
tools = [
    Tool(name="Search", func=docstore.search, description="useful for when you need to ask with search"),
    Tool(name="Lookup", func=docstore.lookup, description="useful for when you need to ask with lookup"),
]
react_agent = initialize_agent(tools, llm, agent="react-docstore")
# The agent picks out keywords from the prompt, then uses Search and Lookup
print(react_agent.run("Full name of Narendra Modi is Narendra Damodardas Modi?"))
print(react_agent.run("Full name of Narendra Modi is Narendra Damodardas Modi."))
The above code imports the necessary modules from LangChain and initializes an OpenAI language model (llm) with an API key. It also sets up a document store explorer (docstore) backed by Wikipedia. Two tools with the required names, Search and Lookup, are then defined: Search retrieves a document, and Lookup finds a term within the most recently retrieved document.
Output
4. Self-ask with Search
This agent uses a single tool, which must be named Intermediate Answer. It answers a question by breaking it into simpler follow-up questions, using the tool to answer each intermediate step, and then composing the final answer.
Code
from langchain.agents import initialize_agent, Tool
from langchain.llms import OpenAI
from langchain import Wikipedia
llm = OpenAI(openai_api_key="...")
wikipedia = Wikipedia()
tools = [
    Tool(
        name="Intermediate Answer",
        func=wikipedia.search,
        description="wikipedia search",
    )
]
agent = initialize_agent(
    tools=tools,
    llm=llm,
    agent="self-ask-with-search",
    verbose=True,
)
print(agent.run("what is the capital of Japan?"))
This code imports the necessary modules from the LangChain library and sets up a self-ask-with-search agent. The agent is given a single tool with the required name Intermediate Answer, which performs a Wikipedia search for each intermediate question.
Output
Summary
This article explained agents in the LangChain library, covering the main agent types along with a code example for each.
FAQs
Q. What is the difference between chain and agent in LangChain?
A. The main difference is that agents use a language model to decide which actions to take, while chains execute a fixed sequence of steps defined by the developer. An agent generates its responses based on the user's input and the available tools; a chain defines a predetermined input/output pipeline, whereas an agent allows a step-by-step thought process.
Q. What does verbose=True do?
A. The verbose option makes the agent print detailed processing information as it runs, including its intermediate thoughts, tool calls, and observations.
Q. What is the temperature in LangChain?
A. By default, LangChain creates the chat model with a temperature value of 0.7. The temperature parameter adjusts the randomness of the output. Higher values like 0.7 will make the output more random, while lower values like 0.2 will make it more focused and deterministic.