LangGraph is a library built on top of LangChain that makes it easier to create cyclical graphs, which agent runtimes often need.
One of LangChain's big value propositions is the ability to easily create custom chains (sometimes called flow engineering). By combining LangGraph with LangChain agents, agents can be both directed and cyclic.
A Directed Acyclic Graph (DAG) is a type of graph used in computer science and mathematics. Here’s a simple explanation:
Directed: Each connection (or edge) between nodes (or vertices) has a direction, like a one-way street. It shows which way you can go from one node to another.
Acyclic: It doesn’t have any cycles. This means if you start at one node and follow the directions, you can never return to the same node. There’s no way to get stuck in a loop.
Imagine it as a family tree or a flowchart where you can only move forward and never return to the same point you started from.
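The "no way back" property of a DAG can be checked mechanically. The sketch below uses Kahn's topological-sort algorithm (standard-library Python only; the example graphs are illustrative): a graph is acyclic exactly when every node can be placed in a forward-only order.

```python
from collections import deque

def is_acyclic(graph: dict[str, list[str]]) -> bool:
    """Kahn's algorithm: a graph is a DAG iff a full topological order exists."""
    indegree = {node: 0 for node in graph}
    for targets in graph.values():
        for t in targets:
            indegree[t] = indegree.get(t, 0) + 1
    queue = deque(n for n, d in indegree.items() if d == 0)
    visited = 0
    while queue:
        node = queue.popleft()
        visited += 1
        for t in graph.get(node, []):
            indegree[t] -= 1
            if indegree[t] == 0:
                queue.append(t)
    # If a cycle exists, its nodes never reach indegree 0, so they are never visited.
    return visited == len(indegree)

# A flowchart-like DAG: you can only move forward.
dag = {"start": ["a", "b"], "a": ["end"], "b": ["end"], "end": []}
# Adding an edge from "end" back to "start" introduces a cycle.
cyclic = {"start": ["a"], "a": ["end"], "end": ["start"]}

print(is_acyclic(dag))     # True
print(is_acyclic(cyclic))  # False
```

The second graph is the kind of structure LangGraph allows and a plain DAG forbids: a path that loops back to an earlier node.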
A common pattern observed in developing more complex LLM applications is the introduction of cycles into the runtime. These cycles frequently use the LLM to determine the next step in the process.
A significant advantage of LLMs is their capability to perform these reasoning tasks, essentially functioning like an LLM in a for-loop. Systems employing this approach are often referred to as agents.
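The "LLM in a for-loop" idea can be sketched in plain Python with a stub standing in for the real model (`fake_llm`, `weather_tool`, and the decision strings here are all illustrative, not LangGraph or LangChain APIs):

```python
def fake_llm(messages: list[str]) -> str:
    """Stand-in for a real model: picks the next step from the last message."""
    last = messages[-1]
    if "weather" in last and "result:" not in last:
        return "call_tool:weather"
    return "final_answer"

def weather_tool(query: str) -> str:
    return "result: sunny"  # canned tool output for the sketch

def run_agent(user_input: str, max_steps: int = 5) -> list[str]:
    messages = [user_input]
    for _ in range(max_steps):  # the loop that makes this an "agent"
        decision = fake_llm(messages)
        if decision.startswith("call_tool:"):
            # Cycle back: feed the tool result to the model on the next pass.
            messages.append(weather_tool(messages[-1]))
        else:
            messages.append(decision)
            break
    return messages

print(run_agent("what is the weather?"))
# → ['what is the weather?', 'result: sunny', 'final_answer']
```

Each pass through the loop asks the model what to do next; the tool-call branch is the cycle that a DAG cannot express.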
However, looping agents often require granular control at various stages.
Developers might need to ensure that an agent always calls a specific tool first, or want more control over how tools are utilised.
Additionally, they may want to use different prompts for the agent depending on its current state.
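A prompt-per-state lookup can be sketched in a few lines of plain Python (the state names and prompt strings below are illustrative, not part of any LangGraph API):

```python
# Hypothetical mapping from agent state to system prompt.
PROMPTS = {
    "planning": "Break the task into steps.",
    "tool_use": "Choose the best tool and its arguments.",
    "summarise": "Summarise the results for the user.",
}

def prompt_for(state: str) -> str:
    # Fall back to a generic prompt for unrecognised states.
    return PROMPTS.get(state, "You are a helpful assistant.")

print(prompt_for("planning"))  # Break the task into steps.
print(prompt_for("unknown"))   # You are a helpful assistant.
```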
At its core, LangGraph provides a streamlined interface built on top of LangChain.
LangGraph is framework-agnostic, with each node functioning as a regular Python function.
It extends the core Runnable API (a shared interface for streaming, async, and batch calls) to facilitate:
- Seamless state management across multiple conversation turns or tool calls
- Flexible routing between nodes based on dynamic criteria
- Smooth transitions between LLMs and human intervention
- Persistence for long-running, multi-session applications
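The routing idea from the list above can be sketched without LangGraph at all: nodes as plain functions that read shared state and name their successor (the node names, state shape, and `"END"` sentinel here are illustrative, not LangGraph's actual API):

```python
# Each node takes the state and returns (updated state, name of next node).
def classify(state: dict) -> tuple[dict, str]:
    needs_tool = "?" in state["messages"][-1]  # toy routing criterion
    return state, ("tool" if needs_tool else "respond")

def tool(state: dict) -> tuple[dict, str]:
    state["messages"].append("tool output")
    return state, "respond"

def respond(state: dict) -> tuple[dict, str]:
    state["messages"].append("final answer")
    return state, "END"

NODES = {"classify": classify, "tool": tool, "respond": respond}

def run(state: dict, entry: str = "classify") -> dict:
    node = entry
    while node != "END":  # follow edges until the sentinel finish node
        state, node = NODES[node](state)
    return state

print(run({"messages": ["what is 2+2?"]})["messages"])
# → ['what is 2+2?', 'tool output', 'final answer']
```

LangGraph's `StateGraph` plays the role of the `NODES` table and `run` loop here, with the added machinery for streaming, persistence, and typed state updates.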
Below is a working LangChain chatbot based on an Anthropic model. The base code is adapted from LangChain's example code in their cookbook.
%%capture --no-stderr
%pip install -U langgraph langsmith  # langsmith is used for this tutorial; not a requirement for LangGraph
%pip install -U langchain_anthropic
#################################
import getpass
import os
def _set_env(var: str):
    if not os.environ.get(var):
        os.environ[var] = getpass.getpass(f"{var}: ")
_set_env("ANTHROPIC_API_KEY")
#################################
from typing import Annotated
from typing_extensions import TypedDict
from langgraph.graph import StateGraph
from langgraph.graph.message import add_messages
class State(TypedDict):
    # Messages have the type "list". The `add_messages` function
    # in the annotation defines how this state key should be updated
    # (in this case, it appends messages to the list, rather than overwriting them).
    messages: Annotated[list, add_messages]
graph_builder = StateGraph(State)
#################################
from langchain_anthropic import ChatAnthropic
llm = ChatAnthropic(model="claude-3-haiku-20240307")
def chatbot(state: State):
    return {"messages": [llm.invoke(state["messages"])]}
# The first argument is the unique node name
# The second argument is the function or object that will be called whenever
# the node is used.
graph_builder.add_node("chatbot", chatbot)
#################################
graph_builder.set_entry_point("chatbot")
#################################
graph_builder.set_finish_point("chatbot")
#################################
graph = graph_builder.compile()
#################################
from IPython.display import Image, display
try:
    display(Image(graph.get_graph().draw_mermaid_png()))
except Exception:
    # This requires some extra dependencies and is optional
    pass
#################################
while True:
    user_input = input("User: ")
    if user_input.lower() in ["quit", "exit", "q"]:
        print("Goodbye!")
        break
    # `add_messages` expects a list of messages, so wrap the (role, content) tuple.
    for event in graph.stream({"messages": [("user", user_input)]}):
        for value in event.values():
            print("Assistant:", value["messages"][-1].content)
#################################
Below is the rendered graph showing the flow.