01 Jul


LangGraph is a fairly recent addition to the ever-expanding LangChain ecosystem. With the launch of LangGraph Cloud, a managed, hosted service is now available for deploying and hosting LangGraph applications.

The LangChain ecosystem is evolving at a rapid pace, combining Open Source Software (OSS) with commercial products. The commercial products include LangSmith and LangGraph Cloud.


We are all starting to realise that Agentic Applications will become a standard in the near future. The advantages of Agents are numerous; to name a few:

  1. Agents can handle complex, ambiguous and more implicit user queries in an automated fashion.
  2. Underpinning agents is the capability to create a chain of events on the fly, based on the task assigned by the user.
  3. Agents make use of an LLM, which acts as the backbone of the agent.
  4. When the agent receives a user query, it decomposes the task into sub-tasks, which are then executed in a sequential fashion.
  5. One or more tools are made available to the agent, which it can employ as it deems fit. The agent decides which tool to use based on the tool description that forms part of each tool.
  6. A tool is a unit of capability, covering tasks like web search, mathematics, API calls and more.
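The tool-selection step above can be sketched in plain Python. This is a conceptual illustration, not LangChain's actual API: the tool names and the keyword-overlap heuristic are stand-ins for what an LLM would do when reading each tool's description.

```python
# Conceptual sketch (not LangChain's actual API): an agent chooses a tool
# by matching the user's task against each tool's description.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    name: str
    description: str            # the agent decides from this text
    run: Callable[[str], str]

# Hypothetical tools; in a real agent the backbone LLM reads the descriptions.
tools = [
    Tool("web_search", "search the web for current information",
         lambda q: f"results for {q!r}"),
    Tool("calculator", "evaluate arithmetic expressions",
         lambda expr: str(eval(expr))),
]

def pick_tool(task: str) -> Tool:
    # Stand-in for LLM reasoning: naive word overlap with each description.
    def overlap(tool: Tool) -> int:
        return len(set(task.lower().split()) & set(tool.description.split()))
    return max(tools, key=overlap)

print(pick_tool("evaluate 2 + 2").name)           # calculator
print(pick_tool("search the web for news").name)  # web_search
```

In a production agent the LLM performs this choice (and the task decomposition) itself; the point here is only that each tool's description is the signal the agent selects on.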

Impediments to Agent adoption, and sources of apprehension, include:

  1. LLM inference cost. The backbone LLM is queried multiple times during the course of a single query; should an agent have a large number of users, inference costs can skyrocket.
  2. Controllability, inspectability, observability and more granular control are much needed. There is a fear in the market that agents are too autonomous.
  3. Agents broke the glass ceiling of chatbots, but by a little too much; some measure of control is now required.
  4. For more complex agents, decreasing latency requires running tasks in parallel, and streaming not only LLM responses but also agent responses as they become available.
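The latency point above can be illustrated with standard `asyncio`. This is a minimal sketch under assumed conditions: the sub-tasks (`web_search` calls here) are hypothetical stand-ins, and the sleeps simulate I/O latency; the pattern shows independent sub-tasks running in parallel and each result being streamed out as soon as it completes, rather than after the whole run finishes.

```python
# Sketch: run agent sub-tasks in parallel and stream each result as it
# becomes available, instead of waiting for the slowest one.
import asyncio

async def web_search(query: str) -> str:
    await asyncio.sleep(0.02)   # simulated I/O latency
    return f"search results for {query!r}"

async def run_agent(queries):
    tasks = [asyncio.create_task(web_search(q)) for q in queries]
    # Yield each sub-task's result as soon as it completes.
    for finished in asyncio.as_completed(tasks):
        yield await finished

async def main():
    partials = []
    async for partial in run_agent(["langgraph", "agents"]):
        partials.append(partial)   # a UI would render these incrementally
    return partials

print(asyncio.run(main()))
```

With sequential execution the total latency would be the sum of the sub-task latencies; with this pattern it approaches the maximum of them.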

LangGraph is framework-agnostic, with each node functioning as a regular Python function.

It extends the core Runnable API (a shared interface for streaming, async, and batch calls) to facilitate:

  1. Seamless state management across multiple conversation turns or tool calls.
  2. Flexible routing between nodes based on dynamic criteria.
  3. Smooth transitions between LLMs and human intervention.
  4. Persistence for long-running, multi-session applications.
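The first two points — shared state and dynamic routing — can be sketched without the library. This is a minimal illustration of the pattern, not LangGraph's actual API: the node names (`draft`, `review`) and the routing rules are invented for the example.

```python
# Pattern sketch (not LangGraph's API): nodes are plain Python functions
# over a shared state dict, and a router chooses the next node from the
# current state until it routes to "end".
def draft(state):
    state["answer"] = f"draft answer to {state['question']!r}"
    return state

def review(state):
    state["approved"] = "draft" in state["answer"]  # stand-in for a real check
    return state

def route(state):
    # Flexible routing: the next node is picked from dynamic criteria.
    if "answer" not in state:
        return "draft"
    if not state.get("approved"):
        return "review"
    return "end"

def run_graph(state, nodes):
    while (name := route(state)) != "end":
        state = nodes[name](state)   # state persists across node calls
    return state

final = run_graph({"question": "What is LangGraph?"},
                  {"draft": draft, "review": review})
print(final["approved"])
```

LangGraph wraps this same idea — state, nodes, conditional edges — in its `StateGraph` abstraction, adding checkpointing for the persistence and human-in-the-loop features listed above.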

Below, the basic personal workflow is shown. A user develops their LangGraph application in their IDE of choice, then pushes the code to GitHub.

From LangGraph Cloud, the GitHub code can be accessed and deployed. From LangGraph Cloud, applications can be tested, traces can be inspected, interrupts can be added, and more.



