Stateful Conversational AI Agent

This project implements a stateful conversational AI agent using the LangGraph library. Unlike standard stateless LLM calls, this agent maintains a persistent memory of the conversation across multiple user turns, enabling it to remember context (e.g., user names, previous questions) throughout the session.

Project Overview

The goal of this project is to demonstrate State Management in AI applications. It uses a graph-based architecture in which the "State" flows through nodes, allowing the application to persist data over time.

Key Features:

  • Persistent Memory: Uses MemorySaver to store conversation history in RAM for the lifetime of the application process.
  • Graph Architecture: Built on LangGraph's StateGraph, defining nodes (Chatbot) and edges (Start -> Chatbot -> End).
  • Context Awareness: The bot can recall information provided in previous messages within the same session thread.
  • Streaming Responses: The application streams responses back to the user in real time.
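Streaming can be pictured as a generator of text chunks that are printed the moment they arrive rather than after the full reply is ready. The sketch below is illustrative only; `fake_llm_stream` and its fixed chunk size are stand-ins for the real LLM call, not part of the project:

```python
from typing import Iterator

def fake_llm_stream(reply: str, chunk_size: int = 8) -> Iterator[str]:
    """Stand-in for a streaming LLM call: yields the reply in small chunks."""
    for i in range(0, len(reply), chunk_size):
        yield reply[i:i + chunk_size]

def stream_to_user(reply: str) -> str:
    """Print chunks as they arrive and return the assembled reply."""
    parts = []
    for chunk in fake_llm_stream(reply):
        print(chunk, end="", flush=True)  # user sees text in real time
        parts.append(chunk)
    print()
    return "".join(parts)
```

The same loop shape applies when iterating over real streamed model output: handle each chunk immediately, and assemble the full text only if you need it afterwards.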

Prerequisites

  • Python 3.10+
  • Groq API Key (or OpenAI API Key)

Setup & Installation

  1. Clone the Repository
    git clone https://github.com/shivamgravity/langgraph
    cd langgraph
  2. Set Up Virtual Environment
    # create the virtual environment
    python -m venv venv
    
    # activate the virtual environment (Windows)
    venv\Scripts\activate
    
    # activate the virtual environment (Mac/Linux)
    source venv/bin/activate
  3. Install Dependencies
    pip install -r requirements.txt
  4. Configure Environment Variables: Create a .env file in the root directory and add your API key.
    GROQ_API_KEY=your_api_key_here
    

How to Run

  1. Start the Agent: Run the main Python script.
    python agent.py
    Note: run this command with the virtual environment activated.
  2. Interact: The bot will initialize and greet you. You can type messages in the terminal.
    • Example Flow:
      • You: "Hi, my name is Aditya."
      • Bot: "Hello Aditya..."
      • You: "What is my name?"
      • Bot: "Your name is Aditya." (This confirms that state persists across turns.)
  3. Exit: Type quit or exit to end the session.
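The example flow above works because the bot answers from the full stored history, not just the latest message. A toy, LLM-free illustration of that idea (the `toy_bot` rules and regexes below are invented purely for demonstration):

```python
import re

def toy_bot(history: list) -> str:
    """Answers using the whole message history, not just the last turn."""
    last = history[-1]
    if "what is my name" in last.lower():
        # scan earlier turns for a remembered name
        for turn in history:
            m = re.search(r"my name is (\w+)", turn, re.IGNORECASE)
            if m:
                return f"Your name is {m.group(1)}."
        return "I don't know your name yet."
    m = re.search(r"my name is (\w+)", last, re.IGNORECASE)
    if m:
        return f"Hello {m.group(1)}!"
    return "Tell me more."

history = []
for user_msg in ["Hi, my name is Aditya.", "What is my name?"]:
    history.append(user_msg)
    history.append(toy_bot(history))
```

Drop the shared `history` list (i.e., go stateless) and the second question becomes unanswerable — which is exactly the gap the checkpointer closes in the real agent.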

Solution Architecture

This project uses a Stateful Graph architecture powered by LangGraph.

  • State Schema (TypedDict): The State is defined as a dictionary containing a list of messages. The add_messages reducer ensures that new messages are appended to the history rather than overwriting it.
  • Nodes:
    • chatbot Node: This function invokes the LLM (Llama-3 via Groq). It takes the current state (message history) as input and returns a new message from the AI.
  • Edges:
    • START -> chatbot: The entry point of the graph.
    • chatbot -> END: The graph finishes after generating a response (waiting for the next user input to trigger a new run).
  • Checkpointer (MemorySaver): This component is critical for persistence. It saves the state of the graph after every step. When a new message arrives with the same thread_id, LangGraph loads the previous state, effectively giving the bot "memory."
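The mechanics described above can be sketched in plain Python. This is a stdlib-only simulation of the concepts — an appending reducer, a chatbot node, and a checkpoint store keyed by thread_id — not the actual LangGraph API, and the function names below are assumptions made for illustration:

```python
from typing import Callable

# Reducer: append new messages instead of overwriting, mirroring the role
# LangGraph's add_messages reducer plays for the "messages" channel.
def add_messages(existing: list, new: list) -> list:
    return existing + new

# Checkpointer: maps thread_id -> last saved state, mirroring MemorySaver's
# job of reloading prior state when the same thread_id appears again.
checkpoints: dict = {}

def run_graph(thread_id: str, user_msg: str, chatbot: Callable) -> str:
    state = checkpoints.get(thread_id, {"messages": []})           # load prior state
    state["messages"] = add_messages(state["messages"], [("user", user_msg)])
    reply = chatbot(state["messages"])                             # chatbot node
    state["messages"] = add_messages(state["messages"], [("ai", reply)])
    checkpoints[thread_id] = state                                 # save after the step
    return reply

# A stub "LLM" that just reports how many messages it can see.
echo = lambda msgs: f"I can see {len(msgs)} message(s) so far."

run_graph("thread-1", "Hi!", echo)           # fresh thread: history starts empty
run_graph("thread-1", "Still there?", echo)  # same thread: history has grown
run_graph("thread-2", "New chat", echo)      # different thread: fresh state
```

Two invocations with the same thread_id see an ever-growing history; a new thread_id starts from an empty state — the same behavior the real checkpointer provides.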
