This project implements a stateful conversational AI agent using the LangGraph library. Unlike standard stateless LLM calls, this agent maintains a persistent memory of the conversation across multiple user turns, enabling it to remember context (e.g., user names, previous questions) throughout the session.
The goal of this project is to demonstrate State Management in AI applications. It uses a graph-based architecture where the "State" flows through nodes, allowing the application to persist data over time.
Key Features:
- Persistent Memory: Uses `MemorySaver` to store conversation history in RAM during the runtime of the application.
- Graph Architecture: Built on LangGraph's `StateGraph`, defining nodes (Chatbot) and edges (Start -> Chatbot -> End).
- Context Awareness: The bot can recall information provided in previous messages within the same session thread.
- Streaming Responses: The application streams responses back to the user in real time.
- Python 3.10+
- Groq API Key (or OpenAI API Key)
- Clone the Repository
```
git clone https://github.com/shivamgravity/langgraph
cd langgraph
```

- Set Up Virtual Environment

```
# create the virtual environment
python -m venv venv

# activate the virtual environment (Windows)
venv\Scripts\activate

# activate the virtual environment (Mac/Linux)
source venv/bin/activate
```
- Install Dependencies
```
pip install -r requirements.txt
```
- Configure Environment Variables: Create a `.env` file in the root directory and add your API key:

```
GROQ_API_KEY=your_api_key_here
```
- Start the Agent: Run the main python script.
Note: run this command inside the activated virtual environment.

```
python agent.py
```
- Interact: The bot will initialize and greet you. You can type messages in the terminal.
- Example Flow:
- You: "Hi, my name is Aditya."
- Bot: "Hello Aditya..."
- You: "What is my name?"
- Bot: "Your name is Aditya." (This confirms the state persistence)
- Exit: Type `quit` or `exit` to end the session.
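The setup and interaction steps above can be sketched in plain Python. Everything in this snippet is a simplified, hypothetical stand-in: `load_env` mimics reading the `.env` file (real projects typically use python-dotenv's `load_dotenv()`), and `respond` echoes input where the real agent would invoke the LangGraph pipeline. The loop mirrors the `quit`/`exit` behavior described above.

```python
import os

# Hypothetical minimal .env loader (assumption: simple KEY=value lines,
# no quoting or multi-line values).
def load_env(path=".env"):
    try:
        with open(path) as f:
            for line in f:
                line = line.strip()
                if line and not line.startswith("#") and "=" in line:
                    key, _, value = line.partition("=")
                    os.environ.setdefault(key.strip(), value.strip())
    except FileNotFoundError:
        pass  # fall back to variables already set in the environment

# Hypothetical stand-in for the compiled agent's reply.
def respond(text, history):
    history.append(("user", text))
    reply = f"echo: {text}"  # a real agent would call the LLM here
    history.append(("ai", reply))
    return reply

def chat_loop(inputs):
    """Run a scripted session; 'quit' or 'exit' ends it, as described above."""
    history, replies = [], []
    for text in inputs:
        if text.strip().lower() in ("quit", "exit"):
            break
        replies.append(respond(text, history))
    return replies, history

# scripted inputs instead of input() so the sketch runs anywhere
replies, history = chat_loop(["Hi, my name is Aditya.", "quit"])
print(len(replies), len(history))  # 1 2
```

A real `agent.py` would replace `respond` with a call into the compiled graph and read lines from the terminal with `input()`.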
This project uses a Stateful Graph architecture powered by LangGraph.
- State Schema (`TypedDict`): The `State` is defined as a dictionary containing a list of `messages`. The `add_messages` reducer ensures that new messages are appended to the history rather than overwriting it.
- Nodes: The `chatbot` node invokes the LLM (Llama-3 via Groq). It takes the current state (message history) as input and returns a new message from the AI.
- Edges: `START -> chatbot` is the entry point of the graph; `chatbot -> END` ends the run after generating a response (the next user input triggers a new run).
- Checkpointer (`MemorySaver`): This component is critical for persistence. It saves the state of the graph after every step. When a new message arrives with the same `thread_id`, LangGraph loads the previous state, effectively giving the bot "memory."
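To make the pieces above concrete, here is a self-contained toy version of the same architecture. Every name in it (the `add_messages` reducer, the `chatbot` node, `ToyMemorySaver`, `invoke`) is a simplified stand-in written for illustration, not LangGraph's actual implementation; it only mirrors the concepts: a reducer that appends, a single node, a `START -> chatbot -> END` edge map, and per-`thread_id` checkpointing.

```python
from typing import Annotated, TypedDict

def add_messages(existing: list, new: list) -> list:
    # toy reducer: append new messages instead of overwriting the history
    return existing + new

class State(TypedDict):
    messages: Annotated[list, add_messages]

def chatbot(state: State) -> State:
    # hypothetical node: a real agent would call Llama-3 via Groq here
    last = state["messages"][-1][1]
    reply = ("ai", f"echo: {last}")
    return {"messages": add_messages(state["messages"], [reply])}

# the graph: START -> chatbot -> END
START, END = "__start__", "__end__"
EDGES = {START: "chatbot", "chatbot": END}
NODES = {"chatbot": chatbot}

class ToyMemorySaver:
    """Saves graph state per thread_id, like MemorySaver (RAM only)."""
    def __init__(self):
        self._store: dict[str, State] = {}

    def load(self, thread_id: str) -> State:
        return self._store.get(thread_id, {"messages": []})

    def save(self, thread_id: str, state: State) -> None:
        self._store[thread_id] = state

def invoke(saver: ToyMemorySaver, thread_id: str, text: str) -> State:
    state = saver.load(thread_id)  # restore history for this thread
    state = {"messages": add_messages(state["messages"], [("user", text)])}
    node = EDGES[START]
    while node != END:             # walk START -> chatbot -> END
        state = NODES[node](state)
        node = EDGES[node]
    saver.save(thread_id, state)   # checkpoint after the run
    return state

saver = ToyMemorySaver()
invoke(saver, "thread-1", "Hi, my name is Aditya.")
state = invoke(saver, "thread-1", "What is my name?")
print(len(state["messages"]))  # 4: two turns, each with a user + ai message
```

Because `invoke` reloads the saved state for `"thread-1"` before running the graph, the second call sees the first turn's messages, which is exactly the mechanism that lets the real bot answer "Your name is Aditya."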