Give your LangGraph agents persistent semantic memory that survives across sessions. Store data in cognee’s knowledge graph and retrieve it via natural language—no manual state management required.

Why Use This Integration

  • Cross-Session Memory: Context persists across agent instances and conversation sessions
  • Semantic Search: Retrieve information using natural language queries
  • Session Isolation: Multi-tenant support with per-user data separation
  • Zero Setup: Works with LangGraph’s create_react_agent out of the box

Installation

pip install langgraph-cognee

Quick Start

Add memory tools to your LangGraph agent:
from langgraph.prebuilt import create_react_agent
from langgraph_cognee import get_sessionized_cognee_tools
from langchain_core.messages import HumanMessage

# Get memory tools
add_tool, search_tool = get_sessionized_cognee_tools()

# Create agent with memory
agent = create_react_agent(
    "openai:gpt-4o-mini",
    tools=[add_tool, search_tool],
)

# Store and retrieve information
response = agent.invoke({
    "messages": [
        HumanMessage(content="Remember: Acme Corp, healthcare, $1.2M contract"),
        HumanMessage(content="What healthcare contracts do we have?")
    ],
})

Cross-Session Persistence

Memory persists across different agent instances:
# Session 1: Store information
agent_1 = create_react_agent(
    "openai:gpt-4o-mini",
    tools=get_sessionized_cognee_tools(),
)
agent_1.invoke({
    "messages": [HumanMessage(content="I'm working on authentication")]
})

# Session 2: Different instance, same memory
agent_2 = create_react_agent(
    "openai:gpt-4o-mini", 
    tools=get_sessionized_cognee_tools(),
)
response = agent_2.invoke({
    "messages": [HumanMessage(content="What was I working on?")]
})
# The response recalls the authentication work stored in session 1

Custom Session IDs

Control session isolation with custom session IDs:
# User-specific memory
user_tools = get_sessionized_cognee_tools(session_id="user_123")

# Org-specific memory
org_tools = get_sessionized_cognee_tools(session_id="org_acme")
Each session maintains separate memory clusters while allowing global data access when needed.

How It Works

  1. Add Tool: Stores data in cognee’s knowledge graph with embeddings
  2. Search Tool: Retrieves relevant information via semantic search
  3. Auto-Processing: cognee extracts entities, relationships, and context automatically
  4. Session Scoping: Data is isolated by session but globally accessible
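
The steps above can be sketched end to end as a toy pipeline. Here naive keyword extraction stands in for cognee's entity/relationship extraction and embeddings, and word overlap stands in for semantic search; everything below is illustrative, not the library's real internals:

```python
# Toy sketch of the add -> auto-process -> search flow.
# Real cognee builds a knowledge graph with embeddings; this uses word overlap.
graph: list[tuple[set[str], str]] = []  # (extracted keywords, original text)

STOPWORDS = {"the", "a", "an", "on", "in", "what", "do", "we", "have"}

def extract_keywords(text: str) -> set[str]:
    """Crude stand-in for entity extraction: lowercase words minus stopwords."""
    words = {w.strip(",.?$").lower() for w in text.split()}
    return words - STOPWORDS

def add(text: str) -> None:
    """Add Tool: store the text together with its extracted 'entities'."""
    graph.append((extract_keywords(text), text))

def search(query: str) -> list[str]:
    """Search Tool: rank stored texts by keyword overlap with the query."""
    q = extract_keywords(query)
    scored = [(len(q & kw), text) for kw, text in graph]
    return [text for score, text in sorted(scored, reverse=True) if score > 0]

add("Acme Corp, healthcare, $1.2M contract")
add("Debugging the payment flow")
print(search("What healthcare contracts do we have?"))
```

The agent never sees this machinery: it simply calls the two tools, and the processing between them is what lets a later natural-language query land on earlier stored facts.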

Use Cases

Build domain knowledge incrementally over multiple sessions:
for doc in knowledge_base:
    agent.invoke({"messages": [HumanMessage(content=f"Learn: {doc}")]})

Maintain user context across work sessions:
# Monday
agent.invoke({"messages": [HumanMessage(content="Debugging payment flow")]})

# Wednesday
agent.invoke({"messages": [HumanMessage(content="What was I debugging?")]})

Isolate data per user/organization while sharing global knowledge:
# Per-user isolation (user_id comes from your auth/session layer)
user_tools = get_sessionized_cognee_tools(session_id=user_id)
agent = create_react_agent("openai:gpt-4o-mini", tools=user_tools)
