Connect your own Python LLM agent to Cognee MCP to give it persistent knowledge graph memory. The mcp Python SDK lets you call all Cognee MCP tools programmatically — no IDE or chat client required.
Prefer the v1.0 memory tools (remember, recall, forget_memory, improve) for new agent integrations. The legacy tools (cognify, search, delete) are still available when you need lower-level control.
Prerequisites
- Python 3.10+
- LLM_API_KEY environment variable set (OpenAI key or equivalent)
- mcp package installed: pip install "mcp>=1.12.0"
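Before wiring anything up, a quick preflight script can confirm these prerequisites are met. This is a minimal sketch that only inspects the local environment; it does not contact any server:

```python
import os
import sys

# Check the interpreter version and that the LLM key is exported
ok_python = sys.version_info >= (3, 10)
ok_key = bool(os.environ.get("LLM_API_KEY"))
print(f"Python >= 3.10: {ok_python} | LLM_API_KEY set: {ok_key}")
```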
Connection Options
Option 1: stdio
Launch the MCP server as a subprocess. The mcp SDK handles the process lifecycle and communicates over stdin/stdout.

```shell
# Install cognee-mcp and its dependencies
pip install "mcp>=1.12.0" uv
git clone https://github.com/topoteretes/cognee.git
```
```python
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="uv",
    args=["--directory", "/path/to/cognee/cognee-mcp", "run", "cognee-mcp"],
    env={**os.environ, "LLM_API_KEY": os.environ["LLM_API_KEY"]},
)

async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Store knowledge
            await session.call_tool("remember", arguments={
                "data": "Acme Corp signed a $1.2M healthcare contract in Q1 2025.",
                "dataset_name": "sales",
            })

            # Retrieve context for your LLM prompt
            result = await session.call_tool("recall", arguments={
                "query": "healthcare contracts",
                "search_type": "GRAPH_COMPLETION",
            })
            context = result.content[0].text
            print(context)

asyncio.run(main())
```
Replace /path/to/cognee/cognee-mcp with the absolute path to the cognee-mcp directory in your cloned repository.

Option 2: HTTP
Start the MCP server separately (e.g. with Docker or from source), then connect over HTTP:

```shell
# Start the server (HTTP transport, port 8000)
docker run -e TRANSPORT_MODE=http --env-file .env -p 8000:8000 --rm -it cognee/cognee-mcp:main
```
```python
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main():
    # streamablehttp_client yields a third value (a session-ID getter),
    # which is not needed here
    async with streamablehttp_client("http://localhost:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Store knowledge
            await session.call_tool("remember", arguments={
                "data": "Acme Corp signed a $1.2M healthcare contract in Q1 2025.",
                "dataset_name": "sales",
            })

            # Retrieve context for your LLM prompt
            result = await session.call_tool("recall", arguments={
                "query": "healthcare contracts",
                "search_type": "GRAPH_COMPLETION",
            })
            context = result.content[0].text
            print(context)

asyncio.run(main())
```
For SSE transport (server started with --transport sse), use mcp.client.sse.sse_client and connect to http://localhost:8000/sse instead.
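Tool results can carry multiple content parts, so grabbing only result.content[0].text can silently drop text. A small helper can join every text part instead. This is a sketch: `tool_text` is a hypothetical name, and it assumes text parts expose a `.text` attribute, as the mcp SDK's TextContent does:

```python
def tool_text(result) -> str:
    """Join all text parts of a tool call result into one string.

    Non-text parts (anything without a .text attribute) are skipped.
    """
    return "\n".join(
        part.text for part in result.content if hasattr(part, "text")
    )
```

You would then write `context = tool_text(result)` in place of the single-part indexing shown above.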
Inject context into your LLM calls
Once you have the retrieved context string, pass it to your LLM as part of the system or user prompt:
```python
import openai

client = openai.AsyncOpenAI()

async def answer_with_memory(session, question: str) -> str:
    # Pull relevant context from the knowledge graph
    result = await session.call_tool("recall", arguments={
        "query": question,
        "search_type": "GRAPH_COMPLETION",
    })
    context = result.content[0].text

    # Feed the retrieved context to the model alongside the question
    response = await client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": f"Use this context to answer:\n\n{context}"},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content
```
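To close the loop, you can write each finished exchange back into memory with remember, so later recall calls can draw on past conversation. A sketch, where `remember_exchange` is a hypothetical helper and the "conversation" dataset name is an illustrative choice, not a required value:

```python
async def remember_exchange(session, question: str, answer: str) -> None:
    """Store a completed Q&A turn so future recalls can use it."""
    await session.call_tool("remember", arguments={
        "data": f"Q: {question}\nA: {answer}",
        # Illustrative dataset name; pick whatever scoping fits your agent
        "dataset_name": "conversation",
    })
```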
Key tools for agent context
| Tool | Purpose |
| --- | --- |
| remember | v1.0 API — store data with optional session scoping |
| recall | v1.0 API — smart retrieval with session awareness |
| forget_memory | v1.0 API — delete a dataset or wipe everything |
| improve | v1.0 API — enrich the graph and bridge session memory into permanent memory |
| cognify | Legacy tool — ingest text, files, or URLs into the knowledge graph |
| search | Legacy tool — retrieve context (GRAPH_COMPLETION, RAG_COMPLETION, CHUNKS) |
| cognify_status | Poll background indexing progress |
| prune | Reset all memory (useful in tests) |
See the Tools Reference for all available tools and parameters.
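Since legacy cognify ingestion runs in the background, an agent often needs to wait before searching newly added data. A polling loop over cognify_status can do this. This is a sketch: matching "completed"/"failed" substrings in the status text is an assumption, so inspect the raw result in your own setup:

```python
import asyncio

async def wait_for_cognify(session, poll_seconds: float = 2.0, timeout: float = 120.0) -> str:
    """Poll the cognify_status tool until indexing reports a terminal state."""
    loop = asyncio.get_running_loop()
    deadline = loop.time() + timeout
    while loop.time() < deadline:
        result = await session.call_tool("cognify_status", arguments={})
        status = result.content[0].text
        # ASSUMPTION: terminal states mention "completed" or "failed"
        if "completed" in status.lower() or "failed" in status.lower():
            return status
        await asyncio.sleep(poll_seconds)
    raise TimeoutError("cognify did not finish within the timeout")
```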
Need Help?
Join our community to get support and connect with other developers using Cognee MCP.