Documentation Index

Fetch the complete documentation index at: https://docs.cognee.ai/llms.txt

Use this file to discover all available pages before exploring further.

Cognee runs as an MCP (Model Context Protocol) server. Any MCP-compatible client (Claude Desktop, Cursor, VS Code Copilot) can connect to it as a tool provider. The MCP server can run locally or connect to your Cognee Cloud tenant.

Step 1 — Start the MCP server

Connect the MCP server to Cognee Cloud using your API Base URL and API key from the API Keys page:
cognee-mcp --transport sse --port 8001 \
  --serve-url https://your-tenant.aws.cognee.ai \
  --serve-api-key your-api-key
For local mode (no Cloud connection), omit the --serve-url and --serve-api-key flags. The server will manage its own local knowledge graph. Local mode requires an LLM_API_KEY environment variable.
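As a sketch, a local-mode startup might look like this (the key value is a placeholder for your model provider's API key; the flags shown are the ones from Step 1 minus the Cloud options):

```
# Local mode: omit the Cloud flags; the server keeps its own knowledge graph.
export LLM_API_KEY=your-llm-api-key
cognee-mcp --transport sse --port 8001
```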

Step 2 — Add to your MCP client config

Add Cognee as a tool server in your MCP client’s configuration:
{
  "mcpServers": {
    "cognee": {
      "url": "http://localhost:8001/sse"
    }
  }
}
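If your MCP client stores its configuration as a JSON file, you can also merge the entry programmatically. Below is a minimal sketch; the helper name `add_cognee_server` is illustrative, not part of any Cognee API:

```python
import json

def add_cognee_server(config: dict, url: str = "http://localhost:8001/sse") -> dict:
    """Return a copy of an MCP client config with the Cognee server entry added.

    Existing entries under "mcpServers" are preserved.
    """
    updated = dict(config)
    servers = dict(updated.get("mcpServers", {}))
    servers["cognee"] = {"url": url}
    updated["mcpServers"] = servers
    return updated

# Example: start from an empty config and print the result.
cfg = add_cognee_server({})
print(json.dumps(cfg, indent=2))
```

Merging rather than overwriting matters if the config already lists other tool servers alongside Cognee.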

Step 3 — Available tools

Once connected, your MCP client gets these tools:
Tool            Description
remember        Store data in memory (add + cognify in one step)
recall          Search memory with auto-routing
cognify         Build the knowledge graph from ingested data
search          Query the knowledge graph with different search types
improve         Enrich the knowledge graph and bridge session data
forget_memory   Delete data from memory
list_data       List datasets and their contents
For detailed setup per client, see the MCP integration guides.
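Under the hood, an MCP client invokes these tools with a standard JSON-RPC `tools/call` request. As a hedged sketch, a `remember` call on the wire might look like the following (the argument name `data` is illustrative, not taken from the Cognee tool schema):

```
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "remember",
    "arguments": {
      "data": "Cognee runs as an MCP server."
    }
  }
}
```

Your MCP client constructs these requests for you; you normally never write them by hand.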

Next steps

Cloud SDK

Connect to Cognee Cloud programmatically using the Python SDK.

Cloud functionality

Explore the full API surface available in Cognee Cloud.