Cognee runs as an MCP (Model Context Protocol) server. Any MCP-compatible client (Claude Desktop, Cursor, VS Code Copilot) can connect to it as a tool provider. The MCP server can run locally or connect to your Cognee Cloud tenant.

Documentation Index
Fetch the complete documentation index at: https://docs.cognee.ai/llms.txt
Use this file to discover all available pages before exploring further.
Step 1 — Start the MCP server
Connect the MCP server to Cognee Cloud using the API Base URL and API key from the API Keys page.

For local mode (no Cloud connection), omit the --serve-url and --serve-api-key flags; the server will then manage its own local knowledge graph. Local mode requires an LLM_API_KEY environment variable.

Step 2 — Add to your MCP client config
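A minimal client configuration might look like the sketch below. The `cognee` command name and the placeholder values are assumptions — use whatever command launches your installed Cognee MCP server, plus the API Base URL and API key from your tenant (or drop the two flags for local mode):

```json
{
  "mcpServers": {
    "cognee": {
      "command": "cognee",
      "args": [
        "--serve-url", "<your-api-base-url>",
        "--serve-api-key", "<your-api-key>"
      ]
    }
  }
}
```

Most MCP clients (Claude Desktop, Cursor) use this `mcpServers` shape, launching the server as a subprocess with the given command and arguments; check your client's documentation for the exact file location.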
Add Cognee as a tool server in your MCP client's configuration.

Step 3 — Available tools
Once connected, your MCP client gets these tools:

| Tool | Description |
|---|---|
| remember | Store data in memory (add + cognify in one step) |
| recall | Search memory with auto-routing |
| cognify | Build the knowledge graph from ingested data |
| search | Query the knowledge graph with different search types |
| improve | Enrich the knowledge graph and bridge session data |
| forget_memory | Delete data from memory |
| list_data | List datasets and their contents |
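Under the hood, a client invokes one of these tools with a standard MCP `tools/call` request. The sketch below shows what such a call for remember might look like; the argument name `data` and its value are illustrative assumptions — consult the input schema the server advertises for each tool:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "remember",
    "arguments": {
      "data": "Alice works on the search team."
    }
  }
}
```

Your MCP client constructs these requests for you; you normally never write them by hand.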
Next steps
Cloud SDK
Connect to Cognee Cloud programmatically using the Python SDK.
Cloud functionality
Explore the full API surface available in Cognee Cloud.