
Does Cognee MCP work with Cognee Cloud?

Cognee MCP and Cognee Cloud are complementary but separate systems. They are designed for different use cases and do not share the same backend connection mechanism.
|                | Cognee MCP                               | Cognee Cloud                   |
|----------------|------------------------------------------|--------------------------------|
| Where it runs  | Locally on your machine                  | Hosted by Cognee               |
| Access method  | MCP protocol                             | cogwit-sdk or Cloud UI         |
| Authentication | Bearer token (self-hosted)               | X-Api-Key header               |
| API endpoints  | /api/v1/add, /api/v1/search              | /api/add, /api/search          |
| Use case       | AI IDE tools (Cursor, Claude Code, etc.) | Cloud-managed knowledge graphs |

Why API_URL only works with self-hosted Cognee backends

The MCP server’s API_URL / API_TOKEN mechanism is designed for self-hosted Cognee backends. When the MCP server runs in API mode, it:
  • Sends requests to /api/v1/add, /api/v1/cognify, /api/v1/search
  • Authenticates with Authorization: Bearer <token>
Cognee Cloud uses a different convention:
  • Endpoint paths strip the /v1 prefix (e.g., /api/add, /api/cognify)
  • Authentication uses the X-Api-Key header, not a Bearer token
Because of these differences, pointing API_URL at the Cognee Cloud endpoint and setting API_TOKEN to your Cognee Cloud API key will not work: the MCP server will call /v1-prefixed paths and authenticate with a Bearer header, neither of which Cognee Cloud accepts.
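The mismatch can be sketched in a few lines. The paths and headers below come straight from the conventions described above; the helper function names and base URLs are illustrative assumptions, not part of either API:

```python
# Illustrative sketch of the two request conventions described above.
# Helper names and base URLs are hypothetical; paths/headers are from the docs.

def self_hosted_request(base_url: str, token: str, endpoint: str):
    """URL and headers the MCP server builds in API mode (self-hosted)."""
    return (
        f"{base_url}/api/v1/{endpoint}",        # /v1-prefixed path
        {"Authorization": f"Bearer {token}"},   # Bearer token auth
    )

def cloud_request(base_url: str, api_key: str, endpoint: str):
    """URL and headers Cognee Cloud expects instead."""
    return (
        f"{base_url}/api/{endpoint}",           # no /v1 prefix
        {"X-Api-Key": api_key},                 # API-key header auth
    )

url, headers = self_hosted_request("http://localhost:8080", "tok", "add")
# → ("http://localhost:8080/api/v1/add", {"Authorization": "Bearer tok"})
url, headers = cloud_request("https://cloud.example", "key", "add")
# → ("https://cloud.example/api/add", {"X-Api-Key": "key"})
```

Because the MCP server only knows how to build requests of the first shape, no combination of API_URL and API_TOKEN values can make it speak the Cloud convention.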
Instead, run the MCP server in direct/standalone mode (no API_URL set), where it manages its own local knowledge graph:
# Standalone mode — no API_URL needed
docker run -e TRANSPORT_MODE=sse --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main
This is the recommended way to add persistent memory to Cursor, Claude Code, Cline, and other MCP-compatible tools.
To use Cognee Cloud, access it programmatically with the cogwit-sdk library, which handles authentication and communication with the hosted service:
pip install cogwit-sdk
export COGWIT_API_KEY="your-cognee-cloud-api-key"
from cogwit_sdk import cogwit, CogwitConfig

# The SDK is async — run these calls inside an async function / event loop.
client = cogwit(CogwitConfig(api_key="your-cognee-cloud-api-key"))
await client.add(data="...", dataset_name="my_dataset")
await client.cognify(dataset_ids=[...])
results = await client.search(query_text="...", query_type=client.SearchType.GRAPH_COMPLETION)
See the Cognee Cloud SDK guide for a complete walkthrough.
If you want multiple AI clients to share a single knowledge graph, run a self-hosted Cognee backend and point the MCP server at it using API_URL and API_TOKEN:
# 1. Start a self-hosted Cognee backend
docker run -e LLM_API_KEY=your_key -p 8080:8000 --rm -it cognee/cognee:main

# 2. Start MCP in API mode pointing to your backend
#    Note: inside a container, "localhost" refers to the container itself.
#    If both services run in Docker, use host.docker.internal (Docker Desktop)
#    or a shared Docker network to reach the backend instead.
docker run \
  -e TRANSPORT_MODE=sse \
  -e API_URL=http://localhost:8080 \
  -e API_TOKEN=your_backend_token \
  -p 8000:8000 --rm -it cognee/cognee-mcp:main
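Before wiring up MCP, you can sanity-check the request shape the MCP server will send to your backend. This sketch only constructs the request with the standard library; the /api/v1/add path and Bearer header match the self-hosted convention above, while the JSON payload is a placeholder assumption:

```python
import json
import urllib.request

# Build (but don't yet send) a request against a self-hosted Cognee backend,
# using the Bearer convention the MCP server relies on in API mode.
BASE_URL = "http://localhost:8080"
API_TOKEN = "your_backend_token"

req = urllib.request.Request(
    f"{BASE_URL}/api/v1/add",
    data=json.dumps({"data": "..."}).encode(),  # placeholder payload shape
    headers={
        "Authorization": f"Bearer {API_TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# urllib.request.urlopen(req)  # uncomment once the backend is running
```

If this request succeeds against your backend, the MCP server's API mode should authenticate the same way.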
See the MCP Quickstart for full details on this pattern.
For questions about future Cognee Cloud + MCP integration, join the Cognee Discord to discuss with the team.