Build and run Cognee MCP from source to access advanced customization, multiple transport options, and the latest development features, including the current memory-oriented MCP tools.

Documentation Index
Fetch the complete documentation index at: https://docs.cognee.ai/llms.txt
Use this file to discover all available pages before exploring further.
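For example, the index can be fetched from the command line (a sketch, assuming curl is available):

```shell
# Download the documentation index and print the available pages
curl -s https://docs.cognee.ai/llms.txt
```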
Advantages of Local Setup
- Full Control: Customize server configuration, add providers, and modify behavior
- Latest Features: Access development features before they reach Docker releases
- Multiple Transports: Choose stdio, SSE, or HTTP transport modes
- Current Tool Surface: Use remember, recall, improve, and forget_memory alongside compatibility tools
- Development Ready: Debug, modify, and contribute to the codebase
Setup Steps
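A typical source setup looks like the following (a sketch; the repository URL and directory layout are assumptions based on the public Cognee repository):

```shell
# Clone the repository and enter the MCP server directory
git clone https://github.com/topoteretes/cognee.git
cd cognee/cognee-mcp

# Install dependencies with uv, then start the server in default stdio mode
uv sync
uv run cognee-mcp
```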
Running in API Mode
To connect the MCP server to an existing Cognee backend instead of running standalone, pass the --api-url flag. When --api-url is provided, the MCP server acts as an interface to the centralized backend. This allows multiple MCP instances and clients to share the same knowledge graph.
You can also pass these as command-line arguments:
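A sketch of such an invocation (the backend URL and token are placeholders):

```shell
# Point the MCP server at an existing Cognee backend
uv run cognee-mcp --api-url http://localhost:8000 --api-token <your-token>
```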
The source-run uv run cognee-mcp entrypoint reads the --api-url and --api-token flags. The API_URL and API_TOKEN environment variables are used by the Docker entrypoint wrapper, not by the source runner directly.

API mode is useful for:

- Team collaboration with shared memory
- Multiple AI clients accessing consistent data
- Centralized knowledge graph management
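When using the Docker entrypoint instead, the same settings travel as environment variables (a sketch; the image name is illustrative, not confirmed by this page):

```shell
# The Docker entrypoint wrapper reads API_URL and API_TOKEN from the environment
docker run -e API_URL=http://host.docker.internal:8000 \
           -e API_TOKEN=<your-token> \
           cognee/cognee-mcp
```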
Further details
Transport Modes
Choose the transport mode based on your client requirements:
- stdio (default)
- HTTP
- SSE
Default mode for most MCP clients. The client starts the server as a subprocess and communicates through standard input/output.

Configure your MCP client to launch the server directly. Replace /absolute/path/to/cognee-mcp with the actual path to your cloned cognee-mcp directory.
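A minimal stdio client configuration might look like this (a sketch following the common mcpServers config shape; adjust the key names to your client):

```json
{
  "mcpServers": {
    "cognee": {
      "command": "uv",
      "args": ["--directory", "/absolute/path/to/cognee-mcp", "run", "cognee-mcp"]
    }
  }
}
```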
Server Arguments Reference
All available arguments for uv run cognee-mcp:

| Argument | Default | Description |
|---|---|---|
| --transport | stdio | Transport protocol: stdio, http, or sse |
| --host | 127.0.0.1 | Host to bind the server to (HTTP/SSE only) |
| --port | 8000 | Port to bind the server to (HTTP/SSE only) |
| --path | /mcp | URL path for the HTTP endpoint |
| --log-level | info | Log verbosity: debug, info, warning, or error |
| --no-migration | off | Skip database migrations on startup |
| --api-url | — | URL of a running Cognee backend (enables API mode) |
| --api-token | — | Auth token for the backend API (if required) |
| --serve-url | — | Cognee Cloud or remote instance URL used with cognee.serve() |
| --serve-api-key | — | API key for the --serve-url instance |
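Example with several options combined (values are illustrative; all flags are taken from the table above):

```shell
# Run over HTTP with verbose logging, skipping startup migrations
uv run cognee-mcp --transport http --host 127.0.0.1 --port 8000 \
  --path /mcp --log-level debug --no-migration
```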
The preferred MCP workflow is to use remember, recall, improve, and forget_memory. The server also exposes legacy-compatible tools such as cognify and search when you need lower-level control.

Next Steps
After starting the server, configure your AI client to connect to it. See the integrations section for client-specific setup instructions.

Need Help?
Join Our Community
Get support and connect with other developers using Cognee MCP.