Start the Cognee MCP server using Docker to quickly test AI memory integration.
Prerequisites
- Docker installed and running
- OpenAI API key
Setup Steps
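A minimal sketch of the run command, assuming the image is published as `cognee/cognee-mcp`, the OpenAI key is read from `LLM_API_KEY`, and SSE transport is selected via a `TRANSPORT` variable on port 8000 (all of these names are assumptions; verify them against the official docs):

```bash
# Sketch only: image name, variable names, and port are assumptions.
docker run --rm -it \
  -e LLM_API_KEY=your_openai_api_key \
  -e TRANSPORT=sse \
  -p 8000:8000 \
  cognee/cognee-mcp
```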
Persist Data
By default, the container removes its local data when it stops. To make memory survive restarts, mount the container's data directory with either (see the sketch after this list):
- A named Docker volume, such as `cognee_data:/app/data`
- A local directory path, such as `./cognee_data:/app/data`
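As an illustration, the same sketch with a named volume attached (swap in `./cognee_data:/app/data` to use a local directory instead; image and variable names remain assumptions):

```bash
# Named volume keeps /app/data across restarts (sketch; names assumed).
docker run --rm -it \
  -e LLM_API_KEY=your_openai_api_key \
  -e TRANSPORT=sse \
  -v cognee_data:/app/data \
  -p 8000:8000 \
  cognee/cognee-mcp
```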
API Mode (Shared Knowledge Graph)
To connect multiple clients to a shared knowledge graph, run MCP in API mode pointing to a centralized Cognee backend.
Start MCP in API Mode
Start the MCP server and point it to the backend; it then acts as an interface to the shared backend (a hedged example follows the list below). Keep the following in mind:
- API mode requires SSE or HTTP transport
- If `API_URL` uses `localhost`, the container rewrites it to `host.docker.internal` so the MCP container can reach a backend running on your host machine
- This works on macOS, Windows, and Linux environments where `host.docker.internal` is available, such as Docker Desktop
- On Linux without that support, use `--network host` or set `API_URL` to a bridge address such as `172.17.0.1`
- Add `-e API_TOKEN=your_token` if your backend requires authentication
- For backend authentication setup and how to obtain a Bearer token, see Deploy REST API Server
Docker Compose (Production Setup)
For production deployments, use Docker Compose to run the Cognee backend and MCP server together. This avoids `localhost` mapping issues and uses Docker's internal DNS for service discovery.
docker-compose.yml
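The compose file below is a reconstruction consistent with the networking notes that follow (service name `cognee-backend`, internal port `8000`, host-mapped port `8080`); the image names, the `LLM_API_KEY` and `TRANSPORT` variables, and the MCP port mapping are assumptions to verify against the official docs:

```yaml
# Reconstruction sketch only; verify image names, variables, and ports
# against the official Cognee documentation.
services:
  cognee-backend:
    image: cognee/cognee                     # assumed image name
    environment:
      - LLM_API_KEY=${OPENAI_API_KEY}        # assumed variable for the OpenAI key
    ports:
      - "8080:8000"                          # host 8080 -> internal 8000
    volumes:
      - cognee_data:/app/data                # persist memory across restarts

  cognee-mcp:
    image: cognee/cognee-mcp                 # assumed image name
    environment:
      - API_URL=http://cognee-backend:8000   # service name + internal port
      - TRANSPORT=sse                        # API mode requires SSE or HTTP transport
      # - API_TOKEN=your_token               # if the backend requires authentication
    ports:
      - "8001:8000"                          # assumed MCP port mapping
    depends_on:
      - cognee-backend

volumes:
  cognee_data:
```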
Networking notes:
- Use the service name (`cognee-backend`) as the hostname in `API_URL`; Docker resolves it automatically within the same network.
- Use the internal port (`8000`) in `API_URL`, not the host-mapped port (`8080`).
- If you place a reverse proxy (Nginx, Caddy) in front, you do not need to set a `Host: localhost` header; the backend accepts requests on any host.
- Add `-e API_TOKEN=your_token` to the MCP service if your backend requires authentication.
Connect to AI Clients
After starting the server, connect it to your AI development tool:
- Cursor: AI-powered code editor with native MCP support
- Claude Code: Command-line AI assistant from Anthropic
- Codex: OpenAI coding agent with built-in MCP support
- Cline: VS Code extension for AI-assisted development
- Continue: Open-source AI coding assistant
- Roo Code: AI-powered development environment
Next Steps
- Tools Reference: See all available MCP tools and operations
- Local Setup: Run from source for customization and development