Prerequisites
- Docker installed and running
- OpenAI API key
Setup Steps
All data inside the container is lost when it stops. Use volume mounts for persistent storage.
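The point above can be sketched as a `docker run` invocation; the image name `cognee/cognee-mcp`, the container path `/app/data`, and the `OPENAI_API_KEY` variable are assumptions for illustration:

```shell
# Hypothetical invocation: image name and mount path are assumptions.
# The -v flag maps a host directory into the container so data
# survives container restarts.
docker run --rm -it \
  -e OPENAI_API_KEY=your_key \
  -v ./cognee-data:/app/data \
  cognee/cognee-mcp
```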
API Mode (Shared Knowledge Graph)
To connect multiple clients to a shared knowledge graph, run MCP in API mode pointing to a centralized Cognee backend.
Start MCP in API Mode
Start the MCP server and point it to the backend. The container automatically converts `localhost` to `host.docker.internal` so the MCP container can reach your host machine. The MCP server then acts as an interface to the shared backend.
- API mode requires SSE or HTTP transport
- The `localhost` in `API_URL` is automatically mapped to work from inside the container
- Add `-e API_TOKEN=your_token` if your backend requires authentication
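A minimal sketch of starting the container in API mode; the image name, the `TRANSPORT` variable name, and the port numbers are assumptions, not confirmed by this page:

```shell
# Hypothetical invocation: env var names and ports are assumptions.
# API_URL points at the centralized Cognee backend; localhost is
# rewritten to host.docker.internal inside the container.
docker run --rm -it \
  -p 8001:8001 \
  -e TRANSPORT=sse \
  -e API_URL=http://localhost:8000 \
  cognee/cognee-mcp
```

Append `-e API_TOKEN=your_token` to the same command if the backend requires authentication.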
Docker Compose (Production Setup)
For production deployments, use Docker Compose to run the Cognee backend and MCP server together. This avoids `localhost` mapping issues and uses Docker's internal DNS for service discovery.
docker-compose.yml
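The compose file itself did not survive extraction; the following is a minimal sketch consistent with the networking notes below, with the image names and environment variable names as assumptions:

```yaml
# Hypothetical docker-compose.yml: image names and env var names are assumptions.
services:
  cognee-backend:
    image: cognee/cognee            # assumption: backend image name
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
    ports:
      - "8080:8000"                 # host port 8080 maps to internal port 8000
    volumes:
      - cognee-data:/app/data       # persist graph data across restarts

  cognee-mcp:
    image: cognee/cognee-mcp        # assumption: MCP image name
    environment:
      - TRANSPORT=sse               # API mode requires SSE or HTTP transport
      - API_URL=http://cognee-backend:8000   # service name + internal port
      # - API_TOKEN=your_token      # uncomment if the backend requires auth
    depends_on:
      - cognee-backend

volumes:
  cognee-data:
```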
Networking notes:
- Use the service name (`cognee-backend`) as the hostname in `API_URL`; Docker resolves it automatically within the same network.
- Use the internal port (`8000`) in `API_URL`, not the host-mapped port (`8080`).
- If you place a reverse proxy (Nginx, Caddy) in front, you do not need to set a `Host: localhost` header; the backend accepts requests on any host.
- Add `-e API_TOKEN=your_token` to the MCP service if your backend requires authentication.
Connect to AI Clients
After starting the server, connect it to your AI development tool:
- Cursor: AI-powered code editor with native MCP support
- Claude Code: command-line AI assistant from Anthropic
- Cline: VS Code extension for AI-assisted development
- Continue: open-source AI coding assistant
- Roo Code: AI-powered development environment
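As an illustration of wiring one of these clients, here is a hypothetical Cursor `mcp.json` entry pointing at an SSE endpoint; the server name and the URL (including the port) are assumptions:

```json
{
  "mcpServers": {
    "cognee": {
      "url": "http://localhost:8001/sse"
    }
  }
}
```

Other clients use analogous configuration; consult each tool's MCP setup page for the exact file location and schema.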
Next Steps
- Tools Reference: see all available MCP tools and operations
- Local Setup: run from source for customization and development