Prerequisites
- Docker installed and running
- OpenAI API key
Setup Steps
1. Set Your API Key
2. Create Environment File
3. Start the Server
4. Verify the Server
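The original code blocks for these steps did not survive extraction; the following is a minimal sketch of the four steps, assuming an OpenAI-compatible LLM_API_KEY variable, the cognee/cognee-mcp image name, port 8000, and a /health endpoint — verify each against the published image documentation.

```shell
# 1. Set your API key in the shell (placeholder value)
export LLM_API_KEY="sk-..."

# 2. Create an environment file the container can read
cat > .env <<EOF
LLM_API_KEY=${LLM_API_KEY}
EOF

# 3. Start the server (--rm removes the container, and its data, on stop)
docker run --rm -it --env-file .env -p 8000:8000 cognee/cognee-mcp

# 4. Verify the server is reachable (endpoint path is an assumption)
curl -f http://localhost:8000/health
```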
All data inside the container is lost when it stops. Use volume mounts for persistent storage.
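A volume mount along those lines might look like the sketch below; the host directory and the in-container data path (/app/data) are assumptions — use whatever data directory the image actually documents.

```shell
# Persist data across restarts by mounting a host directory into the container.
# /app/data is an assumed container path; adjust to match the image.
docker run --rm -it --env-file .env -p 8000:8000 \
  -v "$(pwd)/cognee-data:/app/data" \
  cognee/cognee-mcp
```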
API Mode (Shared Knowledge Graph)
To connect multiple clients to a shared knowledge graph, run MCP in API mode, pointing at a centralized Cognee backend:
1. Start Cognee Backend
First, start a Cognee backend instance:
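The backend invocation was lost from this page; a hedged sketch, assuming a cognee/cognee image name and port 8000 for the backend API:

```shell
# Start a standalone Cognee backend in the background.
# Image name and port are assumptions -- check the Cognee deployment docs.
docker run --rm -d --name cognee-backend \
  --env-file .env -p 8000:8000 \
  cognee/cognee
```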
2. Start MCP in API Mode
Start the MCP server and point it at the backend. The container automatically converts localhost to host.docker.internal so the MCP container can reach your host machine. The MCP server now acts as an interface to the shared backend.
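The command itself was stripped from the page; a sketch under assumptions: API_URL comes from the notes below, while the TRANSPORT_MODE variable name, the MCP port, and the image name are guesses to be checked against the image docs.

```shell
# Run the MCP server in API mode, pointing at the shared backend.
# localhost in API_URL is rewritten to host.docker.internal inside the container.
docker run --rm -it -p 9000:9000 \
  -e TRANSPORT_MODE=sse \
  -e API_URL=http://localhost:8000 \
  cognee/cognee-mcp
```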
3. Connect Additional Clients (Optional)
If you need to support multiple clients, start additional MCP instances on different ports. Each client connects to its own MCP instance, but all share the same knowledge graph through the backend.
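For example, a second instance could be published on another host port (same hedges as above: variable names and image are assumptions):

```shell
# A second MCP instance on port 9001; both instances talk to the same backend,
# so their clients share one knowledge graph.
docker run --rm -it -p 9001:9001 \
  -e TRANSPORT_MODE=sse \
  -e API_URL=http://localhost:8000 \
  cognee/cognee-mcp
```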
- API mode requires SSE or HTTP transport
- The localhost in API_URL is automatically mapped to work from inside the container
- Add -e API_TOKEN=your_token if your backend requires authentication
Connect to AI Clients
After starting the server, connect it to your AI development tool:
Cursor
AI-powered code editor with native MCP support
Claude Code
Command-line AI assistant from Anthropic
Cline
VS Code extension for AI-assisted development
Continue
Open-source AI coding assistant
Roo Code
AI-powered development environment
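As one concrete illustration, many of these clients read an MCP configuration file. The sketch below writes a hypothetical Cursor config pointing at the server's SSE endpoint; the file path, config schema, and URL shape are all assumptions — follow your client's own MCP documentation.

```shell
# Write a minimal (hypothetical) MCP client config for Cursor.
# Path, schema, and /sse endpoint are assumptions -- check the client docs.
mkdir -p .cursor
cat > .cursor/mcp.json <<'EOF'
{
  "mcpServers": {
    "cognee": {
      "url": "http://localhost:8000/sse"
    }
  }
}
EOF
```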