
Set up cognee with MCP

Learn how to run cognee with MCP (Model Context Protocol) using Docker for seamless integration with AI coding assistants.

What is MCP?

Model Context Protocol (MCP) is an open standard for connecting AI assistants to data sources and tools. It enables:

  • Seamless Integration: Connect AI assistants to external data without custom APIs
  • Standardized Communication: Universal protocol for AI-data interactions
  • Real-time Context: Provide fresh, relevant data to AI assistants
  • Tool Orchestration: Enable AI assistants to use external tools and services

cognee-mcp implements this protocol to provide AI assistants with:

  • Knowledge graph search capabilities
  • Data ingestion and processing tools
  • Semantic memory access
  • Graph visualization and exploration

Get started in one line

Quick Start with Docker Hub

The fastest way to get cognee-mcp running:

# Create a .env file with your API key
echo 'LLM_API_KEY="your_openai_api_key"' > .env

# Run the cognee-mcp container
docker run --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main

That’s it! Your cognee-mcp server is now running on http://localhost:8000.
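
Before wiring the server into a client, you can confirm it is reachable from another terminal. A quick check, assuming the default SSE transport exposes its event stream at /sse (the MCP Python SDK default; adjust the path if your build differs):

# A healthy server answers with an event stream (curl exits after 3 seconds)
curl -sS -N -m 3 http://localhost:8000/sse | head -n 4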

Container Setup Options

Option 1: Pull from Docker Hub

Pull the pre-built image and run it:

# With your .env file in the current directory
docker run --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main

Advantages:

  • No build time required
  • Always up-to-date with latest releases
  • Faster to get running than a local build

Option 2: Build Locally

Build the container yourself for custom modifications:

# Make sure you are in the /cognee root directory
# Create .env with your LLM_API_KEY and chosen settings

# Remove any old image and rebuild
docker rmi cognee/cognee-mcp:main || true
docker build --no-cache -f cognee-mcp/Dockerfile -t cognee/cognee-mcp:main .

# Run it
docker run --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main
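
If the image doesn't match your CPU architecture (a common issue on Apple Silicon), you can pin the target platform explicitly. A sketch, assuming your Docker install has Buildx and emulation available:

# Build an amd64 image regardless of the host architecture
docker build --platform linux/amd64 -f cognee-mcp/Dockerfile -t cognee/cognee-mcp:main .

# Or run the published image under emulation
docker run --platform linux/amd64 --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main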

Advantages:

  • Full control over build process
  • Can modify source code before building
  • Useful for development and testing

Configure your MCP

Basic Configuration

Create a .env file with minimal configuration:

# Required: LLM API key
LLM_API_KEY="your_openai_api_key"

# Optional: database configuration
GRAPH_DATABASE_PROVIDER="networkx"
VECTOR_DB_PROVIDER="lancedb"
DB_PROVIDER="sqlite"
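
If you prefer a single copy-paste step, the same file can be written with a heredoc (the values shown are placeholders; substitute your real key):

# Write a minimal .env in one shot
cat > .env <<'EOF'
LLM_API_KEY="your_openai_api_key"
GRAPH_DATABASE_PROVIDER="networkx"
VECTOR_DB_PROVIDER="lancedb"
DB_PROVIDER="sqlite"
EOF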

Advanced Configuration

You can use any configuration from our .env.template:

# Custom LLM provider
LLM_PROVIDER="custom"
LLM_MODEL="openrouter/google/gemini-2.0-flash-lite-preview-02-05:free"
LLM_ENDPOINT="https://openrouter.ai/api/v1"

# Production database setup
GRAPH_DATABASE_PROVIDER="neo4j"
GRAPH_DATABASE_URL="bolt://localhost:7687"
GRAPH_DATABASE_USERNAME="neo4j"
GRAPH_DATABASE_PASSWORD="your_password"

# Vector storage
VECTOR_DB_PROVIDER="pgvector"
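
Note that bolt://localhost:7687 resolves inside the cognee-mcp container, so it will not reach a Neo4j instance running in a sibling container. One way to wire the two together is a user-defined Docker network; a sketch using the official neo4j:5 image (the container names and version tag are illustrative):

# Put both containers on a shared network
docker network create cognee-net

# Start Neo4j; NEO4J_AUTH sets the initial username/password
docker run -d --name neo4j --network cognee-net -e NEO4J_AUTH=neo4j/your_password neo4j:5

# In .env, point cognee at the container by name instead of localhost:
#   GRAPH_DATABASE_URL="bolt://neo4j:7687"
docker run --env-file ./.env --network cognee-net -p 8000:8000 --rm -it cognee/cognee-mcp:main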

Connection Protocols

cognee-mcp supports two connection methods:

SSE (Server-Sent Events) - Default:

# Default - no additional configuration needed
docker run --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main

STDIO (Standard Input/Output):

# Set the environment variable for STDIO mode
echo 'MCP_TRANSPORT=stdio' >> .env
docker run --env-file ./.env --rm -it cognee/cognee-mcp:main
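
In STDIO mode the protocol flows over the container's stdin and stdout as newline-delimited JSON-RPC, so you can smoke-test it by hand. A minimal sketch of the first handshake message (the protocolVersion value is illustrative; use the one your client speaks):

# Pipe an MCP initialize request into the container and read the reply
echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.1"}}}' \
  | docker run -i --env-file ./.env --rm cognee/cognee-mcp:main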

Connect your MCP to your AI app

Example: Continue.dev Integration

Continue.dev is a popular AI coding assistant. Here’s how to connect it to cognee-mcp:

  1. Start cognee-mcp server:
docker run --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main
  2. Configure Continue.dev:

Add to your ~/.continue/config.json (newer Continue releases use a different key layout; see the sketch after this list):

{ "models": [ { "title": "GPT-4 with Cognee", "provider": "openai", "model": "gpt-4", "apiKey": "your_openai_api_key" } ], "mcpServers": { "cognee": { "command": "curl", "args": [ "-X", "POST", "http://localhost:8000/mcp", "-H", "Content-Type: application/json", "-d", "@-" ], "transport": "sse" } } }
  3. Use cognee in Continue:

Now you can use cognee tools in your conversations:

  • @cognee search "machine learning concepts"
  • @cognee add_data "path/to/your/documents"
  • @cognee cognify to process your data
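
Continue's MCP configuration has moved between config keys across releases. In versions that read the experimental block, an SSE server is declared under experimental.modelContextProtocolServers; a hedged sketch that patches the config with jq (treat the exact field names as version-dependent and check Continue's current docs):

# Back up the config, then add an MCP server entry under the experimental key
cp ~/.continue/config.json ~/.continue/config.json.bak
jq '.experimental.modelContextProtocolServers = [{"transport": {"type": "sse", "url": "http://localhost:8000/sse"}}]' \
  ~/.continue/config.json.bak > ~/.continue/config.json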

Example: Cursor Integration

For Cursor integration:

  1. Start cognee-mcp:
docker run --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main
  2. Configure Cursor MCP:

In Cursor, add the MCP server to your MCP settings file (~/.cursor/mcp.json). Cursor can connect to an SSE server directly by URL; cognee-mcp's SSE stream is served at /sse by default (adjust the path if your build differs):

{
  "mcpServers": {
    "cognee": {
      "url": "http://localhost:8000/sse"
    }
  }
}

Available MCP Tools

When connected, your AI assistant gains access to these cognee tools:

Data Management

  • cognee_add_data: Add files, text, or URLs to cognee
  • cognee_prune: Clear knowledge graph data
  • cognify: Process data into knowledge graphs
  • codify: Generate code-specific knowledge graphs

Search & Query

  • cognee_search: Search knowledge graphs with different strategies
  • search_insights: Get relationship-based insights
  • search_chunks: Find relevant text chunks

Status & Monitoring

  • cognify_status: Check data processing progress
  • codify_status: Monitor code analysis progress
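
You don't need a full MCP client to see this list. Over STDIO, the tools can be enumerated with three raw JSON-RPC messages: initialize, the initialized notification, then tools/list. A minimal sketch, assuming MCP_TRANSPORT=stdio is set in your .env and that the protocolVersion string matches what your server build accepts:

# Enumerate the server's tools over STDIO (the sleeps give the server time to respond)
{
  echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"tool-lister","version":"0.0.1"}}}'
  sleep 2
  echo '{"jsonrpc":"2.0","method":"notifications/initialized"}'
  echo '{"jsonrpc":"2.0","id":2,"method":"tools/list"}'
  sleep 2
} | docker run -i --env-file ./.env --rm cognee/cognee-mcp:main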

More Complex Examples

Multi-Repository Code Analysis

# Start cognee-mcp
docker run --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main

# In your AI assistant:
# 1. Add multiple repositories
@cognee add_data "/path/to/repo1"
@cognee add_data "/path/to/repo2"

# 2. Generate code knowledge graphs
@cognee codify "/path/to/repo1"
@cognee codify "/path/to/repo2"

# 3. Search across repositories
@cognee search "authentication implementation patterns"

Document Intelligence Workflow

# Add various document types
@cognee add_data "/path/to/docs/*.pdf"
@cognee add_data "/path/to/markdown/*.md"
@cognee add_data "https://company-wiki.example.com"

# Process into a knowledge graph
@cognee cognify

# Query with different strategies
@cognee search_insights "What are the main product requirements?"
@cognee search_chunks "API documentation for user authentication"

Development with Hot Reload

For active development, mount your code directory:

docker run --env-file ./.env -p 8000:8000 \
  -v /path/to/your/cognee:/app \
  --rm -it cognee/cognee-mcp:main

This allows you to modify cognee code and see changes immediately.

Troubleshooting

Common Issues

Port already in use:

# Check what's using port 8000
lsof -i :8000

# Use a different port
docker run --env-file ./.env -p 8001:8000 --rm -it cognee/cognee-mcp:main

Permission errors:

# Ensure the .env file is readable
chmod 644 .env

# Run the container as your user
docker run --user $(id -u):$(id -g) --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main

Memory issues with large datasets:

# Increase the container memory limit
docker run --memory=4g --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main
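
To confirm that memory pressure is actually the problem, watch the container's live usage while a heavy cognify run is in flight (the name below assumes you started the container with --name cognee-mcp):

# Live CPU and memory usage for the running container
docker stats cognee-mcp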

Logs and Debugging

View container logs:

# Run in detached mode
docker run -d --name cognee-mcp --env-file ./.env -p 8000:8000 cognee/cognee-mcp:main

# View logs
docker logs -f cognee-mcp

# Stop the container
docker stop cognee-mcp

Next Steps

Join the Community

Have questions about MCP setup? Join our community.
