Cognee MCP (Model Context Protocol) integration provides seamless connectivity between Cognee’s knowledge graph platform and MCP-compatible AI development tools like Claude, Cursor, and Continue.
MCP enables AI assistants to securely access and manage your knowledge graphs, creating persistent memory across conversations.

Quick Start

Get up and running with Cognee MCP in minutes using Docker.

Prerequisites

Before you begin, ensure you have:

- Docker installed and running on your system
- A valid OpenAI API key for LLM operations

Environment Setup

First, set your OpenAI API key:
export OPENAI_API_KEY=your_api_key_here
Replace your_api_key_here with your actual OpenAI API key. Keep the key secure and never commit it to version control.
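The launch command in the next step reads variables from a local .env file via --env-file, so you can keep the key there instead of (or in addition to) exporting it. A minimal .env for this quick start (OPENAI_API_KEY is the only variable shown in this guide; the image may accept others):

OPENAI_API_KEY=your_api_key_here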

Launch the Server

Start the Cognee MCP server with a single command:
docker run -e TRANSPORT_MODE=http --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main
The server listens on port 8000. Because the command runs with --rm and no volume mount, all stored memory data is lost when the container is removed.
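If you want memory to persist across container runs, you can mount a Docker volume for the server's data directory. A minimal sketch; the in-container path /app/data is a hypothetical placeholder, so check the image documentation for the actual location:

# /app/data below is a hypothetical path; substitute the real data directory
docker run -e TRANSPORT_MODE=http --env-file ./.env -p 8000:8000 \
  -v cognee_data:/app/data --rm -it cognee/cognee-mcp:main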

How Cognee MCP Works

Built around the Model Context Protocol (MCP), the Cognee MCP Server exposes a standardized set of memory management tools that any MCP-compatible AI assistant can use.
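Under the hood, each tool call is a JSON-RPC message over the chosen transport. If you want to see the raw exchange, here is a sketch of the initial handshake against the HTTP server started above, assuming it exposes the standard streamable HTTP endpoint at /mcp (an assumption; MCP-compatible clients perform this handshake automatically):

curl -i -X POST http://localhost:8000/mcp \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"curl-test","version":"0.0.1"}}}'

The -i flag prints the response headers; a stateful server returns an Mcp-Session-Id header that subsequent requests must echo back.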

Available Tools

Memory Management
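An MCP client does not need the tool names hard-coded; it can discover them at runtime with a tools/list request. A sketch continuing the curl session from above, assuming the handshake (initialize plus the notifications/initialized notification) has completed and $SESSION_ID holds the Mcp-Session-Id value the server returned:

curl -X POST http://localhost:8000/mcp \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -H "Mcp-Session-Id: $SESSION_ID" \
  -d '{"jsonrpc":"2.0","id":2,"method":"tools/list"}'

The response enumerates each tool's name, description, and input schema, which is how assistants like Claude or Cursor decide when and how to call them.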

Setup Options

Choose your preferred deployment method:

Docker (Quick Start)

Recommended for most users. A ready-to-use container with all dependencies included, perfect for getting started quickly.

Local Setup

For developers & customization. Build from source for development, custom configurations, and the latest features.
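A typical from-source flow, assuming the MCP server lives in the cognee-mcp directory of the main Cognee repository; the dependency and run commands are illustrative, so follow the repository README for the exact steps:

git clone https://github.com/topoteretes/cognee
cd cognee/cognee-mcp
# install dependencies and start the server per the README, e.g. with uv:
uv sync
uv run src/server.py   # hypothetical entry point; see the README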

Next Steps