

Build and run Cognee MCP from source to access advanced customization, multiple transport options, and the latest development features, including the current memory-oriented MCP tools.

Advantages of Local Setup

  • Full Control: Customize server configuration, add providers, and modify behavior
  • Latest Features: Access development features before they reach Docker releases
  • Multiple Transports: Choose stdio, SSE, or HTTP transport modes
  • Current Tool Surface: Use remember, recall, improve, and forget_memory alongside compatibility tools
  • Development Ready: Debug, modify, and contribute to the codebase

Setup Steps

1. Clone Repository

git clone https://github.com/topoteretes/cognee.git
cd cognee

2. Create Environment File

Create a .env file with your configuration:
LLM_API_KEY="your-openai-api-key"
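If you want to use a provider other than the default OpenAI setup, the same file can carry additional LLM settings. A minimal sketch; the variable names beyond LLM_API_KEY are assumptions here, so check the Cognee configuration reference for the exact keys supported by your version:

# .env
LLM_API_KEY="your-openai-api-key"
# Optional overrides (names assumed; verify against the configuration docs)
LLM_PROVIDER="openai"
LLM_MODEL="gpt-4o-mini"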

3. Install Dependencies

# Install uv package manager
brew install uv

# Install project dependencies
cd cognee-mcp
uv sync --dev --all-extras --reinstall
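If you are not on macOS or do not use Homebrew, uv can also be installed with its standalone installer or via pip (both methods are documented by the uv project):

# Standalone installer (Linux/macOS)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Or via pip
pip install uv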

4. Activate and Run

# Run with default stdio transport
uv run cognee-mcp
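If the first run fails or hangs, re-running with higher log verbosity can help pinpoint the issue; the --log-level flag is listed in the arguments table further below:

uv run cognee-mcp --log-level debug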

Running in API Mode

To connect the MCP server to an existing Cognee backend instead of running standalone:
# Start MCP in HTTP or SSE mode pointing to the backend
uv run cognee-mcp --transport http --api-url http://localhost:8080

# Optional: add an auth token if the backend requires it
uv run cognee-mcp --transport http --api-url http://localhost:8080 --api-token your_backend_token
When --api-url is provided, the MCP server acts as an interface to the centralized backend, allowing multiple MCP instances and clients to share the same knowledge graph. Note that the source-run uv run cognee-mcp entrypoint reads the --api-url and --api-token flags; the API_URL and API_TOKEN environment variables are used by the Docker entrypoint wrapper, not by the source runner directly.
Use cases:
  • Team collaboration with shared memory
  • Multiple AI clients accessing consistent data
  • Centralized knowledge graph management
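
When the server itself runs in HTTP mode, clients that support remote MCP servers connect by URL instead of launching a subprocess. A minimal sketch of such a client entry, assuming the default --port 8000 and --path /mcp and that your client accepts a url field (the exact key name varies between clients):

{
  "mcpServers": {
    "cognee": {
      "url": "http://127.0.0.1:8000/mcp"
    }
  }
}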

Further details

Choose the transport mode based on your client requirements:
stdio is the default mode for most MCP clients: the client starts the server as a subprocess and communicates through standard input/output.
uv run cognee-mcp
# or equivalently:
uv run cognee-mcp --transport stdio
Configure your MCP client to launch the server directly:
{
  "mcpServers": {
    "cognee": {
      "command": "uv",
      "args": [
        "--directory", "/absolute/path/to/cognee-mcp",
        "run", "cognee-mcp"
      ]
    }
  }
}
Replace /absolute/path/to/cognee-mcp with the actual path to your cloned cognee-mcp directory.
If you encounter errors on first run, reset your MCP configuration and restart.
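If your client does not pick up the .env file from the repository, many MCP clients also let you pass environment variables directly in the server entry. A sketch, assuming your client supports an env block:

{
  "mcpServers": {
    "cognee": {
      "command": "uv",
      "args": [
        "--directory", "/absolute/path/to/cognee-mcp",
        "run", "cognee-mcp"
      ],
      "env": {
        "LLM_API_KEY": "your-openai-api-key"
      }
    }
  }
}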
All available arguments for uv run cognee-mcp:
Argument          Default      Description
--transport       stdio        Transport protocol: stdio, http, or sse
--host            127.0.0.1    Host to bind the server to (HTTP/SSE only)
--port            8000         Port to bind the server to (HTTP/SSE only)
--path            /mcp         URL path for the HTTP endpoint
--log-level       info         Log verbosity: debug, info, warning, or error
--no-migration    off          Skip database migrations on startup
--api-url         (none)       URL of a running Cognee backend (enables API mode)
--api-token       (none)       Auth token for the backend API (if required)
--serve-url       (none)       Cognee Cloud or remote instance URL used with cognee.serve()
--serve-api-key   (none)       API key for the --serve-url instance
Example with all options:
uv run cognee-mcp \
  --transport http \
  --host 0.0.0.0 \
  --port 8000 \
  --api-url http://localhost:8080 \
  --api-token your_token
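For clients that only support the older SSE transport, the same flags apply; a sketch using the defaults from the table above:

uv run cognee-mcp --transport sse --host 127.0.0.1 --port 8000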
The preferred MCP workflow is to use remember, recall, improve, and forget_memory. The server also exposes legacy-compatible tools such as cognify and search when you need lower-level control.

Next Steps

After starting the server, configure your AI client to connect to it. See the integrations section for client-specific setup instructions.

Need Help?

Join Our Community

Get support and connect with other developers using Cognee MCP.