Cognee MCP works with AI development tools that support the Model Context Protocol (MCP). These clients connect to your Cognee MCP server and access memory management, code intelligence, and data operations through their own interfaces. Client configuration is identical whether your MCP server runs in standalone mode or connects to a centralized Cognee backend; the server architecture is transparent to clients.
Documentation Index
Fetch the complete documentation index at: https://docs.cognee.ai/llms.txt
Use this file to discover all available pages before exploring further.
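Most of the clients below share the same JSON configuration shape for registering an MCP server. A minimal sketch (the `mcpServers` key is the convention used by Cursor, Cline, and similar clients; the `uv run cognee-mcp` launch command is an assumption — use whatever command starts your Cognee MCP server):

```json
{
  "mcpServers": {
    "cognee": {
      "command": "uv",
      "args": ["run", "cognee-mcp"]
    }
  }
}
```

Because the configuration only names a launch command (or a URL for HTTP transport), it stays the same whether the server runs standalone or against a centralized backend.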
Available Clients
Cursor
AI-powered code editor with native MCP support
Claude Code
Command-line AI assistant from Anthropic
Codex
OpenAI coding agent with MCP support
Cline
VS Code extension for AI-assisted development
Continue
Open-source AI coding assistant for VS Code and JetBrains
Roo Code
VS Code extension for AI-powered development
Python Agent
Connect your own Python LLM agent via stdio or HTTP
Goose
Open-source AI coding assistant by Block