All integrations are designed to be lightweight and easy to configure, requiring minimal setup to get started.
Observability & Monitoring
Track performance, debug issues, and monitor your knowledge graph operations in production.
Langfuse
Open-source LLM observability. Distributed tracing and metrics for every Cognee task with granular pipeline performance insights.
Keywords AI
LLM application tracing. Span-level tracing across tasks and workflows with minimal code using Cognee’s observe abstraction.
Evaluation & Testing
Measure and improve the quality of your knowledge graph outputs with comprehensive evaluation frameworks.
DeepEval
Comprehensive RAG evaluation. Run QA & RAG metrics including Contextual Relevancy, Precision/Recall, and Coverage using LLM-as-a-judge workflows.
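At their core, LLM-as-a-judge metrics like Contextual Precision collect a relevance verdict per retrieved context and reduce the verdicts to a score. A self-contained sketch of that reduction, with a keyword-overlap stand-in for the LLM judge (the judge function and sample data are illustrative, not DeepEval's API):

```python
def judge_relevance(context: str, question: str) -> bool:
    """Stand-in for an LLM judge: crude keyword overlap instead of a model call."""
    return any(word in context.lower() for word in question.lower().split())


def contextual_precision(contexts: list[str], question: str) -> float:
    """Fraction of retrieved contexts the judge deems relevant to the question."""
    verdicts = [judge_relevance(c, question) for c in contexts]
    return sum(verdicts) / len(verdicts) if verdicts else 0.0


contexts = [
    "Cognee builds knowledge graphs from documents.",
    "The weather in Berlin is mild in spring.",
]
print(contextual_precision(contexts, "What does Cognee build?"))  # 0.5
```

In a real DeepEval run, the judge is an LLM call and the same shape of per-context verdict drives recall and coverage as well.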
Cloud LLM Providers
Connect to enterprise-grade LLM services through Cognee’s flexible integration layer.
AWS Bedrock
Enterprise LLM access. Use AWS Bedrock models including Claude, Titan, and Cohere through LiteLLM proxy integration.
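A hypothetical environment configuration for routing Cognee’s LLM calls to Bedrock through a LiteLLM proxy. The variable names and model identifier are assumptions for illustration; check the Cognee and LiteLLM documentation for the exact keys:

```shell
# Illustrative only: point Cognee's LLM layer at a LiteLLM proxy that
# fronts AWS Bedrock. Variable names are assumptions, not guaranteed
# to match Cognee's current configuration keys.
export LLM_PROVIDER="custom"
export LLM_ENDPOINT="http://localhost:4000"   # LiteLLM proxy URL
export LLM_MODEL="bedrock/anthropic.claude-3-sonnet-20240229-v1:0"
export AWS_REGION_NAME="us-east-1"            # Bedrock region
```

Bedrock credentials are then supplied through the standard AWS environment variables or an AWS profile, as with any Bedrock client.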
Data Ingestion
Scrape, extract, and ingest web content directly into Cognee’s knowledge graph.
ScrapeGraphAI
Web scraping to knowledge graph. Scrape URLs with natural language prompts and ingest the results directly into cognee.
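Before ingestion, a scraped page is typically reduced to plain text. A minimal stdlib sketch of that extraction step (the tag handling and sample HTML are illustrative and not part of ScrapeGraphAI or Cognee):

```python
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collect visible text content, skipping script and style blocks."""

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0  # depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.parts.append(data.strip())


def html_to_text(html: str) -> str:
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.parts)


page = "<html><body><h1>Cognee</h1><script>x=1</script><p>Memory for AI agents.</p></body></html>"
print(html_to_text(page))  # Cognee Memory for AI agents.
```

The resulting plain text is what you would hand to an ingestion call; ScrapeGraphAI’s prompt-driven extraction replaces this hand-rolled step in practice.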
Agent Frameworks
Build stateful AI agents with persistent semantic memory that endures across sessions.
LangGraph
Works with create_react_agent; no manual state management.
Google ADK
Native LongRunningFunctionTool for async-first Gemini agents.
OpenClaw
Auto-index and recall for your personal AI agent.
n8n
No-code memory workflows with all n8n integrations.
Claude Agent SDK
Native MCP server tools for Claude’s tool protocol.
OpenAI Agent SDK
Native function_tool pattern with support for agent handoffs.
Agent IDEs & Development Tools
Integrate Cognee directly into your development workflow with MCP-compatible tools and AI assistants.
MCP Integrations
Cursor
AI-powered code editor integration
Continue
VS Code AI assistant plugin
Claude Code
Anthropic’s Claude integration
Cline
Autonomous coding agent for VS Code
Roo Code
Advanced coding assistant
Cognee’s Model Context Protocol (MCP) adapter enables seamless integration with AI assistants, providing direct access to your knowledge graphs for in-context assistance.
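MCP-compatible tools such as Cursor and Claude Code register servers through a JSON configuration file. A hypothetical entry for a Cognee MCP server (the command, arguments, and environment key are illustrative; consult the Cognee MCP documentation for the actual invocation):

```json
{
  "mcpServers": {
    "cognee": {
      "command": "uv",
      "args": ["run", "cognee-mcp"],
      "env": { "LLM_API_KEY": "..." }
    }
  }
}
```

Once registered, the assistant can call the server’s tools to search and update your knowledge graph directly from the editor.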
Contributing Integrations
Don’t see your favorite tool? Cognee is open and extensible; help us grow the ecosystem!
All community database integrations are maintained in the cognee-community repository, keeping the core Cognee package lean while providing extensibility.