Infrastructure

System Requirements

  • Python Environment: Cognee supports Python 3.9+
  • Node.js & npm: Required if you intend to run the frontend UI locally

Core Infrastructure Components

Cognee's infrastructure spans two repositories:

๐Ÿ›๏ธ Cognee Core - Essential Components

The main repository includes widely-used, stable databases that come with pip install cognee:
  • Core vector stores: LanceDB (default), PGVector
  • Core graph databases: Kuzu (default), Neo4j
  • Core relational databases: SQLite (default), PostgreSQL

🚀 Cognee Community - Extended Components

The community repository provides additional database adapters and experimental features:
  • Additional vector stores: Qdrant, Azure AI Search, Redis
  • Additional graph databases: Community-contributed adapters
  • Experimental features: Cutting-edge integrations and plugins
Installation:
  • Core: pip install cognee[neo4j,postgres] (for optional core dependencies)
  • Community: pip install cognee-community-vector-adapter-qdrant (example community package)
Configure these via environment variables or a .env file. For more details about the configuration options, refer to the configuration documentation.
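As a sketch, a .env file selecting the optional core backends might look like the fragment below. The variable names are assumptions for illustration; verify them against the configuration documentation before use.

```ini
# Assumed variable names - check cognee's configuration docs
GRAPH_DATABASE_PROVIDER=neo4j
GRAPH_DATABASE_URL=bolt://localhost:7687
GRAPH_DATABASE_USERNAME=neo4j
GRAPH_DATABASE_PASSWORD=your-password

VECTOR_DB_PROVIDER=pgvector
DB_PROVIDER=postgres
```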

Vector Stores

Cognee supports multiple vector store backends for semantic search and context retrieval:
Core Repository:
  • LanceDB (default local vector database)
  • PGVector (PostgreSQL extension)
Community Repository:
  • Qdrant (cognee-community-vector-adapter-qdrant)
  • Azure AI Search (cognee-community-vector-adapter-azure)
  • Redis (cognee-community-vector-adapter-redis)

Graph Databases

Cognee builds knowledge graphs from extracted entities and relationships:
Core Repository:
  • Kuzu (performant graph database)
  • Neo4j (industry-standard graph database)
Community Repository:
  • Additional graph database adapters available through community contributions

Relational Databases

Cognee supports:
  • SQLite (default local relational database)
  • PostgreSQL

LLM Providers and Cognitive Services

Cognee leverages LLMs to process, classify, and summarize text. By default, it expects an OpenAI-compatible API key, but it can also integrate with other providers.
Remote Providers:
  • OpenAI
  • Anthropic
  • Gemini
  • Anyscale
  • OpenRouter
Local Providers:
  • Ollama (recommended: phi4, llama3.3 70b-instruct-q3_K_M, deepseek-r1:32b)
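Ollama exposes an OpenAI-compatible endpoint, so pointing cognee at a local model is typically a matter of configuration. The variable names below are assumptions for illustration, not verified against the current release:

```ini
# Assumed variable names - verify against cognee's configuration docs
LLM_PROVIDER=ollama
LLM_MODEL=llama3.3
LLM_ENDPOINT=http://localhost:11434/v1
LLM_API_KEY=ollama   # placeholder; local servers usually ignore the key
```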

Visualization

Cognee ships with a default visualization, but you can also use:
  • Graphistry for advanced graph visualization with NetworkX
  • Neo4j browser interface
  • Kuzu native browser interface (also integrates with GDotV and yWorks)
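Beyond the built-in browsers, any GraphML-capable tool can render an exported graph. A minimal sketch using NetworkX, with illustrative triples rather than actual cognee output:

```python
import networkx as nx

# Illustrative entity-relationship triples; real triples would come
# from cognee's extraction pipeline.
triples = [
    ("Alice", "works_at", "Acme"),
    ("Acme", "located_in", "Berlin"),
    ("Alice", "knows", "Bob"),
]

# Build a directed graph, storing the relation as an edge attribute.
g = nx.DiGraph()
for subject, relation, obj in triples:
    g.add_edge(subject, obj, relation=relation)

# GraphML is readable by most graph visualizers.
nx.write_graphml(g, "knowledge_graph.graphml")
```

The same graph object can be handed to Graphistry or drawn directly with NetworkX's plotting helpers.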

Telemetry

Cognee includes robust telemetry support to help you monitor, analyze, and improve your workflows. Telemetry tools provide detailed insights into system performance, user interactions, and data flows, enabling continuous optimization.

Current Support

Langfuse: Integration with Langfuse enables real-time monitoring and logging for AI-powered workflows, helping you trace and debug pipeline performance effectively.
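The Langfuse SDK reads its credentials from environment variables, so enabling tracing is usually a matter of setting them before starting your application (how cognee wires these internally is not covered here):

```ini
LANGFUSE_PUBLIC_KEY=pk-lf-...
LANGFUSE_SECRET_KEY=sk-lf-...
LANGFUSE_HOST=https://cloud.langfuse.com
```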

Upcoming Support

LangGraph: Planned integration with LangGraph will add advanced graph-based telemetry capabilities, allowing for more detailed visualization and analysis of system behaviors and relationships.
Stay tuned for updates as we expand telemetry support to provide even greater observability and insights into your cognee-powered applications.