
Infrastructure

System Requirements

  • Python Environment: Cognee supports Python 3.9+
  • Node.js & npm: Required if you intend to run the frontend UI locally

Core Infrastructure Components

While the default options come directly with pip install cognee, dependencies for other databases should be installed via extras. For example, if you choose to run cognee with Neo4j and with PostgreSQL as both the relational store and the vector store (pgvector), this would look like pip install "cognee[neo4j,postgres]" (quoting the argument prevents your shell from interpreting the brackets).

Configure these in environment variables or a .env file. For more details about the available configuration options, refer to the configuration reference.
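As an illustrative sketch, a .env file for the Neo4j + Postgres setup above might look like the following. The variable names and accepted values here are assumptions; confirm the exact keys your cognee version reads in the configuration reference.

```
# Illustrative .env sketch -- variable names are assumptions,
# check the configuration reference before relying on them.
LLM_API_KEY="sk-..."              # OpenAI-compatible key (default provider)
VECTOR_DB_PROVIDER="pgvector"
GRAPH_DATABASE_PROVIDER="neo4j"
DB_PROVIDER="postgres"
```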

Vector Stores

Cognee supports multiple vector store backends, which handle vector embeddings for semantic search and context retrieval:

  • LanceDB (default local vector database)
  • PGVector (PostgreSQL extension)
  • Qdrant
  • Weaviate
  • Milvus
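Because backends are chosen through configuration rather than code changes, switching vector stores is typically a matter of setting the right variables before cognee is imported. The variable names and values below are assumptions to verify against the configuration reference:

```python
import os

# Select a vector store backend before importing cognee. The variable
# name and the accepted provider strings are assumptions -- confirm
# them in the configuration reference for your cognee version.
os.environ.setdefault("VECTOR_DB_PROVIDER", "qdrant")

# Remote backends such as Qdrant also need connection details, e.g.:
os.environ.setdefault("VECTOR_DB_URL", "http://localhost:6333")
```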

Graph Databases

Cognee builds a knowledge graph from extracted entities and relationships:

  • NetworkX (default in-memory graph database)
  • Kuzu (performant graph database)
  • Neo4j
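The graph backend is selected the same way. As a hedged sketch (variable names are assumptions; an embedded engine like Kuzu needs only a local path, while Neo4j would need a connection URL and credentials):

```python
import os

# Choose the graph backend; the variable names and values are
# assumptions -- verify them against the configuration reference.
os.environ.setdefault("GRAPH_DATABASE_PROVIDER", "kuzu")

# Kuzu runs embedded, so a local directory for its database files suffices.
os.environ.setdefault(
    "GRAPH_DATABASE_PATH",
    os.path.expanduser("~/.cognee/kuzu"),
)
```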

Relational Databases

Cognee supports:

  • SQLite (default local relational database)
  • PostgreSQL

LLM Providers and Cognitive Services

Cognee leverages LLMs to process, classify, and summarize text. By default, it expects an OpenAI-compatible API key but can also integrate with other providers.

Remote Providers:

  • OpenAI
  • Anthropic
  • Gemini
  • Anyscale
  • OpenRouter

Local Providers:

  • Ollama (recommended: phi4, llama3.3 70b-instruct-q3_K_M, deepseek-r1:32b)
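To run fully locally with Ollama, the LLM provider is again configured through the environment. This is a sketch under assumptions: the variable names (`LLM_PROVIDER`, `LLM_MODEL`, `LLM_ENDPOINT`) and the endpoint path should be checked against the configuration reference before use.

```python
import os

# Point cognee at a local Ollama server instead of OpenAI.
# Variable names and values are assumptions -- confirm the exact
# keys your cognee version reads in the configuration reference.
os.environ.setdefault("LLM_PROVIDER", "ollama")
os.environ.setdefault("LLM_MODEL", "phi4")
os.environ.setdefault("LLM_ENDPOINT", "http://localhost:11434/v1")
```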

Visualization

Cognee ships with a default graph visualization, but you can also use:

  • Graphistry for advanced graph visualization with NetworkX
  • Neo4j browser interface
  • Kuzu native browser interface (also integrates with GDotV and yWorks)

Telemetry

Cognee includes robust telemetry support to help you monitor, analyze, and improve your workflows. Telemetry tools provide detailed insights into system performance, user interactions, and data flows, enabling continuous optimization.

Current Support

Langfuse: Integration with Langfuse enables real-time monitoring and logging for AI-powered workflows, helping you trace and debug pipeline performance effectively.
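Enabling the Langfuse integration usually comes down to supplying the standard Langfuse client settings. The three variables below are the ones the Langfuse client conventionally reads; whether cognee picks them up automatically is an assumption to verify in the telemetry documentation, and the key values are hypothetical placeholders.

```python
import os

# Standard Langfuse client settings. Whether cognee reads these
# automatically is an assumption -- verify in the telemetry docs.
os.environ.setdefault("LANGFUSE_PUBLIC_KEY", "pk-lf-...")  # hypothetical placeholder
os.environ.setdefault("LANGFUSE_SECRET_KEY", "sk-lf-...")  # hypothetical placeholder
os.environ.setdefault("LANGFUSE_HOST", "https://cloud.langfuse.com")
```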

Upcoming Support

LangGraph: Planned integration with LangGraph will add advanced graph-based telemetry capabilities, allowing for more detailed visualization and analysis of system behaviors and relationships.

Stay tuned for updates as we expand telemetry support to provide even greater observability and insights into your cognee-powered applications.