Infrastructure
System Requirements
- Python Environment: Cognee supports Python 3.9+
- Node.js & npm: Required if you intend to run the frontend UI locally
Core Infrastructure Components
Configure these components through environment variables or a .env file.
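If you keep these settings in a .env file, a minimal sketch of loading them before importing cognee could look like the following. This assumes the python-dotenv package; the exact variable names cognee reads can vary by version, so check your installation's .env template:

```python
# Minimal sketch: load configuration from a local .env file before importing cognee.
# Assumes python-dotenv is installed (pip install python-dotenv).
from dotenv import load_dotenv

load_dotenv()  # copies KEY=value pairs from ./.env into os.environ

import cognee  # cognee can then read its settings from the environment
```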
Vector Stores
Cognee supports multiple vector store backends, which handle vector embeddings for semantic search and context retrieval:
- LanceDB (default local vector database)
- PGVector (PostgreSQL extension)
- Qdrant
- Weaviate
- Milvus
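As an illustration, switching from the default LanceDB to a hosted Qdrant cluster is typically a matter of pointing the vector-store settings at that cluster. The variable names below are assumptions modeled on cognee's .env template and may differ in your version:

```python
import os

# Illustrative values only; verify the variable names against your cognee version.
os.environ["VECTOR_DB_PROVIDER"] = "qdrant"                    # default is "lancedb"
os.environ["VECTOR_DB_URL"] = "https://your-qdrant-host:6333"  # hypothetical endpoint
os.environ["VECTOR_DB_KEY"] = "your-qdrant-api-key"            # hypothetical credential

import cognee  # with these set, embeddings go to Qdrant instead of the local LanceDB
```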
Graph Databases
Cognee builds a knowledge graph from extracted entities and relationships:
- NetworkX (default; in-memory Python graph library)
- Kuzu (fast, embedded graph database)
- Neo4j
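For example, pointing cognee at a running Neo4j instance instead of the default NetworkX store could look like the sketch below. The variable names are assumptions based on cognee's .env template; confirm them for your version:

```python
import os

# Illustrative only; confirm the variable names for your cognee version.
os.environ["GRAPH_DATABASE_PROVIDER"] = "neo4j"             # default is "networkx"
os.environ["GRAPH_DATABASE_URL"] = "bolt://localhost:7687"  # hypothetical Neo4j endpoint
os.environ["GRAPH_DATABASE_USERNAME"] = "neo4j"
os.environ["GRAPH_DATABASE_PASSWORD"] = "your-password"

import cognee  # the knowledge graph is then persisted in Neo4j
```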
Relational Databases
Cognee supports:
- SQLite (default local relational database)
- PostgreSQL
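Switching the relational layer from the default SQLite file to PostgreSQL follows the same pattern; the variable names below are assumptions and may differ by version:

```python
import os

# Illustrative only; check your cognee version's .env template for the exact names.
os.environ["DB_PROVIDER"] = "postgres"  # default is "sqlite"
os.environ["DB_HOST"] = "localhost"
os.environ["DB_PORT"] = "5432"
os.environ["DB_NAME"] = "cognee_db"     # hypothetical database name
os.environ["DB_USERNAME"] = "cognee"
os.environ["DB_PASSWORD"] = "your-password"

import cognee
```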
LLM Providers and Cognitive Services
Cognee leverages LLMs to process, classify, and summarize text. By default, it expects an OpenAI-compatible API key but can also integrate with other providers.
Remote Providers:
- OpenAI
- Anthropic
- Gemini
- Anyscale
- OpenRouter
Local Providers:
- Ollama (recommended: phi4, llama3.3 70b-instruct-q3_K_M, deepseek-r1:32b)
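A minimal sketch of both setups is shown below: the default case only needs an OpenAI-compatible API key, while a local Ollama setup points cognee at the local endpoint instead. The variable names are assumptions drawn from cognee's .env template and may differ in your version:

```python
import os

# Default remote setup: an OpenAI-compatible API key (illustrative variable name).
os.environ["LLM_API_KEY"] = "your-openai-api-key"

# Alternative local setup via Ollama (hypothetical values; uncomment to use):
# os.environ["LLM_PROVIDER"] = "ollama"
# os.environ["LLM_MODEL"] = "phi4"
# os.environ["LLM_ENDPOINT"] = "http://localhost:11434/v1"

import cognee
```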
Visualization
We provide a default visualization out of the box, but you can also use:
- Graphistry for advanced graph visualization with NetworkX
- Neo4j Browser
- Kuzu's native interface
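For the Graphistry option, one possible approach is to export the graph as an edge list and push it to Graphistry as a DataFrame. This is a sketch only: it assumes a (free) Graphistry account and uses a stand-in NetworkX graph where you would substitute the graph cognee built:

```python
import networkx as nx
import pandas as pd
import graphistry  # pip install graphistry; requires a Graphistry account

# Stand-in graph for illustration; replace with the NetworkX graph produced by cognee.
G = nx.karate_club_graph()
edges = pd.DataFrame(list(G.edges()), columns=["src", "dst"])

graphistry.register(api=3, username="your-username", password="your-password")
graphistry.bind(source="src", destination="dst").edges(edges).plot()  # opens an interactive view
```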
Telemetry
Cognee includes telemetry support to help you monitor, analyze, and improve your workflows. Telemetry tools give insight into system performance, user interactions, and data flows, enabling continuous optimization.
Current Support
Langfuse: Integration with Langfuse enables real-time monitoring and logging for AI-powered workflows, helping you trace and debug pipeline performance effectively.
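Langfuse reads its credentials from its standard environment variables, so enabling tracing is mostly a matter of setting those before running your pipelines. Whether your cognee version needs an additional flag to select Langfuse as the monitoring tool depends on the release, so treat this as a sketch:

```python
import os

# Langfuse's standard credential variables (documented by the Langfuse SDK).
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-your-public-key"
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-your-secret-key"
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"  # or your self-hosted URL

import cognee  # with these set, cognee's Langfuse integration can emit traces
```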
Upcoming Support
LangGraph: Planned integration with LangGraph will add graph-based telemetry capabilities, allowing for more detailed visualization and analysis of system behaviors and relationships.
Stay tuned for updates as we expand telemetry support to provide even greater observability and insights into your cognee-powered applications.