Core Infrastructure Components
- Vector Stores: Cognee supports multiple vector store backends, which handle vector embeddings for semantic search and context retrieval. Supported vector stores include:
  - LanceDB (default local vector database)
  - Qdrant
  - Weaviate
  - PGVector (PostgreSQL extension)
  - Milvus
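As a minimal sketch, the backend is typically selected through environment variables set before Cognee runs. The variable names below (`VECTOR_DB_PROVIDER`, `VECTOR_DB_URL`, `VECTOR_DB_KEY`) are illustrative assumptions; verify the exact keys against the Cognee configuration docs for your version.

```python
import os

# Hypothetical variable names for illustration; confirm the exact keys in
# the Cognee documentation before relying on them.
os.environ["VECTOR_DB_PROVIDER"] = "qdrant"            # or "lancedb", "weaviate", "pgvector", "milvus"
os.environ["VECTOR_DB_URL"] = "http://localhost:6333"  # connection URL for the chosen backend
os.environ["VECTOR_DB_KEY"] = "your-qdrant-api-key"    # credential, if the backend requires one

# With no overrides set, Cognee falls back to LanceDB on local disk.
print(os.environ["VECTOR_DB_PROVIDER"])
```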
- Graph Databases: Cognee builds a knowledge graph from extracted entities and relationships. Supported graph stores include:
  - NetworkX (default in-memory graph)
  - Neo4j (for production-scale graph queries and persistence)
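A sketch of pointing Cognee at a Neo4j instance instead of the default NetworkX backend. The variable names are assumptions to be checked against the Cognee configuration docs; the `bolt://` URL and port 7687 are Neo4j's standard defaults.

```python
import os

# Illustrative Neo4j settings; variable names are assumptions, so check
# them against the Cognee configuration docs for your version.
os.environ["GRAPH_DATABASE_PROVIDER"] = "neo4j"            # default is the in-memory NetworkX backend
os.environ["GRAPH_DATABASE_URL"] = "bolt://localhost:7687" # Neo4j's standard Bolt endpoint
os.environ["GRAPH_DATABASE_USERNAME"] = "neo4j"
os.environ["GRAPH_DATABASE_PASSWORD"] = "change-me"

print(os.environ["GRAPH_DATABASE_PROVIDER"])
```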
- Relational Databases: Cognee supports:
  - SQLite (default local relational database)
  - PostgreSQL
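Switching from the SQLite default to PostgreSQL follows the same environment-variable pattern. Again, the variable names below are illustrative assumptions; port 5432 is PostgreSQL's standard default.

```python
import os

# Illustrative PostgreSQL settings; variable names are assumptions to be
# verified against the Cognee docs. With nothing set, SQLite is used.
os.environ["DB_PROVIDER"] = "postgres"
os.environ["DB_HOST"] = "localhost"
os.environ["DB_PORT"] = "5432"       # PostgreSQL's default port
os.environ["DB_NAME"] = "cognee_db"  # hypothetical database name
os.environ["DB_USERNAME"] = "cognee"
os.environ["DB_PASSWORD"] = "change-me"
```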
- LLM Providers and Cognitive Services: Cognee leverages LLMs to process, classify, and summarize text. By default it expects an OpenAI-compatible API key, but it can also integrate with other providers such as Anyscale or Ollama. Configure these in environment variables or a `.env` file.
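A sketch of the provider settings as environment variables; the same values can live in a `.env` file loaded at startup. The variable names and model identifier are assumptions, so verify them against the Cognee docs for your version.

```python
import os

# Illustrative LLM settings; names and values are placeholders, not the
# definitive Cognee configuration keys.
os.environ["LLM_API_KEY"] = "sk-..."     # OpenAI-compatible API key (placeholder)
os.environ["LLM_PROVIDER"] = "openai"    # or another supported provider, e.g. "ollama"
os.environ["LLM_MODEL"] = "gpt-4o-mini"  # model identifier for the chosen provider
```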
- Visualization: Optionally, integrate Graphistry for advanced visualization of NetworkX graphs.
System Requirements
- Python Environment: Cognee supports Python 3.9+.
- Container Runtime: For Docker-based deployments, you’ll need Docker and Docker Compose.
- Node.js & npm: Required if you intend to run the frontend UI locally.
- Database Services: Depending on your chosen backend, ensure you have access to a running PostgreSQL, Neo4j, Qdrant, or Weaviate instance if not using defaults.
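The checklist above can be verified with a short preflight script. The binary names are the standard CLI names; adjust for your platform or package manager.

```python
import shutil
import sys

# Preflight check for the system requirements listed above.
checks = {
    "Python >= 3.9": sys.version_info >= (3, 9),
    "docker": shutil.which("docker") is not None,
    "node": shutil.which("node") is not None,
    "npm": shutil.which("npm") is not None,
}
for name, ok in checks.items():
    print(f"{name}: {'OK' if ok else 'missing'}")
```

Database services (PostgreSQL, Neo4j, Qdrant, Weaviate) are only needed when you override the local defaults, so they are left out of this check.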