Vector stores hold embeddings for semantic similarity search. They enable Cognee to find conceptually related content based on meaning rather than exact text matches.
New to configuration? See the Setup Configuration Overview for the complete workflow: install extras → create .env → choose providers → handle pruning.

Supported Providers

Cognee supports multiple vector store options:
  • LanceDB — File-based vector store, works out of the box (default)
  • PGVector — Postgres-backed vector storage with pgvector extension
  • Qdrant — High-performance vector database and similarity search engine
  • ChromaDB — HTTP server-based vector database
  • FalkorDB — Hybrid graph + vector database
  • Neptune Analytics — Amazon Neptune Analytics hybrid solution

Configuration

Set these environment variables in your .env file:
  • VECTOR_DB_PROVIDER — The vector store provider (lancedb, pgvector, qdrant, chromadb, falkordb, neptune_analytics)
  • VECTOR_DB_URL — Database URL or connection string
  • VECTOR_DB_KEY — Authentication key (provider-specific)
  • VECTOR_DB_PORT — Database port (for some providers)
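For example, a minimal .env that keeps the default LanceDB store only needs the provider line, while a server-backed store such as ChromaDB also needs a URL and token (placeholder values shown):
# Default, file-based store
VECTOR_DB_PROVIDER="lancedb"

# Server-backed store (placeholders; see the setup guides below)
# VECTOR_DB_PROVIDER="chromadb"
# VECTOR_DB_URL="http://localhost:3002"
# VECTOR_DB_KEY="<your_token>"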

Setup Guides

LanceDB
LanceDB is file-based and requires no additional setup. It’s well suited for local development and single-user scenarios.
VECTOR_DB_PROVIDER="lancedb"
# Optional, can be a path or URL. Defaults to <SYSTEM_ROOT_DIRECTORY>/databases/cognee.lancedb
# VECTOR_DB_URL=/absolute/or/relative/path/to/cognee.lancedb
Installation: LanceDB is included by default with Cognee; no additional installation is required.
Data Location: Vectors are stored in a local directory. If VECTOR_DB_URL is empty, the store defaults to a path under the Cognee system directory.
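Because LanceDB needs no server, a quick smoke test only requires selecting the provider and running a pipeline. The sketch below assumes the async cognee.add and cognee.cognify entry points from the core Cognee API and sets the environment variables before importing cognee; the path override is purely illustrative.
import asyncio
import os

# Optional: LanceDB is already the default provider
os.environ["VECTOR_DB_PROVIDER"] = "lancedb"
# os.environ["VECTOR_DB_URL"] = "./cognee.lancedb"  # uncomment to pin a local path (illustrative)

import cognee  # imported after the env vars so the settings are picked up

async def main():
    await cognee.add("LanceDB stores Cognee's embeddings as local files.")
    await cognee.cognify()  # builds the knowledge graph and writes embeddings to LanceDB

asyncio.run(main())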

PGVector
PGVector stores vectors inside your Postgres database using the pgvector extension.
VECTOR_DB_PROVIDER="pgvector"
# Uses the same Postgres connection as your relational DB (DB_HOST, DB_PORT, DB_NAME, DB_USERNAME, DB_PASSWORD)
Installation: Install the Postgres extras:
pip install "cognee[postgres]"
# or for binary version
pip install "cognee[postgres-binary]"
Docker Setup: Use the built-in Postgres with pgvector:
docker compose --profile postgres up -d
Note: If using your own Postgres, ensure CREATE EXTENSION IF NOT EXISTS vector; is available in the target database.
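Assuming the relational layer points at the same Postgres instance (the variable names come from the note above; DB_PROVIDER as the relational switch is an assumption to verify against the relational database docs), a combined .env might look like this, with placeholder credentials:
VECTOR_DB_PROVIDER="pgvector"
DB_PROVIDER="postgres"
DB_HOST="localhost"
DB_PORT="5432"
DB_NAME="cognee_db"
DB_USERNAME="cognee"
DB_PASSWORD="cognee"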

Qdrant
Qdrant requires a running instance of the Qdrant server.
VECTOR_DB_PROVIDER="qdrant"
VECTOR_DB_URL="http://localhost:6333"
Installation: Qdrant is supported through a community adapter, so install the community package:
pip install cognee-community-vector-adapter-qdrant
Configuration: To make Cognee use Qdrant, register the adapter before use with the following import:
from cognee_community_vector_adapter_qdrant import register
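Putting that together, a minimal setup sketch might look like the following; the exact ordering of the adapter import relative to the cognee import is an assumption, so check the adapter's documentation:
import os

# Per the line above, importing register wires the Qdrant adapter into Cognee
from cognee_community_vector_adapter_qdrant import register  # noqa: F401

os.environ["VECTOR_DB_PROVIDER"] = "qdrant"
os.environ["VECTOR_DB_URL"] = "http://localhost:6333"
# os.environ["VECTOR_DB_KEY"] = "<api_key>"  # typically only needed for Qdrant Cloud

import cognee  # import Cognee after registration and configuration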
For more details on setting up Qdrant, see the adapter's own documentation.
Docker Setup: Start the Qdrant service:
docker run -p 6333:6333 -p 6334:6334 \
    -v "$(pwd)/qdrant_storage:/qdrant/storage:z" \
    qdrant/qdrant
Access: The database listens on port 6333 by default, and the Qdrant dashboard is available at localhost:6333/dashboard.

ChromaDB
ChromaDB requires a running Chroma server and an authentication token.
VECTOR_DB_PROVIDER="chromadb"
VECTOR_DB_URL="http://localhost:3002"
VECTOR_DB_KEY="<your_token>"
Installation: Install ChromaDB extras:
pip install "cognee[chromadb]"
# or directly
pip install chromadb
Docker Setup: Start the bundled ChromaDB server:
docker compose --profile chromadb up -d
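Once the server is up, a quick way to confirm it is reachable before pointing Cognee at it is the chromadb client's heartbeat call; this is a sketch, and how the token is enforced depends on how the bundled server is configured:
import chromadb

# Connect to the ChromaDB server started above (the bundled compose profile uses port 3002)
client = chromadb.HttpClient(host="localhost", port=3002)

# heartbeat() returns a timestamp if the server is reachable; if the server enforces
# token auth, pass credentials according to your chromadb client version's auth settings
print(client.heartbeat())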

FalkorDB
FalkorDB can serve as both a graph store and a vector store, providing a hybrid solution.
VECTOR_DB_PROVIDER="falkordb"
VECTOR_DB_URL="localhost"
VECTOR_DB_PORT="6379"
Installation: FalkorDB is supported through a community adapter, so install the community package:
pip install cognee-community-hybrid-adapter-falkor
Configuration: To make Cognee use FalkorDB, register the adapter before use with the following import:
from cognee_community_hybrid_adapter_falkor import register
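As with Qdrant, a minimal setup sketch is shown below; the import ordering relative to cognee is an assumption, so check the adapter's documentation:
import os

# Per the line above, importing register makes the FalkorDB hybrid adapter available to Cognee
from cognee_community_hybrid_adapter_falkor import register  # noqa: F401

os.environ["VECTOR_DB_PROVIDER"] = "falkordb"
os.environ["VECTOR_DB_URL"] = "localhost"
os.environ["VECTOR_DB_PORT"] = "6379"

import cognee  # import Cognee after registration and configuration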
For more details on setting up FalkorDB, see the adapter's own documentation.
Docker Setup: Start the FalkorDB service:
docker run -p 6379:6379 -p 3000:3000 -it --rm falkordb/falkordb:edge
Access: Default ports are 6379 (DB) and 3000 (UI).

Neptune Analytics
Use Amazon Neptune Analytics as a hybrid vector + graph backend.
VECTOR_DB_PROVIDER="neptune_analytics"
VECTOR_DB_URL="neptune-graph://<GRAPH_ID>"
# AWS credentials via environment or default SDK chain
Installation: Install Neptune extras:
pip install "cognee[neptune]"
Note: The URL must start with neptune-graph://, and AWS credentials should be configured via environment variables or the default AWS SDK credential chain.
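A sketch of the relevant .env entries; the graph ID is a placeholder, and the AWS variables are the standard SDK credential variables, which can equally come from a shared credentials file or an IAM role:
VECTOR_DB_PROVIDER="neptune_analytics"
VECTOR_DB_URL="neptune-graph://<GRAPH_ID>"
AWS_ACCESS_KEY_ID="<access_key>"
AWS_SECRET_ACCESS_KEY="<secret_key>"
AWS_REGION="us-east-1"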

Important Considerations

Ensure EMBEDDING_DIMENSIONS matches your vector store collection/table schemas:
  • PGVector column size
  • LanceDB vector column size
  • ChromaDB collection schema
Changing dimensions requires recreating collections.
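For instance, if your embedding model produces 3072-dimensional vectors, set the dimension before the first collections are created; the model variable below is an assumed name, so cross-check it with the Embeddings section:
# Must match the vector size your embedding model actually produces
EMBEDDING_DIMENSIONS="3072"
# EMBEDDING_MODEL="text-embedding-3-large"  # assumed variable name; see the Embeddings section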

Provider comparison:
| Provider          | Setup             | Performance | Use Case                 |
|-------------------|-------------------|-------------|--------------------------|
| LanceDB           | Zero setup        | Good        | Local development        |
| PGVector          | Postgres required | Excellent   | Production with Postgres |
| ChromaDB          | Server required   | Good        | Dedicated vector store   |
| FalkorDB          | Server required   | Good        | Hybrid graph + vector    |
| Neptune Analytics | AWS required      | Excellent   | Cloud hybrid solution    |

Community-Maintained Providers

Additional vector stores are available through community-maintained adapters:
  • Qdrant — Vector search engine with cloud and self-hosted options
  • FalkorDB — Hybrid vector and graph store
  • Milvus, Pinecone, Weaviate, Redis, and more — See all community adapters

Notes

  • Embedding Integration: Vector stores use your embedding engine from the Embeddings section
  • Dimension Matching: Keep EMBEDDING_DIMENSIONS consistent between embedding provider and vector store
  • Performance: Local providers such as LanceDB are simpler to run, while server-based and cloud providers offer better scalability