QUICKSTART
This guide will help you get started with Cognee quickly and efficiently. Follow these step-by-step instructions to set up your environment, install Cognee, and run your first knowledge graph operations.
1. Preparation
Cognee runs on Python versions 3.9 to 3.12, so make sure you have a suitable interpreter installed.
Before running Cognee you have to set up your environment. The easiest way to do this is by editing the .env file in the directory you are working in.
Cognee relies on third-party LLM providers, and you have a wide choice of providers to use in your workflow.
The simple way
Just provide your OpenAI API key if you already have one. It will be used both for the LLM and for the embeddings.
echo 'LLM_API_KEY="your_api_key"' > .env
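If you prefer not to keep a .env file around, setting the same variable in the process environment before importing Cognee should have the same effect (this is an assumption about how the .env-based configuration is loaded). A minimal sketch:

import os

# Alternative to the .env file: set the key for this process only.
os.environ["LLM_API_KEY"] = "your_api_key"

import cognee  # imported after the variable is set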
The free way
If you don’t have an OpenAI API key and would like to try Cognee with free services, register for an OpenRouter account and get a free API key. OpenRouter does not host embedding models, so you have to fall back to a local embedding provider for that. Your .env should then look something like this:
LLM_API_KEY="your_api_key"
LLM_PROVIDER="custom"
LLM_MODEL="openrouter/google/gemini-2.0-flash-thinking-exp-1219:free"
LLM_ENDPOINT="https://openrouter.ai/api/v1"
EMBEDDING_PROVIDER="fastembed"
EMBEDDING_MODEL="sentence-transformers/all-MiniLM-L6-v2"
EMBEDDING_DIMENSIONS=384
EMBEDDING_MAX_TOKENS=256
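Because fastembed runs the embedding model locally, you can sanity-check it independently of Cognee. A minimal sketch, assuming fastembed's TextEmbedding API and that the model named above is supported:

from fastembed import TextEmbedding

# Downloads the model on first use and embeds text locally, no API key required.
model = TextEmbedding(model_name="sentence-transformers/all-MiniLM-L6-v2")
vectors = list(model.embed(["hello world"]))
print(len(vectors[0]))  # expected to print 384, matching EMBEDDING_DIMENSIONS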
2. Install cognee
In this example we will use Poetry to install Cognee. You can also install it with pip or uv.
poetry init
poetry add cognee
# poetry add fastembed ## if you are going the free way
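If you would rather not use Poetry, the equivalent installation with pip or uv should look like this (assuming the package is published on PyPI under the name cognee; add fastembed only if you are going the free way):

pip install cognee fastembed
# or, inside a uv-managed project
uv add cognee fastembed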
Cognee stores data in a relational database, a vector database, and a graph database. We provide a variety of options for each store, so you can choose the right one for your purposes. To see more, check out our infrastructure page.
For this simple example we will use our default set of SQLite, LanceDB, and NetworkX, which doesn’t require extra servers or registration with third parties.
3. Basic usage
This minimal example shows how to add content, process it, and perform a search:
import cognee
import asyncio

async def main():
    # Add sample content
    text = "Natural language processing (NLP) is a subfield of computer science."
    await cognee.add(text)

    # Process with LLMs to build the knowledge graph
    await cognee.cognify()

    # Search the knowledge graph
    results = await cognee.search(
        query_text="Tell me about NLP"
    )

    for result in results:
        print(result)

if __name__ == '__main__':
    asyncio.run(main())
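If you re-run the script while experimenting, the previously added content stays in the local stores. A minimal sketch for starting from a clean slate, assuming the prune helpers exposed as cognee.prune in recent versions, would go at the top of main() before cognee.add():

    # Reset previously ingested data and system state (assumed cognee.prune API)
    await cognee.prune.prune_data()
    await cognee.prune.prune_system(metadata=True)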
4. Further resources
- If you want to try Cognee in an interactive notebook environment, check out our Cognee GraphRAG Simple Example Colab.
- You can also try our Dockerized server via the API.
Join the Conversation!
Have questions? Join our community now to connect with professionals, share insights, and get your questions answered!