Continue Integration
Cognee enables Continue to use knowledge graphs to index your codebase and retrieve relevant pieces of code for better LLM context.
What is Continue?
Continue is a leading open-source AI code assistant. It lets you connect any model and any context provider to create custom autocomplete and chat experiences inside your IDE.
How to integrate Cognee into Continue?
To integrate Cognee, we need to edit the Continue plugin's config file.
- Open the Continue extension window inside your IDE
- Locate the “Open Continue Config” button (cog icon) and click it (or type “Open Continue Config” in the VSCode Command Palette for quick access)
- Under `contextProviders`, add the new `http` provider:

```json
{
  ...
  "contextProviders": [
    ...
    {
      "name": "http",
      "params": {
        "url": "http://localhost:8000/api/v1/code-pipeline/retrieve",
        "title": "cognee",
        "description": "Cognee code context retrieval",
        "displayTitle": "cognee"
      }
    }
  ]
}
```
- Under `experimental`, add the new MCP server:

```json
"experimental": {
  "modelContextProtocolServers": [
    {
      "transport": {
        "type": "stdio",
        "command": "uv",
        "args": [
          "--directory",
          "{path_to_cognee_repo}/cognee-mcp",
          "run",
          "cognee"
        ]
      }
    }
  ]
}
```
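Since a stray comma or bracket in the config will break Continue silently, it can help to sanity-check the two additions before pasting them in. A minimal sketch in Python (the dict below simply mirrors the snippets above; `{path_to_cognee_repo}` stays a placeholder you must fill in yourself):

```python
import json

# The two additions to the Continue config, mirrored as a Python dict.
config_additions = {
    "contextProviders": [
        {
            "name": "http",
            "params": {
                "url": "http://localhost:8000/api/v1/code-pipeline/retrieve",
                "title": "cognee",
                "description": "Cognee code context retrieval",
                "displayTitle": "cognee",
            },
        }
    ],
    "experimental": {
        "modelContextProtocolServers": [
            {
                "transport": {
                    "type": "stdio",
                    "command": "uv",
                    "args": [
                        "--directory",
                        "{path_to_cognee_repo}/cognee-mcp",
                        "run",
                        "cognee",
                    ],
                }
            }
        ]
    },
}

# Round-trip through JSON to confirm the fragment is well-formed,
# then check the two fields Continue needs to reach Cognee.
parsed = json.loads(json.dumps(config_additions, indent=2))
assert parsed["contextProviders"][0]["name"] == "http"
server = parsed["experimental"]["modelContextProtocolServers"][0]
assert server["transport"]["command"] == "uv"
```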
Run Cognee and use from Continue
Clone and run Cognee
At this point, we need to run Cognee so that Continue can connect to Cognee's memory.
- Clone the cognee repo from GitHub:

```shell
git clone git@github.com:topoteretes/cognee.git
```

Or

```shell
git clone https://github.com/topoteretes/cognee.git
```
- Navigate to the cognee folder:

```shell
cd cognee
```

- Install dependencies by running:

```shell
poetry install -E codegraph
```

- Activate the virtual environment by running:

```shell
source .venv/bin/activate
```

- Start the Cognee API by running:

```shell
python cognee/api/client.py
```
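Once the API is up, a quick readiness check can confirm that Continue will be able to reach it. A minimal sketch using only the standard library (the helper name and the 2-second timeout are illustrative choices, not part of Cognee's documented API):

```python
from urllib import request
from urllib.error import HTTPError, URLError


def api_reachable(base_url: str, timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at base_url, False otherwise."""
    try:
        with request.urlopen(base_url, timeout=timeout):
            return True
    except HTTPError:
        # The server responded, even if with an error status, so it is up.
        return True
    except (URLError, OSError):
        # Connection refused, DNS failure, or timeout: nothing is listening.
        return False


if __name__ == "__main__":
    # Continue's http provider points at port 8000, so check the same host.
    print(api_reachable("http://localhost:8000"))
```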
Use Cognee in Continue
- Open the Continue extension window
- Index your repo by typing:

```
codify /path/to/repo/you/want/to/index
```

- Use Cognee context retrieval by typing `@cognee` followed by your prompt; cognee will return the relevant context to Continue for further processing.