Continue Integration
Cognee enables Continue to use knowledge graphs to index your codebase and retrieve the most relevant pieces of code as context for the LLM.
What is Continue?
Continue is a leading open-source AI code assistant. It allows you to connect any model and any context provider to create custom autocomplete and chat experiences inside your IDE.
How to integrate Cognee into Continue?
To integrate Cognee, you need to edit the config file of the Continue plugin.
- Open the Continue extension window inside your IDE
- Locate the “Open Continue Config” button (cog icon) and click it (you can also type “Open Continue Config” in the VS Code Command Palette for quick access)
- Under `contextProviders`, add the new `http` provider:
```json
{
  ...
  "contextProviders": [
    ...
    {
      "name": "http",
      "params": {
        "url": "http://localhost:8000/api/v1/code-pipeline/retrieve",
        "title": "cognee",
        "description": "Cognee code context retrieval",
        "displayTitle": "cognee"
      }
    }
  ]
}
```
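Continue's `http` context provider POSTs the user's query as JSON to the configured URL. Once the Cognee API is running (see the next section), you can sanity-check the endpoint from a terminal; the request body below is an assumption modeled on the provider's generic query payload:

```bash
# Smoke test of the endpoint Continue will call.
# The {"query": ...} body shape is an assumption based on Continue's
# generic http context provider, which POSTs the query text as JSON.
curl -X POST http://localhost:8000/api/v1/code-pipeline/retrieve \
  -H "Content-Type: application/json" \
  -d '{"query": "where is the retrieval pipeline defined?"}'
```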
Run Cognee and use it from Continue
Clone and run Cognee
- Clone the cognee repo and open it
- Install dependencies by running `poetry install`
- Activate the virtual environment by running `source .venv/bin/activate`
- Start the Cognee API by running `python cognee/api/client.py` (the full command sequence is sketched below)
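Put together, the steps above look roughly like this; the repository URL is an assumption (`https://github.com/topoteretes/cognee`), and `.venv` assumes Poetry creates the virtual environment inside the project directory:

```bash
# Assumed repository URL; adjust if you are cloning a fork.
git clone https://github.com/topoteretes/cognee.git
cd cognee

# Install dependencies and activate the project virtual environment
# (assumes Poetry places the virtualenv in .venv inside the project).
poetry install
source .venv/bin/activate

# Start the Cognee API; the Continue config above points at port 8000.
python cognee/api/client.py
```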
Use Cognee in Continue
- Open the Continue extension window
- Type `@cognee` followed by your prompt
- Cognee will return the relevant context back to Continue for further processing
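For example, a prompt such as `@cognee How is the code graph built in this repo?` asks Cognee to retrieve the code snippets most relevant to that question and pass them to the model as additional context.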