Cursor Integration
If you’re looking to unify your code exploration in Python repositories, Cursor provides an intuitive interface to interact with cognee’s MCP server directly.
If you are using Visual Studio Code, you can explore our Roo or Cline integration guides instead.
By integrating Cursor and cognee, you can effortlessly:
- Generate knowledge graphs from your codebase
- Access code search capabilities
- Explore advanced analysis tools without leaving your IDE
Let’s quickly set up cognee’s MCP server in Cursor, ensuring you can query your codebase and retrieve insights.
Why Use Cognee with Cursor?
Cognee specializes in building detailed knowledge graphs and retrieving accurate data based on user queries. Together with Cursor, you get:
- Streamlined Development Experience: Interact with cognee’s code analysis directly from Cursor’s Composer.
- Enhanced Code Understanding: Uncover dependencies and relationships in your codebase easily, and quickly search across large codebases.
- Efficiency Gains: No need to switch between terminals or external apps—simply invoke cognee’s powerful tools from within your IDE.
Prerequisites
Before proceeding, ensure you have the following:
- Cursor installed on your machine.
- A local copy of the cognee repository.
- An LLM API key (the default setup uses OpenAI, e.g., sk-...).
Integration Steps
1. Install Cursor
- Visit the official Cursor website to download Cursor.
- Follow the on-screen instructions to install Cursor on your operating system.
- Once installed, open Cursor to verify everything is functioning properly.
2. Clone and Navigate to Cognee
Open your terminal and run:
git clone https://github.com/topoteretes/cognee
cd cognee/cognee-mcp
3. Install Dependencies
Cognee uses the uv package manager for setup and runtime:
- If you’re on macOS, install uv via Homebrew:
brew install uv
- Move to the cognee-mcp directory and sync dependencies:
uv sync --reinstall
This process ensures cognee’s environment is fully configured and ready to run.
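Before syncing, it can help to confirm that uv is actually on your PATH; a quick check (illustrative, works in any POSIX shell):

```shell
# Check that the uv package manager is available before running `uv sync`
if command -v uv >/dev/null 2>&1; then
  echo "uv found: $(uv --version)"
else
  echo "uv not found - install it first (e.g. 'brew install uv')"
fi
```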
4. Create a Run Script for Cognee
- Inside your preferred scripts directory (e.g., your project directory), create a file named run-cognee.sh.
- Make it executable and add the following contents:
#!/bin/bash
export ENV=local
export TOKENIZERS_PARALLELISM=false
export EMBEDDING_PROVIDER="fastembed"
export EMBEDDING_MODEL="sentence-transformers/all-MiniLM-L6-v2"
export EMBEDDING_DIMENSIONS=384
export EMBEDDING_MAX_TOKENS=256
export LLM_API_KEY=your-API-key
uv --directory /{cognee_root_path}/cognee-mcp run cognee
Remember:
- Update /{cognee_root_path}/cognee-mcp to the full path of your cloned cognee repo.
- Replace your-API-key with your actual API key.
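The script above can also be written and marked executable straight from the terminal; a sketch (the path placeholder and API key are still yours to fill in):

```shell
# Write the run script from this step and mark it executable
# (the {cognee_root_path} placeholder and API key must be replaced by hand)
cat > run-cognee.sh <<'EOF'
#!/bin/bash
export ENV=local
export TOKENIZERS_PARALLELISM=false
export EMBEDDING_PROVIDER="fastembed"
export EMBEDDING_MODEL="sentence-transformers/all-MiniLM-L6-v2"
export EMBEDDING_DIMENSIONS=384
export EMBEDDING_MAX_TOKENS=256
export LLM_API_KEY=your-API-key
uv --directory /{cognee_root_path}/cognee-mcp run cognee
EOF
chmod +x run-cognee.sh
```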
5. Add a New MCP Server in Cursor
- Launch Cursor, and click the Gear Icon to open Settings.
- Navigate to Features > MCP.
- Click on + Add New MCP Server.
In the Add MCP Server modal, set the following:
- Type: Stdio
- Name: Cognee (or any nickname you prefer)
- Command: Point this to your script:
sh /{script_root_path}/run-cognee.sh
Save this configuration. You should see your new entry in the MCP settings list.
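Depending on your Cursor version, MCP servers can also be declared in a JSON config file (for example ~/.cursor/mcp.json); a sketch, assuming the mcpServers format used by recent Cursor releases, with the same placeholder script path:

```json
{
  "mcpServers": {
    "cognee": {
      "command": "sh",
      "args": ["/{script_root_path}/run-cognee.sh"]
    }
  }
}
```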
6. Refresh and Verify Cognee in Cursor
- In the MCP settings, locate your newly added cognee server.
- Click the Refresh button (often in the top-right corner of the server’s card) to have Cursor attempt to connect.
If all goes well, you should see a list of available tools from cognee (e.g., codify). This indicates cognee’s MCP server is running correctly, and Cursor has successfully loaded the server’s capabilities.
7. Use Cognee in Cursor’s Composer
- Open the Composer in Cursor.
- Make sure Agent (not Ask) is selected.
- Issue a prompt referencing cognee tools. Cursor will pass your request to the cognee MCP server, and the results will be displayed directly in the Composer.
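As an illustration, a prompt along these lines could be issued in the Composer (the wording is hypothetical; the tools available depend on your cognee version):

```
Use cognee's codify tool to build a code graph of this repository,
then run a CODE search for the functions that handle configuration loading.
```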
Remember: Use the CODE search type to query your code graph.
Tip: For larger codebases, consider incremental indexing or caching to speed up analysis.
You’ve now set up cognee’s MCP server with Cursor! Enjoy a richer, more powerful code exploration experience right inside your IDE.
Join the Conversation!
Have questions or feedback? Join our community to connect with professionals, share insights, and get answers to your questions!