A minimal guide to using cognee.search() to ask questions against your processed datasets. It shows the basic call and explains what each parameter does, so you know which knob to turn.

Before you start:
  • Complete Quickstart to understand basic operations
  • Ensure you have LLM Providers configured for LLM-backed search types
  • Run cognee.cognify(...) to build the graph before searching
  • Make sure the user running the search has read permission on at least one dataset

Code in Action

import asyncio
import cognee

async def main():
    # Make sure you've already run cognee.cognify(...) so the graph has content
    answers = await cognee.search(
        query_text="What are the main themes in my data?"
    )
    for answer in answers:
        print(answer)

asyncio.run(main())
SearchType.GRAPH_COMPLETION is the default, so you get an LLM-backed answer plus supporting context as soon as you have data in your graph.

What Just Happened

Because no search type was specified, the call fell back to SearchType.GRAPH_COMPLETION, which asks the LLM to answer your question using context retrieved from the knowledge graph. The results come back as a plain list you can iterate through and process as needed.

Parameters Reference

Most examples below assume you are inside an async function. Import helpers when you need them:
from cognee import SearchType
from cognee.modules.engine.models.node_set import NodeSet

Core Parameters