To use an LLM, you must provide a configuration to customize its behavior. If no configuration is supplied, a default configuration is applied and OpenAI is used as the LLM provider. For a comprehensive list of available LLM configuration parameters, please refer to Configuration.
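As a minimal sketch, configuration can be supplied through environment variables before calling cognee. The `LLM_PROVIDER` and `LLM_MODEL` variables appear in the examples below; `LLM_API_KEY` and the placeholder key value are assumptions here, so check the Configuration reference for the exact variable names your version supports.

```python
import os

# Minimal LLM configuration via environment variables.
# LLM_API_KEY and the placeholder value are assumptions for illustration.
os.environ["LLM_PROVIDER"] = "openai"
os.environ["LLM_MODEL"] = "gpt-4"
os.environ["LLM_API_KEY"] = "sk-..."  # hypothetical placeholder, use your own key
```

Setting these before the first cognee call ensures the defaults are overridden for the whole session.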
```python
# Example: unstructured output
response = await cognee.search(
    "How is AI transforming healthcare?",
    query_type=SearchType.GRAPH_COMPLETION,
)
# Returns natural language:
# "AI is transforming healthcare through machine learning algorithms
#  that analyze medical images, predict patient outcomes..."
```
```python
import os

# Use different models for different tasks
os.environ["LLM_PROVIDER"] = "openai"
os.environ["LLM_MODEL"] = "gpt-4"  # For complex reasoning
os.environ["EMBEDDING_PROVIDER"] = "openai"
os.environ["EMBEDDING_MODEL"] = "text-embedding-3-small"  # For embeddings

# Alternative: Use local embeddings to reduce costs
# os.environ["EMBEDDING_PROVIDER"] = "fastembed"
```