Observability with Langfuse
Langfuse is an open-source observability and analytics platform for LLM-powered applications. It collects traces, generations, and custom metrics so that you can debug, evaluate, and monitor your AI features in production.

Integration inside Cognee
Cognee ships with Langfuse support as an optional dependency. The integration is intentionally lightweight and consists of just a few lines of code: anywhere in the codebase where we want observability, we add a decorator. The decorator is a thin wrapper around Langfuse and automatically creates a span every time the function is executed. The data is sent to the Langfuse backend specified via environment variables and can be inspected in the Langfuse UI.
Quick start
1. Install cognee with monitoring support
Langfuse is included as part of cognee's monitoring optional dependencies, so you need to install cognee with the monitoring extra:
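A typical installation, assuming the extra is exposed to pip under the name `monitoring` as the text above suggests:

```shell
pip install "cognee[monitoring]"
```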
Cognee currently supports langfuse>=2.32.0,<3.

2. Create a Langfuse project
Create a project at Langfuse Cloud and copy the public and secret keys.

3. Configure environment variables
Export the following environment variables (for example in your .env file):
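The Langfuse Python SDK reads its credentials from these standard variable names; a sketch of a .env file (the key values shown are placeholders):

```shell
LANGFUSE_PUBLIC_KEY=pk-lf-...
LANGFUSE_SECRET_KEY=sk-lf-...
# Defaults to the Langfuse Cloud endpoint; point this at your own
# deployment if you self-host Langfuse.
LANGFUSE_HOST=https://cloud.langfuse.com
```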
BaseConfig reads these values on startup, so nothing else is required. Start Cognee as usual and open the Langfuse UI; traces will appear in real time.
4. Run your regular Cognee workflows
Adding your own spans
You can instrument any function in your codebase. Keyword arguments of langfuse.decorators.observe (name, as_type, metadata, …) are forwarded unchanged; see the Langfuse Python SDK docs for the full reference.
Join the Langfuse and cognee communities on Discord if you have any questions.