# Observability with Langfuse
Langfuse is an open-source observability and analytics platform for LLM-powered applications. It collects traces, generations, and custom metrics so that you can debug, evaluate, and monitor your AI features in production.

## Integration inside Cognee
Cognee ships with Langfuse support as an optional dependency. The integration is intentionally lightweight and consists of just a few lines of code. Anywhere in the codebase where we want observability, we add the `observe()` decorator. The decorator is a thin wrapper around Langfuse and automatically creates a span every time the function is executed. The data is sent to the Langfuse backend specified via environment variables and can be inspected in the Langfuse UI.
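A minimal sketch of what that looks like, assuming the Langfuse v2 decorator API that `cognee[monitoring]` pins (in v2 the import path is `langfuse.decorators`; the function below is illustrative, not actual Cognee code):

```python
# Sketch only: assumes langfuse v2 (`langfuse>=2.32.0,<3`).
try:
    from langfuse.decorators import observe  # v2 import path
except ImportError:
    # No-op stand-in so the sketch runs without langfuse installed.
    def observe(fn=None, **kwargs):
        return fn if fn is not None else (lambda f: f)


@observe()  # creates a Langfuse span each time the function runs
def cognify_documents(docs: list[str]) -> int:
    # Placeholder for real Cognee logic.
    return len(docs)


result = cognify_documents(["doc-a", "doc-b"])
```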
## Quick Start
### 1. Install cognee with monitoring support
Langfuse is included as part of cognee's `monitoring` optional dependencies. You need to install cognee with the `monitoring` extra:
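For example (quoting the extra so shells that expand square brackets, such as zsh, handle it correctly):

```shell
pip install "cognee[monitoring]"
```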
Cognee currently supports `langfuse>=2.32.0,<3`.

### 2. Create a Langfuse project
Create a project at Langfuse Cloud and copy the public and secret keys.

### 3. Configure environment variables
Export the following environment variables (for example in your `.env` file):
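A sketch using Langfuse's standard variable names (`LANGFUSE_PUBLIC_KEY`, `LANGFUSE_SECRET_KEY`, `LANGFUSE_HOST`); the exact set Cognee's BaseConfig reads may differ by version, so verify against your installation:

```shell
LANGFUSE_PUBLIC_KEY=pk-lf-your-public-key
LANGFUSE_SECRET_KEY=sk-lf-your-secret-key
LANGFUSE_HOST=https://cloud.langfuse.com
```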
BaseConfig reads these values on startup, so nothing else is required. Start Cognee as usual and open the Langfuse UI – traces will appear in real time.
### 4. Run your regular Cognee workflows
## Adding your own spans
You can instrument any function in your codebase with the `observe()` decorator. With Langfuse v2 (`cognee[monitoring]` pins `langfuse>=2.32.0,<3`), common options include:
| Kwarg | Description |
|---|---|
| `name` | Override the span name (defaults to the function name) |
| `as_type` | Span type label, e.g. `"generation"` |
| `capture_input` | Whether to capture function arguments |
| `capture_output` | Whether to capture the return value |
| `metadata` | Arbitrary dict attached to the span |
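For example, a sketch assuming the v2 decorator API (the function name and kwarg values below are illustrative):

```python
# Sketch only: assumes langfuse v2 (`langfuse>=2.32.0,<3`).
try:
    from langfuse.decorators import observe  # v2 import path
except ImportError:
    # No-op stand-in so the snippet runs without langfuse installed.
    def observe(fn=None, **kwargs):
        return fn if fn is not None else (lambda f: f)


@observe(name="summarize-chunk", as_type="generation", capture_output=True)
def summarize_chunk(text: str) -> str:
    # Real code would call an LLM; here we just truncate.
    return text[:40]
```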
Some integrations use `@observe(workflow=True)`, but that is not part of Cognee's built-in Langfuse integration. Cognee passes decorator kwargs through to Langfuse as-is, so unsupported arguments will fail at runtime. If you see such an error, remove `workflow=True` and use only the arguments supported by your installed Langfuse SDK version.

## OpenTelemetry tracing
Cognee also ships a standalone OpenTelemetry tracing layer that works independently of Langfuse and can export spans to Grafana, Jaeger, Dash0, and any other OTLP-compatible backend. See the OpenTelemetry Tracing page for setup details.

Join the Langfuse and cognee communities on Discord for any questions.