@observe-decorated function and can export them to any OTLP-compatible backend — Grafana Tempo, Jaeger, Dash0, Datadog, Honeycomb, and others.
## How It Works
When tracing is enabled, cognee's `@observe` decorator wraps each function in an OTEL span in addition to any LangFuse trace it creates. The two systems are independent:
| Feature | LangFuse | OTEL |
|---|---|---|
| Enabled by | `LANGFUSE_PUBLIC_KEY` + `LANGFUSE_SECRET_KEY` | `COGNEE_TRACING_ENABLED=true` |
| Data destination | Langfuse cloud / self-hosted | Any OTLP backend (or in-memory) |
| Span type | Langfuse generation / observation | OTEL span |
| Can run together? | Yes | Yes |
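The dual-wrapping behavior can be sketched with a stdlib-only stand-in. The lists below are hypothetical placeholders for the two sinks, not cognee's real internals:

```python
import functools
import time

# Hypothetical stand-ins for the two independent tracing sinks.
langfuse_events = []   # what a LangFuse client would receive
otel_spans = []        # what an OTEL span processor would receive

def observe(func):
    """Sketch: record one span per call in each system, independently."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter_ns()
        try:
            return func(*args, **kwargs)
        finally:
            duration_ms = (time.perf_counter_ns() - start) / 1e6
            # Each sink gets its own record; neither depends on the other.
            langfuse_events.append({"observation": func.__name__})
            otel_spans.append({"name": func.__name__, "duration_ms": duration_ms})
    return wrapper

@observe
def add(a, b):
    return a + b
```

Calling `add(1, 2)` records one entry in each list; disabling either sink would not affect the other.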
## Installation
OTEL support requires OpenTelemetry dependencies. You can install them with either the dedicated `tracing` extra or the broader `monitoring` extra (for example, `pip install "cognee[tracing]"`). These pull in `opentelemetry-sdk` plus the OTLP gRPC and HTTP exporters. No additional packages are needed for the in-memory buffer.
## Quick Start
### 1. Enable tracing via environment variable

Set `COGNEE_TRACING_ENABLED=true` in your environment before starting your application.
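In Python this can be done programmatically, assuming the flag is set before cognee is imported:

```python
import os

# Enable OTEL tracing; set this before your application imports cognee.
os.environ["COGNEE_TRACING_ENABLED"] = "true"
```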
### 2. Export to an OTLP backend (optional)
Point cognee at an OTLP-compatible collector by setting `OTEL_EXPORTER_OTLP_ENDPOINT` (plus `OTEL_EXPORTER_OTLP_HEADERS` for backends that require authentication).

#### Grafana Tempo / Grafana Cloud

Set the endpoint to your Tempo or Grafana Cloud OTLP endpoint and supply any auth headers your instance requires.

#### Jaeger (local)

Run Jaeger with its OTLP gRPC port (4317) exposed and point the endpoint at it.

#### Dash0

Set the endpoint to your Dash0 ingress endpoint and pass your auth token via `OTEL_EXPORTER_OTLP_HEADERS` (e.g. `Authorization=Bearer <token>`).
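As a sketch, the configuration for a local Jaeger might look like this; the bearer token is a placeholder you would replace for an authenticated backend (Jaeger's default OTLP gRPC port really is 4317, but the header value here is illustrative):

```python
import os

# Local Jaeger: plain gRPC on the default OTLP port, no auth needed.
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "http://localhost:4317"

# Hosted backends (Grafana Cloud, Dash0, ...) typically also need auth
# headers; the token below is a placeholder:
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = "Authorization=Bearer YOUR_TOKEN"
```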
### 3. Using an auto-instrumentation agent
If you launch your application with `opentelemetry-instrument` or an APM agent (Datadog, Dash0, Elastic), it configures its own TracerProvider before your code runs. Cognee detects this and attaches its in-memory exporter to the existing provider instead of creating a new one, so cognee spans appear inside your existing traces alongside the rest of your application's spans. Because the in-memory exporter is still attached, `get_last_trace()` and related helpers continue to work.
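The detection pattern can be sketched abstractly. The classes below are stand-ins for the OpenTelemetry API, not cognee's actual code: attach to an already-configured provider if one exists, otherwise install a new one.

```python
class SDKProvider:
    """Stand-in for a real TracerProvider installed by an APM agent."""
    def __init__(self):
        self.processors = []

    def add_span_processor(self, processor):
        self.processors.append(processor)

class DefaultProvider:
    """Stand-in for OTEL's no-op default provider (nothing configured yet)."""

def attach_in_memory_exporter(current_provider, exporter):
    # If an agent already installed a real provider, reuse it so cognee
    # spans land inside the agent's traces; otherwise create our own.
    if isinstance(current_provider, SDKProvider):
        current_provider.add_span_processor(exporter)
        return current_provider
    provider = SDKProvider()
    provider.add_span_processor(exporter)
    return provider
```

Either way the in-memory exporter ends up registered, which is why the trace helpers keep working under an agent.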
## Programmatic API
You can also control tracing from Python.

### CogneeTrace API

A `CogneeTrace` object (as returned by helpers such as `get_last_trace()`) exposes:
| Method | Returns | Description |
|---|---|---|
| `spans()` | `list[dict]` | Flat list of span dicts sorted by start time |
| `summary()` | `dict` | Root operation, total duration, per-span breakdown, errors |
| `tree()` | `dict` | Hierarchical span tree as nested dicts |
Each span dict contains `name`, `trace_id`, `span_id`, `parent_span_id`, `start_time_ns`, `end_time_ns`, `duration_ms`, `status`, and `attributes`.
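Given span dicts with those keys, a `summary()`-style rollup could look like the sketch below; this illustrates the shape described above, not cognee's implementation, and the sample spans are fabricated:

```python
def summarize(spans):
    """Roll a flat span list up into root operation, total duration, errors."""
    root = next(s for s in spans if s["parent_span_id"] is None)
    return {
        "root_operation": root["name"],
        "total_duration_ms": root["duration_ms"],
        "per_span_ms": {s["name"]: s["duration_ms"] for s in spans},
        "errors": [s["name"] for s in spans if s["status"] == "ERROR"],
    }

# Fabricated example spans using the documented keys (abridged).
spans = [
    {"name": "cognify", "parent_span_id": None, "duration_ms": 120.0, "status": "OK"},
    {"name": "llm_call", "parent_span_id": "abc", "duration_ms": 90.0, "status": "OK"},
]
```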
## Span Attributes
Cognee sets the following semantic attributes on spans:

| Attribute | Description |
|---|---|
| `cognee.span.category` | Value of `as_type=` passed to `@observe` |
| `cognee.llm.model` | LLM model name |
| `cognee.llm.provider` | LLM provider |
| `cognee.search.type` | Search type enum value |
| `cognee.search.query` | Search query text |
| `cognee.pipeline.task_name` | Pipeline task name |
| `cognee.vector.collection` | Vector collection name |
| `cognee.db.system` | Database backend identifier |
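These attributes make span lists straightforward to filter. For example (the span data and model name here are fabricated for illustration):

```python
# Fabricated spans carrying the documented attribute keys.
spans = [
    {"name": "acompletion",
     "attributes": {"cognee.span.category": "llm",
                    "cognee.llm.model": "gpt-4o-mini"}},
    {"name": "search",
     "attributes": {"cognee.span.category": "search",
                    "cognee.search.query": "what is cognee?"}},
]

# Pull out only the LLM spans and the models they used.
llm_models = [s["attributes"]["cognee.llm.model"]
              for s in spans
              if s["attributes"].get("cognee.span.category") == "llm"]
```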
## Environment Variables Reference
| Variable | Default | Description |
|---|---|---|
| `COGNEE_TRACING_ENABLED` | `false` | Set to `true` to enable OTEL tracing |
| `OTEL_SERVICE_NAME` | `cognee` | Service name attached to all spans |
| `OTEL_EXPORTER_OTLP_ENDPOINT` | (none) | OTLP collector endpoint; the gRPC exporter is preferred when both gRPC and HTTP exporters are installed |
| `OTEL_EXPORTER_OTLP_HEADERS` | (none) | Auth headers (e.g. `Authorization=Bearer <token>`) |
Cognee reads `OTEL_EXPORTER_OTLP_ENDPOINT` directly and passes it to the OTLP exporter; other standard `OTEL_EXPORTER_OTLP_*` settings, such as headers, are honored by the underlying exporter library. Cognee tries the OTLP gRPC exporter first and falls back to the OTLP HTTP exporter only if the gRPC exporter package is unavailable. Because the shipped extras install both exporters, the practical default is gRPC, so use a gRPC-compatible OTLP endpoint unless you intentionally run with only the HTTP exporter installed.
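This preference order amounts to an import-time fallback, which can be sketched generically; the module paths in `PREFERRED` are the standard OTLP exporter packages, while the helper itself is a sketch rather than cognee's code:

```python
import importlib

def first_importable(candidates):
    """Return the first module in candidates that imports successfully."""
    for name in candidates:
        try:
            return importlib.import_module(name)
        except ImportError:
            continue
    raise ImportError(f"none of {candidates} could be imported")

# gRPC exporter is preferred; HTTP is the fallback.
PREFERRED = (
    "opentelemetry.exporter.otlp.proto.grpc.trace_exporter",
    "opentelemetry.exporter.otlp.proto.http.trace_exporter",
)
```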
## Using OTEL and LangFuse Together
You can combine both systems simultaneously:

- LangFuse captures LLM generation details (prompts, completions, costs, scores).
- OTEL captures end-to-end latency across your entire application with standard span semantics.