Cognee ships a built-in OpenTelemetry (OTEL) tracing layer that works independently of, and optionally alongside, LangFuse. It creates OTEL spans for every @observe-decorated function and can export them to any OTLP-compatible backend — Grafana Tempo, Jaeger, Dash0, Datadog, Honeycomb, and others.

How It Works

When tracing is enabled, cognee’s @observe decorator wraps each function in an OTEL span in addition to any LangFuse trace it creates. The two systems are independent:
| Feature | LangFuse | OTEL |
| --- | --- | --- |
| Enabled by | LANGFUSE_PUBLIC_KEY + LANGFUSE_SECRET_KEY | COGNEE_TRACING_ENABLED=true |
| Data destination | LangFuse Cloud / self-hosted | Any OTLP backend (or in-memory) |
| Span type | LangFuse generation / observation | OTEL span |
| Can run together? | Yes | Yes |
When both are active, the OTEL span wraps the outer function call while LangFuse traces the inner LLM interactions.
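Conceptually, the decorator records a timed span around each call. The sketch below is a simplified stand-in (plain dicts and a module-level list instead of real OTEL spans and exporters) for what an @observe-style wrapper does when tracing is enabled; it is not cognee's actual implementation:

```python
import functools
import time

SPANS = []  # stand-in for cognee's in-memory span buffer

def observe(func):
    """Simplified stand-in for @observe: time the call, record a span-like dict."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter_ns()
        status = "OK"
        try:
            return func(*args, **kwargs)
        except Exception:
            status = "ERROR"
            raise
        finally:
            SPANS.append({
                "name": func.__qualname__,
                "duration_ms": (time.perf_counter_ns() - start) / 1e6,
                "status": status,
            })
    return wrapper

@observe
def add_document(text):
    return len(text)

add_document("hello")  # appends one span-like record to SPANS
```

The real decorator additionally propagates trace context so nested @observe calls become child spans of the outer one.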

Installation

OTEL support requires OpenTelemetry dependencies. You can install them with either the dedicated tracing extra or the broader monitoring extra:
pip install 'cognee[tracing]'
# or, if you are already using Langfuse / monitoring support:
pip install 'cognee[monitoring]'
Both extras install opentelemetry-sdk plus the OTLP gRPC and HTTP exporters. No additional packages are needed for the in-memory buffer.

Quick Start

1. Enable tracing via environment variable

COGNEE_TRACING_ENABLED=true
That’s all that is required to activate in-memory span collection. Spans are buffered in a ring buffer (last 50 traces) and can be read programmatically.
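The "last 50 traces" behavior is that of a bounded ring buffer: once full, each new trace evicts the oldest one. A minimal illustration with collections.deque (not cognee's actual implementation):

```python
from collections import deque

traces = deque(maxlen=50)  # ring buffer: holds at most 50 traces
for i in range(60):
    traces.append(f"trace-{i}")

len(traces)   # 50
traces[0]     # "trace-10": the 10 oldest traces were evicted
```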

2. Export to an OTLP backend (optional)

Point cognee at an OTLP-compatible collector:
Generic OTLP collector:
COGNEE_TRACING_ENABLED=true
OTEL_EXPORTER_OTLP_ENDPOINT=https://your-collector:4317
OTEL_EXPORTER_OTLP_HEADERS=Authorization=Bearer <token>
OTEL_SERVICE_NAME=my-cognee-service   # default: "cognee"
Grafana Cloud Tempo:
COGNEE_TRACING_ENABLED=true
OTEL_EXPORTER_OTLP_ENDPOINT=https://tempo-us-central1.grafana.net:443
OTEL_EXPORTER_OTLP_HEADERS=Authorization=Basic <base64(instanceId:token)>
OTEL_SERVICE_NAME=cognee
Jaeger (local):
COGNEE_TRACING_ENABLED=true
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
OTEL_SERVICE_NAME=cognee
Start Jaeger with the all-in-one image and its OTLP gRPC port (4317) exposed.
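For example, a local all-in-one container can be started like this (image tag and environment flag are illustrative; check the Jaeger documentation for your version):

```shell
# UI on 16686, OTLP gRPC ingest on 4317
docker run --rm \
  -e COLLECTOR_OTLP_ENABLED=true \
  -p 16686:16686 \
  -p 4317:4317 \
  jaegertracing/all-in-one:latest
```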
Dash0:
COGNEE_TRACING_ENABLED=true
OTEL_EXPORTER_OTLP_ENDPOINT=https://ingress.eu-west-1.aws.dash0.com:4317
OTEL_EXPORTER_OTLP_HEADERS=Authorization=Bearer <dash0_auth_token>
OTEL_SERVICE_NAME=cognee

3. Using an auto-instrumentation agent

If you launch your application with opentelemetry-instrument or an APM agent (Datadog, Dash0, Elastic), it configures its own TracerProvider before your code runs. Cognee detects this and attaches its in-memory exporter to the existing provider instead of creating a new one — so cognee spans appear inside your existing trace alongside other spans from your application.
opentelemetry-instrument python my_app.py
No additional cognee configuration is required beyond COGNEE_TRACING_ENABLED=true. If you want spans sent to a remote backend, configure the exporter on your agent or APM itself; cognee only attaches its in-memory exporter in this mode, so get_last_trace() and related helpers continue to work.

Programmatic API

You can also control tracing from Python:
from cognee.modules.observability.trace_context import (
    enable_tracing,
    disable_tracing,
    is_tracing_enabled,
    get_last_trace,
    get_all_traces,
    clear_traces,
)

# Enable with optional console output for debugging
enable_tracing(console_output=True)

# ... run cognee operations ...

trace = get_last_trace()
if trace:
    print(trace.summary())
    # {'operation': 'cognee.observe.main', 'total_duration_ms': 1234.5,
    #  'span_count': 12, 'breakdown': {...}, 'errors': []}

    for span in trace.spans():
        print(span["name"], span["duration_ms"])

disable_tracing()

CogneeTrace API

| Method | Returns | Description |
| --- | --- | --- |
| spans() | list[dict] | Flat list of span dicts sorted by start time |
| summary() | dict | Root operation, total duration, per-span breakdown, errors |
| tree() | dict | Hierarchical span tree as nested dicts |
Each span dict contains: name, trace_id, span_id, parent_span_id, start_time_ns, end_time_ns, duration_ms, status, attributes.
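Because every span dict carries span_id and parent_span_id, the flat spans() list can be stitched into a hierarchy by hand. A sketch equivalent in spirit to tree(), run here on illustrative span dicts:

```python
def build_tree(spans):
    """Group flat span dicts into a parent/child hierarchy via parent_span_id."""
    nodes = {s["span_id"]: {**s, "children": []} for s in spans}
    roots = []
    for node in nodes.values():
        parent = nodes.get(node["parent_span_id"])
        (parent["children"] if parent is not None else roots).append(node)
    return roots

# Illustrative span dicts (field names taken from the table above)
spans = [
    {"span_id": "a", "parent_span_id": None, "name": "cognee.observe.main"},
    {"span_id": "b", "parent_span_id": "a", "name": "cognee.observe.search"},
    {"span_id": "c", "parent_span_id": "b", "name": "cognee.observe.embed"},
]
tree = build_tree(spans)  # one root with nested children
```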

Span Attributes

Cognee sets the following semantic attributes on spans:
| Attribute | Description |
| --- | --- |
| cognee.span.category | Value of as_type= passed to @observe |
| cognee.llm.model | LLM model name |
| cognee.llm.provider | LLM provider |
| cognee.search.type | Search type enum value |
| cognee.search.query | Search query text |
| cognee.pipeline.task_name | Pipeline task name |
| cognee.vector.collection | Vector collection name |
| cognee.db.system | Database backend identifier |
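These attributes make spans easy to filter in post-processing, e.g. to pull out just the LLM calls from a trace's spans() list. A sketch over illustrative span dicts (the attribute values shown are examples, not fixed by cognee):

```python
def spans_with_attribute(spans, key, value=None):
    """Return spans carrying a given attribute (optionally matching a value)."""
    return [
        s for s in spans
        if key in s["attributes"] and (value is None or s["attributes"][key] == value)
    ]

# Illustrative spans() output
spans = [
    {"name": "search", "attributes": {"cognee.search.query": "what is cognee?"}},
    {"name": "llm", "attributes": {"cognee.llm.model": "gpt-4o-mini"}},
    {"name": "db", "attributes": {"cognee.db.system": "lancedb"}},
]
llm_spans = spans_with_attribute(spans, "cognee.llm.model")  # just the LLM span
```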

Environment Variables Reference

| Variable | Default | Description |
| --- | --- | --- |
| COGNEE_TRACING_ENABLED | false | Set to true to enable OTEL tracing |
| OTEL_SERVICE_NAME | cognee | Service name attached to all spans |
| OTEL_EXPORTER_OTLP_ENDPOINT | (none) | OTLP collector endpoint; the gRPC exporter is preferred when both gRPC and HTTP exporters are installed |
| OTEL_EXPORTER_OTLP_HEADERS | (none) | Auth headers (e.g. Authorization=Bearer <token>) |
Cognee reads OTEL_EXPORTER_OTLP_ENDPOINT directly and passes it to the OTLP exporter; other standard OTEL_EXPORTER_OTLP_* settings, such as headers, are honored by the underlying exporter library.
Cognee tries the OTLP gRPC exporter first and falls back to the OTLP HTTP exporter only if the gRPC exporter package is unavailable. Because the shipped extras install both exporters, the practical default is gRPC, so use a gRPC-compatible OTLP endpoint unless you intentionally run with only the HTTP exporter installed.
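The selection logic amounts to a try/except import fallback, roughly like the following sketch of the behavior described above (not cognee's source; it returns None when neither exporter package is installed):

```python
def pick_otlp_exporter():
    """Prefer the gRPC OTLP exporter; fall back to HTTP; None if neither is installed."""
    try:
        from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
        return "grpc", OTLPSpanExporter
    except ImportError:
        pass
    try:
        from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
        return "http", OTLPSpanExporter
    except ImportError:
        return None

choice = pick_otlp_exporter()  # ("grpc", ...) when both extras are installed
```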

Using OTEL and LangFuse Together

You can combine both systems simultaneously:
# LangFuse
LANGFUSE_PUBLIC_KEY=pk-lf-...
LANGFUSE_SECRET_KEY=sk-lf-...
LANGFUSE_HOST=https://cloud.langfuse.com

# OTEL (sent to a separate backend)
COGNEE_TRACING_ENABLED=true
OTEL_EXPORTER_OTLP_ENDPOINT=https://your-otel-backend:4317
With both active:
  • LangFuse captures LLM generation details (prompts, completions, costs, scores).
  • OTEL captures end-to-end latency across your entire application with standard span semantics.