Observability with Langfuse
Langfuse is an open-source observability and analytics platform for LLM-powered applications. It collects traces, generations, and custom metrics so that you can debug, evaluate, and monitor your AI features in production.

Integration inside Cognee
Cognee ships with Langfuse support out of the box. The integration is intentionally lightweight and consists of just a few lines of code: anywhere in the codebase where we want observability, we add the `observe` decorator. The decorator is a thin wrapper around Langfuse and automatically creates a span every time the decorated function is executed. The data is sent to the Langfuse backend specified via environment variables and can be inspected in the Langfuse UI.
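Conceptually, such a decorator is an ordinary Python wrapper. The sketch below is an illustrative stand-in, not the actual Langfuse implementation: it records a span (name plus duration) around every call, where the real decorator would ship that span to the Langfuse backend. The decorated function is hypothetical.

```python
import functools
import time

def observe(name=None):
    """Conceptual stand-in for a Langfuse-style observe decorator:
    records a span (name + duration) around every call."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            span_name = name or func.__name__
            start = time.perf_counter()
            try:
                return func(*args, **kwargs)
            finally:
                elapsed = time.perf_counter() - start
                # A real implementation would send this span to the
                # Langfuse backend; here we just print it.
                print(f"span {span_name!r}: {elapsed:.4f}s")
        return wrapper
    return decorator

@observe()
def add_documents(docs):
    # Hypothetical Cognee-style function, used only for illustration.
    return len(docs)
```

Every call to `add_documents` now emits a span without any change to the function body, which is why a single decorator line is all the instrumentation Cognee needs per function.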
Quick start
- Install cognee (it comes with Langfuse). Langfuse is already declared in `pyproject.toml`, so you can skip this step if you already installed cognee.
- Create a project at Langfuse Cloud and copy the public and secret keys.
- Export the Langfuse environment variables (for example in your `.env` file). `BaseConfig` reads these values on startup, so nothing else is required.
- Start Cognee as usual and open the Langfuse UI – traces will appear in real time.
- Run your regular Cognee workflows.
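The environment variables referenced above are the standard Langfuse SDK settings. Assuming Langfuse Cloud as the backend (the exact names Cognee's `BaseConfig` expects should be verified against its source), a `.env` file might look like:

```shell
# Keys copied from your Langfuse project settings
LANGFUSE_PUBLIC_KEY=pk-lf-your-public-key
LANGFUSE_SECRET_KEY=sk-lf-your-secret-key
# Langfuse backend URL (Cloud region or your self-hosted instance)
LANGFUSE_HOST=https://cloud.langfuse.com
```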
Adding your own spans
You can instrument any function in your codebase with the same decorator. All arguments of `langfuse.decorators.observe` (`name`, `as_type`, `metadata`, …) are forwarded unchanged – see the Langfuse Python SDK docs for the full reference.
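To make the argument forwarding concrete, here is a conceptual sketch in plain Python (not the Langfuse SDK itself) of a decorator that attaches `name`, `as_type`, and extra metadata keywords to each recorded span; the function name and metadata keys are hypothetical:

```python
import functools

def observe(name=None, as_type=None, **metadata):
    """Illustrative stand-in: attaches the forwarded keyword arguments
    to an in-memory span instead of sending them to a backend."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # One span dict per call, carrying the forwarded arguments.
            wrapper.spans.append({
                "name": name or func.__name__,
                "as_type": as_type,
                "metadata": metadata,
            })
            return func(*args, **kwargs)
        wrapper.spans = []
        return wrapper
    return decorator

# Hypothetical instrumented function, used only for illustration.
@observe(name="summarize-chunk", as_type="generation", model="gpt-4o-mini")
def summarize(text):
    return text[:10]
```

After calling `summarize("...")`, `summarize.spans[0]` holds the span with the forwarded name, type, and metadata – the same shape of information the real decorator sends to the Langfuse UI.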
Join the Langfuse and cognee communities on Discord if you have any questions.