LangFuse Integration

How to connect LangFuse to Brief for LLM observability insights. Give Brief visibility into your AI product's traces, prompts, and performance metrics.

Last updated: March 17, 2026

Brief + LangFuse gives your team visibility into how your AI product is performing. If you're building with LLMs, LangFuse tracks your traces, prompts, and costs. Connect it to Brief so your product decisions are informed by real usage data.

Without LangFuse               | With LangFuse Connected
LLM performance data siloed    | AI sees your traces and metrics
Prompt effectiveness unknown   | AI knows which prompts work
Cost and latency invisible     | Decisions informed by real data

What data does Brief access from LangFuse?

  • Traces — End-to-end request traces through your LLM application
  • Observations — Individual LLM calls, spans, and events within traces
  • Metrics — Latency, token usage, costs, and custom scores
  • Prompts — Your prompt templates and their performance
Brief only requests read access to your LangFuse project. Brief cannot modify traces, prompts, or any other data in your LangFuse account.
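As a rough sketch of what read-only access looks like under the hood: the LangFuse public API authenticates with HTTP Basic auth, using the Public Key as the username and the Secret Key as the password. The keys and host below are placeholders, and the endpoint paths reflect LangFuse's public API as we understand it — treat this as illustration, not Brief's actual implementation.

```python
import base64

# Placeholder credentials -- real keys come from your LangFuse
# project settings (Settings -> API Keys).
PUBLIC_KEY = "pk-lf-example"
SECRET_KEY = "sk-lf-example"
HOST = "https://cloud.langfuse.com"  # EU cloud default

# Basic auth: public key as username, secret key as password.
token = base64.b64encode(f"{PUBLIC_KEY}:{SECRET_KEY}".encode()).decode()
headers = {"Authorization": f"Basic {token}"}

# Read-only endpoints an integration like Brief can query.
endpoints = {
    "traces": f"{HOST}/api/public/traces",
    "observations": f"{HOST}/api/public/observations",
}
```

These are GET endpoints, so nothing in this flow can modify traces, prompts, or any other data in your LangFuse account.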

Why should I connect LangFuse to Brief?

Understand Your AI Product's Performance

When making product decisions about your AI features, Brief can reference actual performance data — which prompts work best, where latency spikes, and what's costing the most.

Debug Issues Faster

Brief can search your traces to help you understand failures, edge cases, and unexpected behavior in your LLM application.

Optimize Prompts and Costs

With visibility into token usage and costs, Brief can help you make informed decisions about prompt optimization and model selection.

How do I set up the LangFuse integration?

  1. Go to Integrations in Brief
  2. Find LangFuse and click Connect
  3. Get your API keys from your LangFuse project settings (Settings → API Keys)
  4. Enter your Secret Key (starts with sk-lf-...)
  5. Enter your Public Key (starts with pk-lf-...)
  6. Optionally enter your Host if using a self-hosted or regional instance
  7. Click Save
Setup time: 2-3 minutes
What you'll need: Secret Key and Public Key from LangFuse
Host: Use https://us.cloud.langfuse.com for US region, or leave blank for EU (default)
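The validation Brief performs on save can be sketched roughly as follows. The helper name is hypothetical; the key prefixes and the EU-default host behavior match the steps above.

```python
def resolve_langfuse_config(secret_key: str, public_key: str, host: str = "") -> dict:
    """Hypothetical helper mirroring the setup steps: check key
    formats and fall back to the EU cloud host when Host is blank."""
    if not secret_key.startswith("sk-lf-"):
        raise ValueError("Secret Key should start with sk-lf-")
    if not public_key.startswith("pk-lf-"):
        raise ValueError("Public Key should start with pk-lf-")
    # A blank Host means the EU cloud default.
    resolved_host = host.strip() or "https://cloud.langfuse.com"
    return {
        "secret_key": secret_key,
        "public_key": public_key,
        "host": resolved_host.rstrip("/"),
    }

# US-region example: pass the regional host explicitly.
config = resolve_langfuse_config(
    "sk-lf-example", "pk-lf-example", "https://us.cloud.langfuse.com"
)
```

Note the explicit prefix checks: entering the two keys in swapped fields is the most common setup mistake, and it fails fast here.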

What can I ask Brief with LangFuse connected?

Once connected, try asking Brief:

  • "Show me recent traces with errors"
  • "What's our average latency for the chat endpoint?"
  • "Which prompts have the highest token usage?"
  • "Find traces where users complained about slow responses"
  • "What's our LLM cost trend this week?"

Common Issues

Why does the connection fail?

Double-check that you've entered both the Secret Key and Public Key correctly. They're different keys — the Secret Key starts with sk-lf- and the Public Key starts with pk-lf-.

Why can't Brief see my traces?

Make sure the API keys you provided have access to the project you want to query. LangFuse supports multiple projects — ensure you're using keys from the right one.

What if I'm using self-hosted LangFuse?

Enter your self-hosted instance URL in the Host field (e.g., https://langfuse.yourcompany.com). Make sure your instance is accessible from Brief's servers.
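Before entering the Host in Brief, you can sanity-check reachability against LangFuse's public health endpoint. The hostname below is a placeholder; the endpoint path is our understanding of LangFuse's API and worth confirming against your instance.

```python
from urllib.parse import urljoin

# Placeholder self-hosted base URL -- use the value you'd put
# in Brief's Host field.
host = "https://langfuse.yourcompany.com"

# LangFuse exposes a public health endpoint; requesting it from
# outside your network confirms the instance is reachable.
health_url = urljoin(host + "/", "api/public/health")
# Check it with e.g.: curl <health_url>
```

If that URL only resolves inside your VPN or private network, Brief's servers will not be able to reach it either, which explains most self-hosted connection failures.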

What's Next?

Now that LangFuse is connected: