Langfuse

Langfuse is an open-source LLM observability and analytics platform. This integration records Synaptic run events (LLM calls, tool invocations, chain steps) as Langfuse traces for debugging, cost monitoring, and quality evaluation.

Setup

Enable the langfuse feature in your Cargo.toml:

[dependencies]
synaptic = { version = "0.4", features = ["langfuse"] }

Sign up at cloud.langfuse.com to obtain an API key pair (public and secret key), or self-host your own Langfuse instance.

Configuration

use synaptic::langfuse::{LangfuseCallback, LangfuseConfig};

let config = LangfuseConfig::new("pk-lf-...", "sk-lf-...");
let callback = LangfuseCallback::new(config).await.unwrap();

Self-Hosted Instance

let config = LangfuseConfig::new("pk-lf-...", "sk-lf-...")
    .with_host("https://langfuse.your-company.com")
    .with_flush_batch_size(50);

Usage

use synaptic::langfuse::{LangfuseCallback, LangfuseConfig};
use std::sync::Arc;

let config = LangfuseConfig::new("pk-lf-...", "sk-lf-...");
let callback = Arc::new(LangfuseCallback::new(config).await.unwrap());
// Events are buffered and auto-flushed when batch_size is reached.
// At application shutdown, flush remaining events:
callback.flush().await.unwrap();

Configuration Reference

Field              Default                       Description
public_key         required                      Langfuse public key
secret_key         required                      Langfuse secret key
host               https://cloud.langfuse.com    Langfuse host URL
flush_batch_size   20                            Events buffered before auto-flush