Langfuse.OpenTelemetry 1.3.0
Langfuse .NET SDK (Unofficial) 🪢
Unofficial .NET SDK for Langfuse - the open-source LLM engineering platform.
Packages
| Package | Description | Install |
|---|---|---|
| Langfuse.OpenTelemetry | Export OTEL traces to Langfuse | `dotnet add package Langfuse.OpenTelemetry` |
| Langfuse.Client | Prompt management, user feedback, datasets | `dotnet add package Langfuse.Client` |
| Langfuse.Core | Shared config & types (auto-installed) | `dotnet add package Langfuse.Core` |
Langfuse.OpenTelemetry
Export .NET OpenTelemetry traces to Langfuse. Works with any OTEL-instrumented library including Semantic Kernel.
Quick Start
1. Install
```bash
dotnet add package Langfuse.OpenTelemetry
```
2. Set environment variables
```bash
LANGFUSE_PUBLIC_KEY=pk-lf-...
LANGFUSE_SECRET_KEY=sk-lf-...
LANGFUSE_BASE_URL=https://cloud.langfuse.com      # EU region (default)
# LANGFUSE_BASE_URL=https://us.cloud.langfuse.com # US region
```
3. Add to your app
```csharp
using Langfuse.OpenTelemetry;
using Microsoft.SemanticKernel;
using OpenTelemetry;
using OpenTelemetry.Trace;

// Enable GenAI diagnostics (prompts, tokens, completions)
AppContext.SetSwitch("Microsoft.SemanticKernel.Experimental.GenAI.EnableOTelDiagnosticsSensitive", true);

// Set up OpenTelemetry with the Langfuse exporter
using var tracerProvider = Sdk.CreateTracerProviderBuilder()
    .AddSource("Microsoft.SemanticKernel*")
    .AddLangfuseExporter()
    .Build();

// Use Semantic Kernel as normal
var kernel = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion("gpt-4o-mini", apiKey)
    .Build();

var result = await kernel.InvokePromptAsync("Hello!");
```
Configuration Options
```csharp
// Option 1: Environment variables (recommended)
.AddLangfuseExporter()

// Option 2: Manual configuration
.AddLangfuseExporter(options =>
{
    options.PublicKey = "pk-lf-...";
    options.SecretKey = "sk-lf-...";
    options.BaseUrl = "https://cloud.langfuse.com";
})

// Option 3: From IConfiguration (appsettings.json)
.AddLangfuseExporter(configuration)
```
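For Option 3, something like the following `appsettings.json` fragment would be bound to the exporter options. The section name and key names here are an assumption about how the binder maps configuration to `PublicKey`, `SecretKey`, and `BaseUrl`; check the package documentation for the exact names.

```json
{
  "Langfuse": {
    "PublicKey": "pk-lf-...",
    "SecretKey": "sk-lf-...",
    "BaseUrl": "https://cloud.langfuse.com"
  }
}
```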
Langfuse.Client
Access Langfuse features like Prompt Management, User Feedback, and Datasets directly from .NET.
Quick Start
1. Install
```bash
dotnet add package Langfuse.Client
```
2. Set environment variables
```bash
LANGFUSE_PUBLIC_KEY=pk-lf-...
LANGFUSE_SECRET_KEY=sk-lf-...
LANGFUSE_BASE_URL=https://cloud.langfuse.com
```
3. Use the client
```csharp
using Langfuse.Client;

var client = new LangfuseClient();

// Fetch a text prompt (cached for 60s by default)
var prompt = await client.GetPromptAsync("movie-critic");

// Compile with variables
var compiled = prompt.Compile(new Dictionary<string, string>
{
    ["criticlevel"] = "expert",
    ["movie"] = "Dune 2"
});
// -> "As an expert movie critic, do you like Dune 2?"

// Fetch a chat prompt
var chatPrompt = await client.GetChatPromptAsync("movie-critic-chat");
var messages = chatPrompt.Compile(("criticlevel", "expert"), ("movie", "Dune 2"));
// -> [{ role: "system", content: "..." }, { role: "user", content: "..." }]
```
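Conceptually, `Compile` is a `{{variable}}` template substitution. A minimal illustrative sketch of that behavior, not the SDK's actual implementation:

```csharp
using System.Collections.Generic;
using System.Text.RegularExpressions;

static string CompileTemplate(string template, IReadOnlyDictionary<string, string> variables)
{
    // Replace each {{name}} placeholder with its value; leave unknown placeholders intact
    return Regex.Replace(template, @"\{\{\s*(\w+)\s*\}\}",
        m => variables.TryGetValue(m.Groups[1].Value, out var v) ? v : m.Value);
}

// CompileTemplate("As an {{criticlevel}} movie critic, do you like {{movie}}?",
//     new Dictionary<string, string> { ["criticlevel"] = "expert", ["movie"] = "Dune 2" })
// -> "As an expert movie critic, do you like Dune 2?"
```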
Features
- Text & Chat prompts - Full support for both prompt types
- Variable compilation - `{{variable}}` syntax support
- Version/Label selection - Fetch specific versions or labels (production, staging)
- Client-side caching - 60s TTL by default, configurable
- Fallback prompts - Graceful degradation when API fails
- Config access - Access prompt config (model, temperature, etc.)
- User feedback - Create scores and assign them to traces
- Dataset management - Create and manage evaluation datasets
```csharp
// Get specific version
var v1 = await client.GetPromptAsync("my-prompt", version: 1);

// Get by label
var staging = await client.GetPromptAsync("my-prompt", label: "staging");

// With fallback
var fallback = TextPrompt.CreateFallback("default", "Fallback prompt text");
var prompt = await client.GetPromptAsync("my-prompt", fallback: fallback);

// Access config
var model = prompt.GetConfigValue<string>("model");
var temperature = prompt.GetConfigValue<double>("temperature", 0.7);

// User feedback / scores
await client.CreateScoreAsync("trace-id", "user-feedback", value: true);
await client.CreateScoreAsync("trace-id", "quality", value: 0.95, comment: "Great!");

// Dataset management
var dataset = await client.CreateDatasetAsync(
    name: "qa-benchmark",
    description: "QA testing dataset"
);

var item = await client.CreateDatasetItemAsync(
    datasetName: "qa-benchmark",
    input: new { question = "What is Langfuse?" },
    expectedOutput: new { answer = "An LLM engineering platform" }
);

var items = await client.GetItemsForDatasetAsync("qa-benchmark");
```
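The pieces above can be combined with the Semantic Kernel setup from the OpenTelemetry quick start: fetch a prompt, compile it, and pass it to the kernel. The sketch below uses only the calls shown in this README; the model-name config key, its default, and `apiKey` are illustrative assumptions.

```csharp
using Langfuse.Client;
using Microsoft.SemanticKernel;

var client = new LangfuseClient();

// Fetch the prompt (e.g. by label) and compile it with runtime variables
var prompt = await client.GetPromptAsync("movie-critic", label: "staging");
var compiled = prompt.Compile(new Dictionary<string, string>
{
    ["criticlevel"] = "expert",
    ["movie"] = "Dune 2"
});

// Read the model from the prompt's stored config, with a fallback default
var model = prompt.GetConfigValue<string>("model", "gpt-4o-mini");

var kernel = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion(model, apiKey)
    .Build();

var result = await kernel.InvokePromptAsync(compiled);
```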
Documentation
- Testing Guide - How to run tests
- Features - Implemented features with Langfuse docs links
- Contributing - How to contribute
Running the Sample
```bash
# Set environment variables
export OPENAI_API_KEY="sk-..."
export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_SECRET_KEY="sk-lf-..."
export LANGFUSE_BASE_URL="https://cloud.langfuse.com"

# Run sample
cd samples/SemanticKernel.Sample
dotnet run
```
Check your Langfuse dashboard to see the traces.
License
This project is licensed under the MIT License - see the LICENSE file for details.
Links
- Langfuse - Open-source LLM engineering platform
- Langfuse Docs - Official documentation
- OpenTelemetry Integration - OTEL docs
- Prompt Management - Prompts docs
- Datasets - Datasets docs
| Product | Compatible and computed target frameworks |
|---|---|
| .NET | net8.0 and net9.0 are compatible. net10.0 and the platform-specific targets (android, browser, ios, maccatalyst, macos, tvos, windows) for net8.0/net9.0/net10.0 were computed. |
Dependencies

net8.0
- Langfuse.Core (>= 1.3.0)
- OpenTelemetry (>= 1.9.0)
- OpenTelemetry.Exporter.OpenTelemetryProtocol (>= 1.9.0)

net9.0
- Langfuse.Core (>= 1.3.0)
- OpenTelemetry (>= 1.9.0)
- OpenTelemetry.Exporter.OpenTelemetryProtocol (>= 1.9.0)