Install and configure Langfuse SDK authentication for LLM observability. Use when setting up a new Langfuse integration, configuring API keys, or initializing Langfuse tracing in your project. Trigger with phrases like "install langfuse", "setup langfuse", "langfuse auth", "configure langfuse API key", "langfuse tracing setup".
Install the Langfuse SDK and configure authentication for LLM observability. Covers both the legacy langfuse package (v3) and the modern modular SDK (v4+/v5) built on OpenTelemetry.
Obtain your Public Key (`pk-lf-...`) and Secret Key (`sk-lf-...`) from your project settings.

**TypeScript/JavaScript (v4+ modular SDK -- recommended):**

```bash
set -euo pipefail

# Core client for prompt management, datasets, scores
npm install @langfuse/client

# Tracing (observe, startActiveObservation)
npm install @langfuse/tracing @langfuse/otel @opentelemetry/sdk-node

# OpenAI integration (drop-in wrapper)
npm install @langfuse/openai

# LangChain integration
npm install @langfuse/langchain
```

**TypeScript/JavaScript (v3 legacy -- single package):**

```bash
npm install langfuse
```

**Python:**

```bash
pip install langfuse
```

You need three values:

- Public Key: `pk-lf-...` (identifies your project)
- Secret Key: `sk-lf-...` (grants write access -- keep secret)
- Base URL: your Langfuse host (e.g. `https://cloud.langfuse.com`)

```bash
# Set environment variables
export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_SECRET_KEY="sk-lf-..."
export LANGFUSE_BASE_URL="https://cloud.langfuse.com"

# Or create a .env file
cat >> .env << 'EOF'
LANGFUSE_PUBLIC_KEY=pk-lf-your-public-key
LANGFUSE_SECRET_KEY=sk-lf-your-secret-key
LANGFUSE_BASE_URL=https://cloud.langfuse.com
EOF
```

**Note:** v4+ uses `LANGFUSE_BASE_URL`. Legacy v3 uses `LANGFUSE_HOST` or `LANGFUSE_BASEURL`.
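A common failure mode is initializing the client before the environment variables are loaded, which produces a "Missing required configuration" error. Below is a minimal fail-fast sketch; the `getLangfuseEnv` helper name is our own, not part of the SDK:

```typescript
// Hypothetical helper: validate Langfuse env vars before initializing the client.
// Variable names match the v4+ convention described above.
function getLangfuseEnv(env: Record<string, string | undefined>) {
  const required = ["LANGFUSE_PUBLIC_KEY", "LANGFUSE_SECRET_KEY"];
  const missing = required.filter((key) => !env[key]);
  if (missing.length > 0) {
    throw new Error(`Missing required Langfuse configuration: ${missing.join(", ")}`);
  }
  return {
    publicKey: env.LANGFUSE_PUBLIC_KEY as string,
    secretKey: env.LANGFUSE_SECRET_KEY as string,
    // v4+ reads LANGFUSE_BASE_URL; fall back to Langfuse Cloud when unset.
    baseUrl: env.LANGFUSE_BASE_URL ?? "https://cloud.langfuse.com",
  };
}
```

Calling this with `process.env` at startup surfaces a clear error immediately instead of a confusing failure on the first traced call.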
**Initialize and verify (v4+ modular SDK):**

```typescript
// src/lib/langfuse.ts
import { LangfuseClient } from "@langfuse/client";
import { startActiveObservation } from "@langfuse/tracing";
import { LangfuseSpanProcessor } from "@langfuse/otel";
import { NodeSDK } from "@opentelemetry/sdk-node";

// 1. Register the OpenTelemetry span processor (once at app startup)
const sdk = new NodeSDK({
  spanProcessors: [new LangfuseSpanProcessor()],
});
sdk.start();

// 2. Create the Langfuse client for prompt/dataset/score operations
export const langfuse = new LangfuseClient({
  publicKey: process.env.LANGFUSE_PUBLIC_KEY,
  secretKey: process.env.LANGFUSE_SECRET_KEY,
  baseUrl: process.env.LANGFUSE_BASE_URL,
});

// 3. Verify connection
async function verify() {
  await startActiveObservation("connection-test", async (span) => {
    span.update({ input: { test: true } });
    span.update({ output: { status: "connected" } });
  });
  console.log("Langfuse connection verified. Check dashboard for trace.");
}

verify();
```

**Initialize and verify (v3 legacy):**

```typescript
import { Langfuse } from "langfuse";

const langfuse = new Langfuse({
  publicKey: process.env.LANGFUSE_PUBLIC_KEY,
  secretKey: process.env.LANGFUSE_SECRET_KEY,
  baseUrl: process.env.LANGFUSE_HOST,
});

// Verify with a test trace
const trace = langfuse.trace({
  name: "connection-test",
  metadata: { test: true },
});
await langfuse.flushAsync();
console.log("Connected. Trace URL:", trace.getTraceUrl());

// Clean shutdown
process.on("beforeExit", async () => {
  await langfuse.shutdownAsync();
});
```

**Initialize and verify (Python):**

```python
import os

from langfuse import Langfuse

langfuse = Langfuse(
    public_key=os.environ["LANGFUSE_PUBLIC_KEY"],
    secret_key=os.environ["LANGFUSE_SECRET_KEY"],
    host=os.environ.get("LANGFUSE_HOST", "https://cloud.langfuse.com"),
)

# Test trace
trace = langfuse.trace(name="connection-test", metadata={"test": True})
langfuse.flush()
print(f"Connected. Trace: {trace.get_trace_url()}")
```

**v3 vs. v4+ at a glance:**

| Feature | v3 (langfuse) | v4+ (@langfuse/*) |
|---|---|---|
| Package | Single langfuse | Modular: @langfuse/client, @langfuse/tracing, @langfuse/otel |
| Base URL env var | LANGFUSE_HOST | LANGFUSE_BASE_URL |
| Tracing | langfuse.trace() | startActiveObservation() / observe() |
| Client class | Langfuse | LangfuseClient |
| OpenAI wrapper | observeOpenAI() from langfuse | observeOpenAI() from @langfuse/openai |
| Foundation | Custom | OpenTelemetry |
**Troubleshooting:**

| Error | Cause | Solution |
|---|---|---|
| 401 Unauthorized | Invalid or expired API key | Re-check keys in Langfuse dashboard Settings > API Keys |
| ECONNREFUSED | Wrong host URL or server down | Verify LANGFUSE_BASE_URL / LANGFUSE_HOST |
| Missing required configuration | Env vars not loaded | Ensure dotenv/config is imported at the entry point |
| Module not found | Package not installed | Run npm install or pip install again |
| Using pk- key as secret | Keys swapped | Public key starts with pk-lf-, secret key with sk-lf- |
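The swapped-keys row above is easy to guard against programmatically, since the prefixes are fixed. A small sketch (the `checkKeyPrefixes` helper name is our own, not an SDK function):

```typescript
// Hypothetical pre-flight check for the "keys swapped" error:
// public keys start with pk-lf-, secret keys with sk-lf-.
function checkKeyPrefixes(publicKey: string, secretKey: string): string[] {
  const problems: string[] = [];
  if (publicKey.startsWith("sk-lf-") || secretKey.startsWith("pk-lf-")) {
    problems.push("keys appear to be swapped");
  }
  if (!publicKey.startsWith("pk-lf-")) {
    problems.push("publicKey should start with pk-lf-");
  }
  if (!secretKey.startsWith("sk-lf-")) {
    problems.push("secretKey should start with sk-lf-");
  }
  return problems;
}
```

Running this before client construction turns a silent 401 at request time into an actionable message at startup.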
After auth is working, proceed to langfuse-hello-world for your first traced LLM call.