tessl install tessl/npm-langsmith@0.4.3

TypeScript client SDK for the LangSmith LLM tracing, evaluation, and monitoring platform.
Framework integrations and SDK wrappers for automatic tracing with popular AI libraries and testing frameworks.
LangSmith provides native integrations with popular AI SDKs and testing frameworks, enabling automatic tracing with minimal code changes.
Wrap AI SDK clients to automatically trace all API calls with full input/output capture, token usage, and error tracking.
Wrap any SDK or object for automatic tracing.
import { wrapSDK } from "langsmith/wrappers";
const wrappedSDK = wrapSDK(mySDK, {
  name: "custom-sdk",
  project_name: "my-project",
  tags: ["production"],
});
// All method calls automatically traced
await wrappedSDK.processData("input");
Related: Wrappers Overview
Specialized wrapper for OpenAI SDK with proper handling of chat completions, streaming, embeddings, and function calling.
import { wrapOpenAI } from "langsmith/wrappers/openai";
import OpenAI from "openai";
const openai = wrapOpenAI(new OpenAI(), {
  project_name: "openai-project",
});
// Automatically traced
const response = await openai.chat.completions.create({
  model: "gpt-4",
  messages: [{ role: "user", content: "Hello!" }],
});

Automatically Captures:
- Request and response messages
- Model and request parameters
- Token usage
- Streamed output and function calls
- Errors

Installation:
npm install langsmith openai

Related: OpenAI Documentation • Best Practices
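Streaming goes through the same wrapper; a minimal sketch of a streamed chat completion, consumed as usual by the caller:

import { wrapOpenAI } from "langsmith/wrappers/openai";
import OpenAI from "openai";

const openai = wrapOpenAI(new OpenAI());

// The streamed call is traced like a regular completion
const stream = await openai.chat.completions.create({
  model: "gpt-4",
  messages: [{ role: "user", content: "Hello!" }],
  stream: true,
});
for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}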
Specialized wrapper for Anthropic SDK with proper handling of message completions, streaming, and tool use.
import { wrapAnthropic } from "langsmith/wrappers/anthropic";
import Anthropic from "@anthropic-ai/sdk";
const anthropic = wrapAnthropic(new Anthropic(), {
  project_name: "anthropic-project",
});
// Automatically traced
const message = await anthropic.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello!" }],
});

Automatically Captures:
- Request and response messages
- Model and request parameters
- Token usage
- Streamed output and tool use
- Errors

Installation:
npm install langsmith @anthropic-ai/sdk

Related: Anthropic Documentation
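Streamed message completions work the same way through the wrapper; a minimal sketch using the SDK's event stream:

import { wrapAnthropic } from "langsmith/wrappers/anthropic";
import Anthropic from "@anthropic-ai/sdk";

const anthropic = wrapAnthropic(new Anthropic());

// Streamed events are consumed as usual; the call is still traced
const stream = await anthropic.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello!" }],
  stream: true,
});
for await (const event of stream) {
  if (event.type === "content_block_delta" && event.delta.type === "text_delta") {
    process.stdout.write(event.delta.text);
  }
}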
Native integration with Vercel AI SDK for automatic tracing of generateText, streamText, generateObject, and streamObject.
import { wrapAISDK } from "langsmith/experimental/vercel";
import { wrapLanguageModel, generateText } from "ai";
import { openai } from "@ai-sdk/openai";
// Wrap the AI SDK helpers (passing wrapLanguageModel is required)
const wrappedAI = wrapAISDK({ wrapLanguageModel, generateText }, {
  project_name: "vercel-project",
});
// Automatically traced
const { text } = await wrappedAI.generateText({
  model: openai("gpt-4"),
  prompt: "What is LangSmith?",
});

Automatically Captures:
- Prompts and generated text or objects
- Model and request parameters
- Token usage
- Streamed output from streamText and streamObject
- Errors

Installation:
npm install langsmith ai @ai-sdk/openai

Related: Vercel Documentation • Runtime Config
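streamText and streamObject are wrapped the same way. A minimal sketch, assuming wrapAISDK also accepts the whole ai module so all four functions are wrapped at once:

import { wrapAISDK } from "langsmith/experimental/vercel";
import * as ai from "ai";
import { openai } from "@ai-sdk/openai";

const { streamText } = wrapAISDK(ai);

// Streamed generation, traced as a single run
const result = streamText({
  model: openai("gpt-4"),
  prompt: "What is LangSmith?",
});
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}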
Seamless integration with LangChain framework using callbacks and runnable wrappers.
import { getLangchainCallbacks } from "langsmith/langchain";
import { ChatOpenAI } from "@langchain/openai";
const llm = new ChatOpenAI({
  callbacks: await getLangchainCallbacks(),
});
// Automatically traced
const response = await llm.invoke("Hello!");

Related: LangChain Documentation
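Callbacks can also be passed per invocation rather than at construction, which helps when one model instance serves both traced and untraced code paths; a minimal sketch:

import { getLangchainCallbacks } from "langsmith/langchain";
import { ChatOpenAI } from "@langchain/openai";

const llm = new ChatOpenAI();

// Attach LangSmith callbacks to this call only
const callbacks = await getLangchainCallbacks();
const response = await llm.invoke("Hello!", { callbacks });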
Test-driven evaluation with Jest framework integration.
import * as ls from "langsmith/jest";

// The suite name doubles as the LangSmith dataset name
ls.describe("math-qa", () => {
  ls.test(
    "chatbot responds correctly",
    { inputs: { question: "What is 2+2?" } },
    async ({ inputs }) => {
      const response = await chatbot(inputs.question);
      ls.expect(response).toContain("4");
    }
  );
});

Installation:
npm install --save-dev langsmith jest

Related: Jest Documentation • Testing Guide
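The wrapEvaluator helper traces a scoring function as an evaluator run attached to the test. A rough sketch, assuming an evaluator that returns a { key, score } object and the hypothetical chatbot from above:

import * as ls from "langsmith/jest";

ls.describe("math-qa", () => {
  ls.test(
    "scores the answer",
    { inputs: { question: "What is 2+2?" } },
    async ({ inputs }) => {
      const response = await chatbot(inputs.question);
      // Hypothetical scorer, traced as an evaluator run
      const correctness = ls.wrapEvaluator(async ({ outputs }: { outputs: string }) => ({
        key: "correctness",
        score: outputs.includes("4") ? 1 : 0,
      }));
      await correctness({ outputs: response });
    }
  );
});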
Test-driven evaluation with Vitest framework integration.
import * as ls from "langsmith/vitest";

// The suite name doubles as the LangSmith dataset name
ls.describe("math-qa", () => {
  ls.test(
    "chatbot responds correctly",
    { inputs: { question: "What is 2+2?" } },
    async ({ inputs }) => {
      const response = await chatbot(inputs.question);
      ls.expect(response).toContain("4");
    }
  );
});

Installation:
npm install --save-dev langsmith vitest

Related: Vitest Documentation • Testing Guide
| SDK | Use | Wrapper | Best For |
|---|---|---|---|
| OpenAI | Official OpenAI SDK | wrapOpenAI() | Chat completions, embeddings, streaming |
| Anthropic | Official Anthropic SDK | wrapAnthropic() | Claude models, tool use |
| Vercel AI | Vercel AI SDK | wrapAISDK() | Multi-provider, unified API |
| LangChain | LangChain framework | Callbacks | Chains, agents, complex workflows |
| Custom | Any other SDK/API | wrapSDK() | Internal APIs, custom clients |
| Framework | Use | Integration | Best For |
|---|---|---|---|
| Jest | Popular test framework | langsmith/jest | React, Node.js projects |
| Vitest | Modern test framework | langsmith/vitest | Vite, fast execution |
| Manual | Direct Client API | evaluate() | Custom test harnesses |
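The manual route calls evaluate from langsmith/evaluation directly. A minimal sketch, assuming a math-qa dataset and the hypothetical chatbot from the examples above:

import { evaluate } from "langsmith/evaluation";

// Run the app over every dataset example and score each result
await evaluate(
  async (inputs) => ({ answer: await chatbot(inputs.question) }),
  {
    data: "math-qa",
    evaluators: [
      async ({ outputs, referenceOutputs }) => ({
        key: "correctness",
        score: outputs.answer === referenceOutputs?.answer ? 1 : 0,
      }),
    ],
    experimentPrefix: "manual-harness",
  }
);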
import { wrapOpenAI } from "langsmith/wrappers/openai";
import { wrapAnthropic } from "langsmith/wrappers/anthropic";
import OpenAI from "openai";
import Anthropic from "@anthropic-ai/sdk";
// Wrap different providers
const openai = wrapOpenAI(new OpenAI(), {
  project_name: "multi-provider",
  tags: ["openai"],
});
const anthropic = wrapAnthropic(new Anthropic(), {
  project_name: "multi-provider",
  tags: ["anthropic"],
});
// Use interchangeably
const gptResponse = await openai.chat.completions.create({...});
const claudeResponse = await anthropic.messages.create({...});

import { wrapOpenAI } from "langsmith/wrappers/openai";
import OpenAI from "openai";
// Only wrap in production
const openai = process.env.NODE_ENV === "production"
  ? wrapOpenAI(new OpenAI(), { project_name: "prod" })
  : new OpenAI();

import { traceable } from "langsmith/traceable";
import { wrapOpenAI } from "langsmith/wrappers/openai";
const openai = wrapOpenAI(new OpenAI());
// Wrap business logic
const processQuery = traceable(
  async (query: string) => {
    // OpenAI call traced as a child of the process-query run
    const response = await openai.chat.completions.create({
      model: "gpt-4",
      messages: [{ role: "user", content: query }],
    });
    return response.choices[0].message.content;
  },
  { name: "process-query", run_type: "chain" }
);
// Creates nested trace: process-query > openai.chat.completions.create