tessl/npm-langsmith

tessl install tessl/npm-langsmith@0.4.3

TypeScript client SDK for the LangSmith LLM tracing, evaluation, and monitoring platform.

Integrations

Framework integrations and SDK wrappers for automatic tracing with popular AI libraries and testing frameworks.

Overview

LangSmith provides native integrations with popular AI SDKs and testing frameworks, enabling automatic tracing with minimal code changes.

Quick Navigation

SDK Wrappers

  • Generic Wrapper - Wrap any SDK
  • OpenAI - OpenAI SDK wrapper
  • Anthropic - Anthropic SDK wrapper
  • Vercel AI SDK - Vercel AI SDK integration

Framework Integrations

  • LangChain - LangChain callbacks and wrappers
  • Jest - Test-driven evaluation with Jest
  • Vitest - Test-driven evaluation with Vitest

SDK Wrappers

Wrap AI SDK clients to automatically trace all API calls with full input/output capture, token usage, and error tracking.

Generic SDK Wrapper

📖 Full Documentation

Wrap any SDK or object for automatic tracing.

import { wrapSDK } from "langsmith/wrappers";

// mySDK is a placeholder for any SDK client or object whose methods you want traced
const wrappedSDK = wrapSDK(mySDK, {
  name: "custom-sdk",
  project_name: "my-project",
  tags: ["production"],
});

// All method calls automatically traced
await wrappedSDK.processData("input");

Use Cases:

  • Custom or internal SDKs
  • APIs without specialized wrappers
  • Any object with methods to trace

Features:

  • ✓ Automatic method interception
  • ✓ Async/sync support
  • ✓ Type safety preserved
  • ✓ Error capture
  • ✓ Return value logging

Related: Wrappers Overview

OpenAI SDK Wrapper

📖 Full Documentation

Specialized wrapper for the OpenAI SDK with proper handling of chat completions, streaming, embeddings, and function calling.

import { wrapOpenAI } from "langsmith/wrappers/openai";
import OpenAI from "openai";

const openai = wrapOpenAI(new OpenAI(), {
  project_name: "openai-project",
});

// Automatically traced
const response = await openai.chat.completions.create({
  model: "gpt-4",
  messages: [{ role: "user", content: "Hello!" }],
});

Automatically Captures:

  • ✓ Model name and parameters
  • ✓ Messages and prompts
  • ✓ Completions and finish reasons
  • ✓ Token usage (input, output, total)
  • ✓ Streaming chunks (see sketch below)
  • ✓ Function/tool calls
  • ✓ API errors and rate limits
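
Because streaming chunks are captured automatically, streamed calls need no extra setup. A minimal sketch reusing the wrapped client from above:

// Streamed completions are traced too; chunks are aggregated into the run output
const stream = await openai.chat.completions.create({
  model: "gpt-4",
  messages: [{ role: "user", content: "Hello!" }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}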

Installation:

npm install langsmith openai

Related: OpenAI Documentation · Best Practices

Anthropic SDK Wrapper

📖 Full Documentation

Specialized wrapper for the Anthropic SDK with proper handling of message completions, streaming, and tool use.

import { wrapAnthropic } from "langsmith/wrappers/anthropic";
import Anthropic from "@anthropic-ai/sdk";

const anthropic = wrapAnthropic(new Anthropic(), {
  project_name: "anthropic-project",
});

// Automatically traced
const message = await anthropic.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello!" }],
});

Automatically Captures:

  • ✓ Model name and parameters
  • ✓ Messages and system prompts
  • ✓ Response content
  • ✓ Token usage
  • ✓ Streaming events (see sketch below)
  • ✓ Tool use
  • ✓ Stop reasons
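
Since streaming events are captured automatically, a wrapped client can stream as usual. A minimal sketch reusing the wrapped client from above:

// Streaming events are traced and aggregated into the run output
const stream = await anthropic.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello!" }],
  stream: true,
});

for await (const event of stream) {
  if (event.type === "content_block_delta" && event.delta.type === "text_delta") {
    process.stdout.write(event.delta.text);
  }
}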

Installation:

npm install langsmith @anthropic-ai/sdk

Related: Anthropic Documentation

Vercel AI SDK Integration

📖 Full Documentation

Native integration with the Vercel AI SDK for automatic tracing of generateText, streamText, generateObject, and streamObject.

import { wrapAISDK } from "langsmith/experimental/vercel";
import { wrapLanguageModel, generateText } from "ai";
import { openai } from "@ai-sdk/openai";

// Wrap AI SDK (wrapLanguageModel is REQUIRED)
const wrappedAI = wrapAISDK({ wrapLanguageModel, generateText }, {
  project_name: "vercel-project",
});

// Automatically traced
const { text } = await wrappedAI.generateText({
  model: openai("gpt-4"),
  prompt: "What is LangSmith?",
});

Automatically Captures:

  • ✓ Model configuration
  • ✓ Prompts and messages
  • ✓ Generated text/objects
  • ✓ Token usage
  • ✓ Streaming chunks
  • ✓ Tool calls and results
  • ✓ Response metadata

Installation:

npm install langsmith ai @ai-sdk/openai

Advanced Features:

  • Runtime configuration override
  • Input/output processing for privacy
  • Response metadata tracing
  • Multi-provider support (see sketch below)
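
As one illustration of multi-provider support, the same wrapped function accepts models from any AI SDK provider. A minimal sketch, assuming the @ai-sdk/anthropic provider package is also installed:

import { wrapAISDK } from "langsmith/experimental/vercel";
import { wrapLanguageModel, generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { anthropic } from "@ai-sdk/anthropic";

const { generateText: tracedGenerateText } = wrapAISDK({ wrapLanguageModel, generateText });

// Same wrapped function, different providers; both calls are traced
const gpt = await tracedGenerateText({ model: openai("gpt-4"), prompt: "Hi" });
const claude = await tracedGenerateText({
  model: anthropic("claude-sonnet-4-20250514"),
  prompt: "Hi",
});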

Related: Vercel Documentation · Runtime Config

Framework Integrations

LangChain Integration

📖 Full Documentation

Seamless integration with the LangChain framework using callbacks and runnable wrappers.

import { getLangchainCallbacks } from "langsmith/langchain";
import { ChatOpenAI } from "@langchain/openai";

const llm = new ChatOpenAI({
  callbacks: await getLangchainCallbacks(),
});

// Automatically traced
const response = await llm.invoke("Hello!");

Features:

  • ✓ Callback handlers for automatic tracing
  • ✓ Runnable wrappers
  • ✓ Chain tracing (see sketch below)
  • ✓ Agent tracing
  • ✓ Tool tracing
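
For chain tracing, the same callbacks can also be passed per invocation, so each step of a composed runnable is recorded as a child run. A minimal sketch:

import { getLangchainCallbacks } from "langsmith/langchain";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { ChatOpenAI } from "@langchain/openai";

// Each step of the chain (prompt -> model) appears as a child run
const chain = ChatPromptTemplate
  .fromTemplate("Translate to French: {text}")
  .pipe(new ChatOpenAI({ model: "gpt-4" }));

const result = await chain.invoke(
  { text: "Hello" },
  { callbacks: await getLangchainCallbacks() }
);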

Related: LangChain Documentation

Jest Integration

📖 Full Documentation

Test-driven evaluation with the Jest test framework.

import * as ls from "langsmith/jest";

// `chatbot` is the application function under test;
// the describe name is used as the dataset name
ls.describe("math-qa", () => {
  ls.test(
    "chatbot responds correctly",
    { inputs: { question: "What is 2+2?" } },
    async ({ inputs }) => {
      const response = await chatbot(inputs.question);
      ls.expect(response).toContain("4");
    }
  );
});

Features:

  • ✓ Each example runs as a separate test case
  • ✓ Automatic dataset integration
  • ✓ Custom evaluators (see sketch below)
  • ✓ Jest matchers
  • ✓ Parallel test execution
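
For custom evaluators, wrapEvaluator traces the evaluator's execution as its own run. A rough sketch: the scorer shape ({ outputs, referenceOutputs } in, { key, score } out) and `chatbot` are illustrative assumptions here.

import * as ls from "langsmith/jest";

// Hypothetical scorer; wrapEvaluator traces its execution as a separate run
const correctness = ls.wrapEvaluator(async ({ outputs, referenceOutputs }) => ({
  key: "correctness",
  score: outputs.answer === referenceOutputs?.answer ? 1 : 0,
}));

ls.describe("math-qa", () => {
  ls.test(
    "scores the chatbot answer",
    { inputs: { question: "What is 2+2?" }, referenceOutputs: { answer: "4" } },
    async ({ inputs, referenceOutputs }) => {
      const outputs = { answer: await chatbot(inputs.question) };
      await correctness({ outputs, referenceOutputs });
    }
  );
});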

Installation:

npm install --save-dev langsmith jest

Related: Jest Documentation · Testing Guide

Vitest Integration

📖 Full Documentation

Test-driven evaluation with the Vitest test framework.

import * as ls from "langsmith/vitest";

// `chatbot` is the application function under test;
// the describe name is used as the dataset name
ls.describe("math-qa", () => {
  ls.test(
    "chatbot responds correctly",
    { inputs: { question: "What is 2+2?" } },
    async ({ inputs }) => {
      const response = await chatbot(inputs.question);
      ls.expect(response).toContain("4");
    }
  );
});

Features:

  • ✓ Each example runs as a separate test case
  • ✓ Automatic dataset integration
  • ✓ Custom evaluators
  • ✓ Vitest matchers
  • ✓ Fast execution with Vite

Installation:

npm install --save-dev langsmith vitest

Related: Vitest Documentation · Testing Guide

Comparison: Which Integration?

For AI SDK Tracing

| SDK | Use | Wrapper | Best For |
| --- | --- | --- | --- |
| OpenAI | Official OpenAI SDK | wrapOpenAI() | Chat completions, embeddings, streaming |
| Anthropic | Official Anthropic SDK | wrapAnthropic() | Claude models, tool use |
| Vercel AI | Vercel AI SDK | wrapAISDK() | Multi-provider, unified API |
| LangChain | LangChain framework | Callbacks | Chains, agents, complex workflows |
| Custom | Any other SDK/API | wrapSDK() | Internal APIs, custom clients |

For Testing

| Framework | Use | Integration | Best For |
| --- | --- | --- | --- |
| Jest | Popular test framework | langsmith/jest | React, Node.js projects |
| Vitest | Modern test framework | langsmith/vitest | Vite, fast execution |
| Manual | Direct Client API | evaluate() | Custom test harnesses |

Common Patterns

Multiple Providers

import { wrapOpenAI } from "langsmith/wrappers/openai";
import { wrapAnthropic } from "langsmith/wrappers/anthropic";
import OpenAI from "openai";
import Anthropic from "@anthropic-ai/sdk";

// Wrap each provider's client; both report to the same project
const openai = wrapOpenAI(new OpenAI(), {
  project_name: "multi-provider",
  tags: ["openai"],
});

const anthropic = wrapAnthropic(new Anthropic(), {
  project_name: "multi-provider",
  tags: ["anthropic"],
});

// Use interchangeably
const gptResponse = await openai.chat.completions.create({...});
const claudeResponse = await anthropic.messages.create({...});

Conditional Wrapping

import { wrapOpenAI } from "langsmith/wrappers/openai";
import OpenAI from "openai";

// Only wrap in production
const openai = process.env.NODE_ENV === "production"
  ? wrapOpenAI(new OpenAI(), { project_name: "prod" })
  : new OpenAI();

Nested Tracing

import { traceable } from "langsmith/traceable";
import { wrapOpenAI } from "langsmith/wrappers/openai";
import OpenAI from "openai";

const openai = wrapOpenAI(new OpenAI());

// Wrap business logic
const processQuery = traceable(
  async (query: string) => {
    // OpenAI call traced as child
    const response = await openai.chat.completions.create({
      model: "gpt-4",
      messages: [{ role: "user", content: query }],
    });

    return response.choices[0].message.content;
  },
  { name: "process-query", run_type: "chain" }
);

// Creates nested trace: process-query > openai.chat.completions.create

Best Practices

SDK Wrappers

Do:

  • Wrap SDK clients once at initialization (see sketch below)
  • Use descriptive project names
  • Add meaningful metadata and tags
  • Reuse wrapped clients

Don't:

  • Create new wrappers on every call
  • Wrap the same client multiple times
  • Use generic names like "api" or "wrapper"
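
A minimal sketch of the wrap-once pattern (the module path and names are illustrative):

// lib/openai.ts — create and wrap the client once, then import it everywhere
import { wrapOpenAI } from "langsmith/wrappers/openai";
import OpenAI from "openai";

export const openai = wrapOpenAI(new OpenAI(), {
  project_name: "my-app",
  tags: ["checkout-service"],
});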

Testing Integrations

Do:

  • Create versioned datasets (see sketch below)
  • Use descriptive test names
  • Run tests in CI/CD
  • Store evaluation results

Don't:

  • Mix unit tests with evaluations
  • Hard-code dataset names
  • Skip test setup/teardown
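
One way to keep dataset names versioned and out of the code, sketched with a hypothetical EVAL_DATASET environment variable:

import * as ls from "langsmith/jest";

// Versioned dataset name, overridable in CI rather than hard-coded
const datasetName = process.env.EVAL_DATASET ?? "math-qa-v1";

ls.describe(datasetName, () => {
  // ... ls.test cases as shown in the Jest section above
});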

Configuration

Do:

  • Use environment variables for API keys
  • Set a sampling rate for high-volume production
  • Configure privacy settings (hideInputs/hideOutputs)
  • Use different projects for dev/staging/prod (see sketch below)

Don't:

  • Hard-code API keys
  • Trace 100% in high-volume production
  • Log sensitive data without redaction
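
A minimal sketch combining several of these points, assuming the API key is supplied via the LANGSMITH_API_KEY environment variable:

import { Client } from "langsmith";
import { wrapOpenAI } from "langsmith/wrappers/openai";
import OpenAI from "openai";

// The LangSmith client reads LANGSMITH_API_KEY from the environment;
// hideInputs/hideOutputs redact payloads before they are uploaded
const client = new Client({ hideInputs: true, hideOutputs: true });

const openai = wrapOpenAI(new OpenAI(), {
  client,
  project_name: process.env.NODE_ENV === "production" ? "app-prod" : "app-dev",
});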

Related Documentation

Core Features

Advanced

API Reference