tessl/npm-langsmith

tessl install tessl/npm-langsmith@0.4.3

TypeScript client SDK for the LangSmith LLM tracing, evaluation, and monitoring platform.

docs/integrations/wrappers-anthropic.md

Anthropic SDK Wrapper

Automatic tracing for the Anthropic SDK with proper handling of message completions and streaming.

Package Information

  • Package: langsmith (npm)
  • Language: TypeScript
  • Installation: npm install langsmith @anthropic-ai/sdk
  • Module: langsmith/wrappers/anthropic

Overview

The Anthropic wrapper provides specialized tracing for the official Anthropic SDK. Wrap your Anthropic client once and all subsequent API calls are automatically traced to LangSmith.

Core Import

import { wrapAnthropic } from "langsmith/wrappers/anthropic";
import Anthropic from "@anthropic-ai/sdk";

For CommonJS:

const { wrapAnthropic } = require("langsmith/wrappers/anthropic");
const Anthropic = require("@anthropic-ai/sdk");

Basic Usage

import { wrapAnthropic } from "langsmith/wrappers/anthropic";
import Anthropic from "@anthropic-ai/sdk";

// Wrap Anthropic client
const anthropic = wrapAnthropic(new Anthropic(), {
  project_name: "anthropic-project"
});

// All calls automatically traced
const message = await anthropic.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello!" }]
});

Wrap Function

/**
 * Wrap Anthropic SDK for automatic tracing
 * @param anthropic - Anthropic client instance
 * @param options - Wrapper configuration
 * @returns Wrapped Anthropic client
 */
function wrapAnthropic<T>(anthropic: T, options?: Partial<RunTreeConfig>): T;

interface RunTreeConfig {
  /** Run name */
  name?: string;
  /** Project name */
  project_name?: string;
  /** Metadata */
  metadata?: KVMap;
  /** Tags */
  tags?: string[];
  /** LangSmith client */
  client?: Client;
}

Message Completions

import { wrapAnthropic } from "langsmith/wrappers/anthropic";
import Anthropic from "@anthropic-ai/sdk";

const anthropic = wrapAnthropic(new Anthropic(), {
  project_name: "chat-app"
});

// Standard message
const message = await anthropic.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello!" }]
});

console.log(message.content[0].text);

// With system message
const response = await anthropic.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  system: "You are a coding assistant.",
  messages: [{ role: "user", content: "Explain async/await" }]
});

Streaming

import { wrapAnthropic } from "langsmith/wrappers/anthropic";
import Anthropic from "@anthropic-ai/sdk";

const anthropic = wrapAnthropic(new Anthropic(), {
  project_name: "streaming-chat"
});

// Stream message
const messageStream = anthropic.messages.stream({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Tell me a story" }]
});

// Process stream
for await (const event of messageStream) {
  if (event.type === "content_block_delta" && event.delta.type === "text_delta") {
    process.stdout.write(event.delta.text);
  }
}

// Get final message
const finalMessage = await messageStream.finalMessage();

Advanced Configuration

import { wrapAnthropic } from "langsmith/wrappers/anthropic";
import { Client } from "langsmith";
import Anthropic from "@anthropic-ai/sdk";

const lsClient = new Client({
  apiKey: process.env.LANGSMITH_API_KEY
});

const anthropic = wrapAnthropic(
  new Anthropic({
    apiKey: process.env.ANTHROPIC_API_KEY
  }),
  {
    client: lsClient,
    project_name: "production-chat",
    name: "customer-support",
    metadata: {
      version: "2.1.0",
      deployment: "us-east-1"
    },
    tags: ["anthropic", "claude", "production"]
  }
);

Traced Information

The wrapper automatically captures:

  • Inputs: Model name, messages, system prompts, parameters
  • Outputs: Generated messages, stop reasons, tool use
  • Metadata: Message IDs, model versions, latency
  • Token Usage: Input tokens, output tokens, total tokens
  • Errors: API errors, rate limits, timeouts
  • Streaming: Individual events and final aggregated message
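As an illustration of the Token Usage item above, the counts come from the SDK's `message.usage` field (`input_tokens` and `output_tokens`). A minimal sketch of deriving a total from that shape — the `Usage` interface here is a local stand-in mirroring the SDK response, not an SDK import:

```typescript
// Local stand-in for the shape of the Anthropic SDK's `message.usage` field
interface Usage {
  input_tokens: number;
  output_tokens: number;
}

// Total tokens, as reported alongside the trace's token usage
function totalTokens(usage: Usage): number {
  return usage.input_tokens + usage.output_tokens;
}

console.log(totalTokens({ input_tokens: 12, output_tokens: 88 })); // 100
```

In a traced call, the same numbers appear on the run in LangSmith, so no manual bookkeeping is needed.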

Combining with Traceable

import { traceable } from "langsmith/traceable";
import { wrapAnthropic } from "langsmith/wrappers/anthropic";

const anthropic = wrapAnthropic(new Anthropic());

const processQuery = traceable(
  async (query: string) => {
    // Anthropic call traced as child
    const message = await anthropic.messages.create({
      model: "claude-sonnet-4-20250514",
      max_tokens: 1024,
      messages: [{ role: "user", content: query }]
    });

    return message.content[0].text;
  },
  { name: "processQuery", run_type: "chain" }
);

// Creates nested trace
await processQuery("What is AI?");

Best Practices

Reuse Wrapped Clients

// Good: Create once
const anthropic = wrapAnthropic(new Anthropic(), {
  project_name: "my-app"
});

// Reuse in functions
async function chat(message: string) {
  return anthropic.messages.create({
    model: "claude-sonnet-4-20250514",
    max_tokens: 1024,
    messages: [{ role: "user", content: message }]
  });
}

Error Handling

const anthropic = wrapAnthropic(new Anthropic());

try {
  const message = await anthropic.messages.create({
    model: "claude-sonnet-4-20250514",
    max_tokens: 1024,
    messages: [{ role: "user", content: "Hello" }]
  });
} catch (error) {
  // Error automatically logged to trace
  console.error("Anthropic call failed:", error);
}

Related Documentation

  • OpenAI Wrapper - OpenAI SDK wrapper
  • Vercel Integration - Vercel AI SDK wrapper
  • Tracing Guide - Core tracing concepts
  • Client API - Client configuration