tessl/npm-sentry--node

Sentry Node SDK using OpenTelemetry for comprehensive error tracking and performance monitoring in Node.js applications


AI Service Integrations

Automatic instrumentation for AI services, including OpenAI, Anthropic, and the Vercel AI SDK.

Capabilities

OpenAI Integration

Automatic instrumentation for OpenAI API calls.

```typescript
/**
 * Create OpenAI integration for automatic API call tracing
 * @param options - OpenAI integration configuration options
 * @returns OpenAI integration instance
 */
function openAIIntegration(options?: OpenAIOptions): Integration;
```

Usage Examples:

```typescript
import * as Sentry from "@sentry/node";
import OpenAI from "openai";

// Initialize with OpenAI integration
Sentry.init({
  dsn: "YOUR_DSN",
  integrations: [
    Sentry.openAIIntegration({
      recordInputs: true, // Record input prompts (be mindful of PII)
      recordOutputs: true, // Record AI responses (be mindful of PII)
      enableUsageTracker: true, // Track token usage
    }),
  ],
});

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});

// These API calls will create spans
const completion = await openai.chat.completions.create({
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "Hello, world!" }],
});

const embedding = await openai.embeddings.create({
  model: "text-embedding-ada-002",
  input: "Text to embed",
});
```
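Because `recordInputs` and `recordOutputs` can capture PII, a common pattern is to enable them only outside production. A minimal sketch of that gating logic (the `aiRecordingOptions` helper is ours for illustration, not part of the SDK; only the option names come from this integration):

```typescript
// Hypothetical helper: decide AI recording options per environment.
// Only recordInputs/recordOutputs are SDK option names; the gating
// logic itself is an illustrative pattern, not Sentry behavior.
interface AIRecordingOptions {
  recordInputs?: boolean;
  recordOutputs?: boolean;
}

function aiRecordingOptions(nodeEnv: string | undefined): AIRecordingOptions {
  const isProd = nodeEnv === "production";
  return {
    recordInputs: !isProd, // keep prompts out of spans in production
    recordOutputs: !isProd, // keep responses out of spans in production
  };
}

// Usage sketch:
// Sentry.openAIIntegration(aiRecordingOptions(process.env.NODE_ENV))
```

The same pattern applies to the Anthropic and Vercel AI integrations below, since they accept the same recording options.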

Anthropic Integration

Automatic instrumentation for Anthropic Claude API calls.

```typescript
/**
 * Create Anthropic integration for automatic API call tracing
 * @param options - Anthropic integration configuration options
 * @returns Anthropic integration instance
 */
function anthropicAIIntegration(options?: AnthropicOptions): Integration;
```

Usage Examples:

```typescript
import * as Sentry from "@sentry/node";
import Anthropic from "@anthropic-ai/sdk";

// Initialize with Anthropic integration
Sentry.init({
  dsn: "YOUR_DSN",
  integrations: [
    Sentry.anthropicAIIntegration({
      recordInputs: true,
      recordOutputs: true,
      enableUsageTracker: true,
    }),
  ],
});

const anthropic = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
});

// These API calls will create spans
const message = await anthropic.messages.create({
  model: "claude-3-sonnet-20240229",
  max_tokens: 1000,
  messages: [{ role: "user", content: "Hello, Claude!" }],
});
```
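With `enableUsageTracker` on, token counts are reported on spans. If you want to turn those counts into a rough cost figure in your own dashboards, a sketch might look like this (the helper is ours, the rates are placeholders rather than real pricing, and the usage shape mirrors Anthropic's `usage` object):

```typescript
// Illustrative helper: convert token usage into a rough USD estimate.
// The field names follow Anthropic's usage object; the rates passed in
// are placeholders you would replace with current pricing.
interface TokenUsage {
  input_tokens: number;
  output_tokens: number;
}

function estimateCostUSD(
  usage: TokenUsage,
  ratePerInputToken: number,
  ratePerOutputToken: number,
): number {
  return (
    usage.input_tokens * ratePerInputToken +
    usage.output_tokens * ratePerOutputToken
  );
}
```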

Vercel AI SDK Integration

Automatic instrumentation for Vercel AI SDK operations.

```typescript
/**
 * Create Vercel AI SDK integration for automatic operation tracing
 * @param options - Vercel AI integration configuration options
 * @returns Vercel AI integration instance
 */
function vercelAIIntegration(options?: VercelAIOptions): Integration;
```

Usage Examples:

```typescript
import * as Sentry from "@sentry/node";
import { openai } from "@ai-sdk/openai";
import { generateText, streamText } from "ai";

// Initialize with Vercel AI integration
Sentry.init({
  dsn: "YOUR_DSN",
  integrations: [
    Sentry.vercelAIIntegration({
      recordInputs: true,
      recordOutputs: true,
      recordStreamChunks: false, // Avoid recording streaming chunks for performance
    }),
  ],
});

// These operations will create spans
const { text } = await generateText({
  model: openai("gpt-3.5-turbo"),
  prompt: "Write a short story about a robot.",
});

const { textStream } = await streamText({
  model: openai("gpt-3.5-turbo"),
  prompt: "Explain quantum computing",
});

for await (const textPart of textStream) {
  process.stdout.write(textPart);
}
```
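Note that, depending on your `ai` package and SDK versions, the Vercel AI SDK may only emit telemetry when it is enabled per call; Sentry's documentation describes passing `experimental_telemetry` for this. A hedged configuration sketch (verify against your installed versions before relying on it):

```typescript
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

// Some version combinations require opting into telemetry per call
// for spans to be captured by the integration.
const { text } = await generateText({
  model: openai("gpt-3.5-turbo"),
  prompt: "Write a short story about a robot.",
  experimental_telemetry: { isEnabled: true },
});
```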

Types

Integration Options

```typescript
interface OpenAIOptions {
  /** Record input prompts in spans (be mindful of PII) */
  recordInputs?: boolean;
  /** Record AI responses in spans (be mindful of PII) */
  recordOutputs?: boolean;
  /** Track token usage metrics */
  enableUsageTracker?: boolean;
  /** Maximum input/output length to record */
  maxDataLength?: number;
}

interface AnthropicOptions {
  /** Record input prompts in spans (be mindful of PII) */
  recordInputs?: boolean;
  /** Record AI responses in spans (be mindful of PII) */
  recordOutputs?: boolean;
  /** Track token usage metrics */
  enableUsageTracker?: boolean;
  /** Maximum input/output length to record */
  maxDataLength?: number;
}

interface VercelAIOptions {
  /** Record input prompts in spans (be mindful of PII) */
  recordInputs?: boolean;
  /** Record AI responses in spans (be mindful of PII) */
  recordOutputs?: boolean;
  /** Record streaming chunks (can impact performance) */
  recordStreamChunks?: boolean;
  /** Maximum input/output length to record */
  maxDataLength?: number;
}
```
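The exact truncation behavior behind `maxDataLength` is not specified above. A plausible sketch of what "maximum input/output length to record" typically means, purely for illustration (this is not the SDK's implementation):

```typescript
// Illustrative sketch: cap recorded prompt/response text at maxDataLength,
// appending a marker so truncation is visible in span data.
function truncateForSpan(data: string, maxDataLength: number): string {
  if (data.length <= maxDataLength) return data;
  return data.slice(0, maxDataLength) + "...[truncated]";
}
```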

Install with Tessl CLI

```shell
npx tessl i tessl/npm-sentry--node
```
