
# AI Service Integrations

Instrumentation for AI services including OpenAI, Anthropic, and the Vercel AI SDK.

## Capabilities

### OpenAI Integration

Automatic instrumentation for OpenAI API calls.

```typescript { .api }
/**
 * Create OpenAI integration for automatic API call tracing
 * @param options - OpenAI integration configuration options
 * @returns OpenAI integration instance
 */
function openAIIntegration(options?: OpenAIOptions): Integration;
```

**Usage Examples:**

```typescript
import * as Sentry from "@sentry/node";
import OpenAI from "openai";

// Initialize with OpenAI integration
Sentry.init({
  dsn: "YOUR_DSN",
  integrations: [
    Sentry.openAIIntegration({
      recordInputs: true, // Record input prompts (be mindful of PII)
      recordOutputs: true, // Record AI responses (be mindful of PII)
      enableUsageTracker: true, // Track token usage
    }),
  ],
});

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});

// These API calls will create spans
const completion = await openai.chat.completions.create({
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "Hello, world!" }],
});

const embedding = await openai.embeddings.create({
  model: "text-embedding-ada-002",
  input: "Text to embed",
});
```

### Anthropic Integration

Automatic instrumentation for Anthropic Claude API calls.

```typescript { .api }
/**
 * Create Anthropic integration for automatic API call tracing
 * @param options - Anthropic integration configuration options
 * @returns Anthropic integration instance
 */
function anthropicAIIntegration(options?: AnthropicOptions): Integration;
```

**Usage Examples:**

```typescript
import * as Sentry from "@sentry/node";
import Anthropic from "@anthropic-ai/sdk";

// Initialize with Anthropic integration
Sentry.init({
  dsn: "YOUR_DSN",
  integrations: [
    Sentry.anthropicAIIntegration({
      recordInputs: true,
      recordOutputs: true,
      enableUsageTracker: true,
    }),
  ],
});

const anthropic = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
});

// These API calls will create spans
const message = await anthropic.messages.create({
  model: "claude-3-sonnet-20240229",
  max_tokens: 1000,
  messages: [{ role: "user", content: "Hello, Claude!" }],
});
```

### Vercel AI SDK Integration

Automatic instrumentation for Vercel AI SDK operations.

```typescript { .api }
/**
 * Create Vercel AI SDK integration for automatic operation tracing
 * @param options - Vercel AI integration configuration options
 * @returns Vercel AI integration instance
 */
function vercelAIIntegration(options?: VercelAIOptions): Integration;
```

**Usage Examples:**

```typescript
import * as Sentry from "@sentry/node";
import { openai } from "@ai-sdk/openai";
import { generateText, streamText } from "ai";

// Initialize with Vercel AI integration
Sentry.init({
  dsn: "YOUR_DSN",
  integrations: [
    Sentry.vercelAIIntegration({
      recordInputs: true,
      recordOutputs: true,
      recordStreamChunks: false, // Avoid recording streaming chunks for performance
    }),
  ],
});

// These operations will create spans
const { text } = await generateText({
  model: openai("gpt-3.5-turbo"),
  prompt: "Write a short story about a robot.",
});

const { textStream } = await streamText({
  model: openai("gpt-3.5-turbo"),
  prompt: "Explain quantum computing",
});

for await (const textPart of textStream) {
  process.stdout.write(textPart);
}
```

## Types

### Integration Options

```typescript { .api }
interface OpenAIOptions {
  /** Record input prompts in spans (be mindful of PII) */
  recordInputs?: boolean;
  /** Record AI responses in spans (be mindful of PII) */
  recordOutputs?: boolean;
  /** Track token usage metrics */
  enableUsageTracker?: boolean;
  /** Maximum input/output length to record */
  maxDataLength?: number;
}

interface AnthropicOptions {
  /** Record input prompts in spans (be mindful of PII) */
  recordInputs?: boolean;
  /** Record AI responses in spans (be mindful of PII) */
  recordOutputs?: boolean;
  /** Track token usage metrics */
  enableUsageTracker?: boolean;
  /** Maximum input/output length to record */
  maxDataLength?: number;
}

interface VercelAIOptions {
  /** Record input prompts in spans (be mindful of PII) */
  recordInputs?: boolean;
  /** Record AI responses in spans (be mindful of PII) */
  recordOutputs?: boolean;
  /** Record streaming chunks (can impact performance) */
  recordStreamChunks?: boolean;
  /** Maximum input/output length to record */
  maxDataLength?: number;
}
```
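All three option interfaces share the same trade-off: recording inputs and outputs is useful for debugging but can capture PII and inflate span payloads. The `maxDataLength` option caps how much recorded text is kept; the sketch below illustrates the presumed truncation behavior (an assumption for illustration, not the SDK's actual implementation):

```typescript
// Illustrative sketch of how a maxDataLength cap could be applied
// before attaching recorded inputs/outputs to a span.
function truncateRecordedData(text: string, maxDataLength?: number): string {
  // No cap configured, or text already fits: record as-is.
  if (maxDataLength === undefined || text.length <= maxDataLength) {
    return text;
  }
  // Keep the first maxDataLength characters and mark the cut.
  return text.slice(0, maxDataLength) + "…";
}

console.log(truncateRecordedData("Hello, world!", 5)); // "Hello…"
console.log(truncateRecordedData("short", 100)); // "short"
```

In practice, set `maxDataLength` conservatively when `recordInputs`/`recordOutputs` are enabled, since long prompts and responses otherwise land in spans verbatim.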