# Event System

Comprehensive event models and emission functions for capturing AI model interactions as structured OpenTelemetry events. The event system provides detailed visibility into input messages and completion responses when semantic conventions are enabled.

## Capabilities

### Event Data Models

Structured data models representing AI model interactions, following OpenTelemetry semantic conventions for AI observability.

```python { .api }
@dataclass
class MessageEvent:
    """
    Represents input messages sent to AI models.

    Used for capturing user prompts, system messages, and tool interactions
    in a structured format compatible with OpenTelemetry event logging.
    """

    content: Any
    """Message content (text, structured data, or tool inputs)"""

    role: str = "user"
    """Message role: 'user', 'assistant', 'system', or 'tool'"""

    tool_calls: List[ToolCall] | None = None
    """Optional list of tool/function calls associated with this message"""


@dataclass
class ChoiceEvent:
    """
    Represents AI model completion responses.

    Captures model-generated content, completion metadata, and any
    tool calls made by the model during response generation.
    """

    index: int
    """Choice index in multi-choice responses (typically 0 for single responses)"""

    message: CompletionMessage
    """The completion message content and metadata"""

    finish_reason: str = "unknown"
    """Reason completion finished: 'stop', 'length', 'tool_calls', etc."""

    tool_calls: List[ToolCall] | None = None
    """Optional list of tool/function calls made by the model"""
```

### Type Definitions

Supporting type definitions for event data structures.

```python { .api }
class _FunctionToolCall(TypedDict):
    """
    Represents function call details in tool invocations.

    Contains the function name and arguments for AI model
    tool calling capabilities.
    """

    function_name: str
    """Name of the function to call"""

    arguments: Optional[dict[str, Any]]
    """Function arguments as key-value pairs"""


class ToolCall(TypedDict):
    """
    Represents a tool or function call in AI model interactions.

    Used to capture when models invoke external tools or functions
    as part of their response generation process.
    """

    id: str
    """Unique identifier for this tool call"""

    function: _FunctionToolCall
    """Function call details including name and arguments"""

    type: Literal["function"]
    """Type of tool call (currently only 'function' is supported)"""


class CompletionMessage(TypedDict):
    """
    Represents the structure of completion messages from AI models.

    Contains the actual response content and metadata about
    the model's role in the conversation. Note: TypedDict cannot
    have default values, so 'role' must be provided explicitly.
    """

    content: Any
    """Message content (text or structured response)"""

    role: str
    """Message role (typically 'assistant' for model responses)"""


class Roles(Enum):
    """
    Valid message roles for AI model interactions.

    Defines the standard roles used in conversational AI systems
    following common industry conventions.
    """

    USER = "user"
    """Human user input messages"""

    ASSISTANT = "assistant"
    """AI model response messages"""

    SYSTEM = "system"
    """System-level instructions and context"""

    TOOL = "tool"
    """Tool or function execution results"""
```
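
Because these are `TypedDict`s, instances are plain dictionaries at runtime. A small illustrative sketch matching the shapes documented above (the values are hypothetical, not library output):

```python
# Plain dicts conforming to the ToolCall and CompletionMessage
# shapes documented above (illustrative values).
tool_call = {
    "id": "call_abc123",
    "function": {
        "function_name": "get_weather",
        "arguments": {"location": "Paris"},
    },
    "type": "function",
}

completion_message = {
    "content": "It is sunny in Paris.",
    "role": "assistant",  # TypedDicts have no defaults, so role is explicit
}
```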

### Event Emission Functions

Functions for emitting structured events to OpenTelemetry event loggers, providing comprehensive visibility into AI model interactions.

```python { .api }
def emit_message_events(event_logger, kwargs) -> None:
    """
    Emit input message events to OpenTelemetry event logger.

    Extracts and emits structured events for all input messages
    sent to AI models, including user prompts and system messages.

    Parameters:
    - event_logger: OpenTelemetry EventLogger instance
    - kwargs: Request parameters containing input messages
    """


def emit_choice_events(event_logger, response) -> None:
    """
    Emit choice/completion events to OpenTelemetry event logger.

    Extracts and emits structured events for AI model responses,
    including generated content and completion metadata.

    Parameters:
    - event_logger: OpenTelemetry EventLogger instance
    - response: Model response containing completion choices
    """


def emit_input_events_converse(kwargs, event_logger) -> None:
    """
    Emit input events for Bedrock converse API calls.

    Specialized event emission for the modern Bedrock converse API,
    handling the conversation format and message structure.

    Parameters:
    - kwargs: Converse API request parameters
    - event_logger: OpenTelemetry EventLogger instance
    """


def emit_response_event_converse(response, event_logger) -> None:
    """
    Emit response events for Bedrock converse API responses.

    Handles response event emission for the converse API format,
    including message content and conversation metadata.

    Parameters:
    - response: Converse API response object
    - event_logger: OpenTelemetry EventLogger instance
    """


def emit_streaming_response_event(response_body, event_logger) -> None:
    """
    Emit events for streaming model responses.

    Processes and emits events for streaming responses from
    invoke_model_with_response_stream calls.

    Parameters:
    - response_body: Accumulated streaming response content
    - event_logger: OpenTelemetry EventLogger instance
    """


def emit_streaming_converse_response_event(
    event_logger,
    response_msg,
    role,
    finish_reason,
) -> None:
    """
    Emit events for streaming converse API responses.

    Handles event emission for streaming responses from the
    converse_stream API, including role and completion metadata.

    Parameters:
    - event_logger: OpenTelemetry EventLogger instance
    - response_msg: Accumulated response message content
    - role: Message role (typically 'assistant')
    - finish_reason: Reason streaming completed
    """


def emit_event(event: Union[MessageEvent, ChoiceEvent], event_logger) -> None:
    """
    Generic event emission function.

    Low-level function for emitting structured events to OpenTelemetry.
    Used internally by other emission functions.

    Parameters:
    - event: MessageEvent or ChoiceEvent to emit
    - event_logger: OpenTelemetry EventLogger instance
    """
```

### Event Control Functions

Functions for determining when events should be emitted based on configuration.

```python { .api }
def should_emit_events() -> bool:
    """
    Check if event emission is enabled.

    Returns:
    Boolean indicating whether structured events should be emitted
    based on the use_legacy_attributes configuration setting.
    """
```
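
A minimal sketch of how this check typically gates emission; the module-level flag here stands in for the instrumentor's `use_legacy_attributes` setting:

```python
use_legacy_attributes = False  # stand-in for the instrumentor's config flag

def should_emit_events() -> bool:
    # Structured events are emitted only when legacy attributes are disabled
    return not use_legacy_attributes

emitted = []

def maybe_emit(event):
    # Emission sites check the flag before doing any event work
    if should_emit_events():
        emitted.append(event)

maybe_emit({"role": "user", "content": "Hello"})
```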

## Usage Examples

### Basic Event Emission

```python
from opentelemetry._events import get_event_logger
from opentelemetry.instrumentation.bedrock.event_models import MessageEvent, ChoiceEvent
from opentelemetry.instrumentation.bedrock.event_emitter import emit_event

# Get event logger
event_logger = get_event_logger(__name__)

# Create and emit a message event
message_event = MessageEvent(
    content="What is the capital of France?",
    role="user"
)
emit_event(message_event, event_logger)

# Create and emit a choice event
choice_event = ChoiceEvent(
    index=0,
    message={"content": "The capital of France is Paris.", "role": "assistant"},
    finish_reason="stop"
)
emit_event(choice_event, event_logger)
```

### Event Emission in Instrumentation Context

Events are automatically emitted when semantic conventions are enabled:

```python
from opentelemetry.instrumentation.bedrock import BedrockInstrumentor

# Enable semantic conventions (disables legacy attributes)
BedrockInstrumentor(use_legacy_attributes=False).instrument()

# Events will be automatically emitted for all Bedrock API calls
```

### Tool Call Events

```python
from opentelemetry.instrumentation.bedrock.event_models import MessageEvent, ToolCall

# Message with tool calls; the dict matches the ToolCall and
# _FunctionToolCall shapes defined in the Type Definitions section
tool_call = {
    "id": "call_123",
    "function": {"function_name": "get_weather", "arguments": {"location": "Paris"}},
    "type": "function"
}

message_event = MessageEvent(
    content="I need to check the weather in Paris",
    role="user",
    tool_calls=[tool_call]
)
```

304

305

### Streaming Event Handling

306

307

For streaming responses, events are emitted when the stream completes:

308

309

```python

310

# Streaming events are handled automatically by the instrumentation

311

# and emitted when the stream finishes processing

312

313

import boto3

314

from opentelemetry.instrumentation.bedrock import BedrockInstrumentor

315

316

BedrockInstrumentor(use_legacy_attributes=False).instrument()

317

318

client = boto3.client('bedrock-runtime', region_name='us-east-1')

319

320

# Events will be automatically emitted as the stream completes

321

response = client.invoke_model_with_response_stream(

322

modelId='anthropic.claude-3-sonnet-20240229-v1:0',

323

body='{"messages": [{"role": "user", "content": "Hello"}]}'

324

)

325

326

# Process stream - events emitted automatically

327

for event in response['body']:

328

# Process streaming data

329

pass

330

```

## Event Schema

Events follow OpenTelemetry semantic conventions for AI observability:

### Message Event Schema

```json
{
  "name": "gen_ai.content.prompt",
  "body": {
    "content": "What is the capital of France?",
    "role": "user",
    "tool_calls": []
  },
  "attributes": {
    "gen_ai.system": "bedrock",
    "gen_ai.request.model": "anthropic.claude-3-sonnet-20240229-v1:0"
  }
}
```

### Choice Event Schema

```json
{
  "name": "gen_ai.content.completion",
  "body": {
    "index": 0,
    "message": {
      "content": "The capital of France is Paris.",
      "role": "assistant"
    },
    "finish_reason": "stop"
  },
  "attributes": {
    "gen_ai.system": "bedrock",
    "gen_ai.response.model": "anthropic.claude-3-sonnet-20240229-v1:0"
  }
}
```

## Configuration

Event emission is controlled by the `use_legacy_attributes` setting:

- **use_legacy_attributes=True** (default): Events are not emitted; span attributes are used instead
- **use_legacy_attributes=False**: Structured events are emitted following semantic conventions

This allows gradual migration from legacy attribute-based observability to modern event-based observability patterns.