# Tracer & Spans

OpenInference-aware tracer provider and tracer implementations with decorator support for creating spans, with automatic attribute handling and span kind detection.

## Capabilities

### TracerProvider

An OpenInference-aware `TracerProvider` that extends OpenTelemetry's `TracerProvider` with additional functionality.

```python { .api }
class TracerProvider(OTelTracerProvider):
    """
    OpenInference TracerProvider with enhanced span limits and configuration support.

    Args:
        config (Optional[TraceConfig]): Configuration for data privacy and tracing behavior
        *args: Arguments passed to the OpenTelemetry TracerProvider
        **kwargs: Keyword arguments passed to the OpenTelemetry TracerProvider
    """

    def __init__(
        self,
        *args: Any,
        config: Optional[TraceConfig] = None,
        **kwargs: Any,
    ) -> None: ...

    def get_tracer(self, *args: Any, **kwargs: Any) -> OITracer:
        """
        Get an OpenInference tracer instance.

        Returns:
            OITracer: OpenInference-aware tracer with decorator support
        """
```

**Usage Example:**

```python
from openinference.instrumentation import TracerProvider, TraceConfig
from opentelemetry import trace

# Create provider with configuration
config = TraceConfig(hide_inputs=True)
provider = TracerProvider(config=config)

# Set as global tracer provider
trace.set_tracer_provider(provider)

# Get tracer
tracer = provider.get_tracer(__name__)
```

### OITracer

OpenInference tracer with decorator support and custom span creation methods.

```python { .api }
class OITracer:
    """
    OpenInference tracer wrapper with decorator support and enhanced span creation.
    """

    def start_span(
        self,
        name: str,
        context: Optional[Context] = None,
        kind: SpanKind = SpanKind.INTERNAL,
        attributes: Attributes = None,
        links: Optional[Sequence[Link]] = (),
        start_time: Optional[int] = None,
        record_exception: bool = True,
        set_status_on_exception: bool = True,
        *,
        openinference_span_kind: Optional[OpenInferenceSpanKind] = None,
    ) -> OpenInferenceSpan:
        """
        Start a new OpenInference span.

        Args:
            name (str): Span name
            openinference_span_kind (Optional[OpenInferenceSpanKind]): OpenInference span kind
            **kwargs: Additional OpenTelemetry span arguments

        Returns:
            OpenInferenceSpan: OpenInference-aware span wrapper
        """

    @contextmanager
    def start_as_current_span(
        self,
        name: str,
        context: Optional[Context] = None,
        kind: SpanKind = SpanKind.INTERNAL,
        attributes: Attributes = None,
        links: Optional[Sequence[Link]] = (),
        start_time: Optional[int] = None,
        record_exception: bool = True,
        set_status_on_exception: bool = True,
        end_on_exit: bool = True,
        *,
        openinference_span_kind: Optional[OpenInferenceSpanKind] = None,
    ) -> Iterator[OpenInferenceSpan]:
        """
        Context manager for creating a span and setting it as the current span.

        Args:
            name (str): Span name
            openinference_span_kind (Optional[OpenInferenceSpanKind]): OpenInference span kind
            **kwargs: Additional OpenTelemetry span arguments

        Yields:
            OpenInferenceSpan: The current span
        """
```

**Usage Example:**

```python
from openinference.instrumentation import TracerProvider
from openinference.semconv.trace import OpenInferenceSpanKindValues

tracer = TracerProvider().get_tracer(__name__)

# Create span manually
span = tracer.start_span("my-operation", openinference_span_kind="llm")
span.set_attribute("custom.attribute", "value")
span.end()

# Use as context manager
with tracer.start_as_current_span(
    "my-context", openinference_span_kind=OpenInferenceSpanKindValues.CHAIN
) as span:
    # Span is automatically current and will be ended on exit
    result = perform_operation()
    span.set_output(result)
```

### Agent Decorator

Decorator for creating agent spans with automatic input/output handling.

```python { .api }
def agent(
    self,
    wrapped_function: Optional[Callable] = None,
    /,
    *,
    name: Optional[str] = None,
) -> Union[Callable, Callable[[Callable], Callable]]:
    """
    Decorator for creating agent spans.

    Args:
        wrapped_function: Function to wrap (when the decorator is used without parentheses)
        name (Optional[str]): Custom span name (defaults to the function name)

    Returns:
        Decorated function that creates agent spans
    """
```

**Usage Example:**

```python
from openinference.instrumentation import TracerProvider

tracer = TracerProvider().get_tracer(__name__)

# Simple usage
@tracer.agent
def my_agent(query: str) -> str:
    return f"Agent response to: {query}"

# With custom name
@tracer.agent(name="customer-support-agent")
def support_agent(question: str, context: dict) -> str:
    return generate_support_response(question, context)

# Usage
response = my_agent("Hello, how are you?")
```

### Chain Decorator

Decorator for creating chain spans with automatic input/output handling.

```python { .api }
def chain(
    self,
    wrapped_function: Optional[Callable] = None,
    /,
    *,
    name: Optional[str] = None,
) -> Union[Callable, Callable[[Callable], Callable]]:
    """
    Decorator for creating chain spans.

    Args:
        wrapped_function: Function to wrap (when the decorator is used without parentheses)
        name (Optional[str]): Custom span name (defaults to the function name)

    Returns:
        Decorated function that creates chain spans
    """
```

**Usage Example:**

```python
@tracer.chain
def process_pipeline(data: dict) -> dict:
    # Multi-step processing pipeline
    step1_result = preprocess(data)
    step2_result = analyze(step1_result)
    return finalize(step2_result)

@tracer.chain(name="rag-chain")
async def rag_pipeline(query: str) -> str:
    docs = await retrieve_documents(query)
    context = format_context(docs)
    return await generate_response(query, context)
```

### Tool Decorator

Decorator for creating tool spans with automatic parameter inference.

```python { .api }
def tool(
    self,
    wrapped_function: Optional[Callable] = None,
    /,
    *,
    name: Optional[str] = None,
    description: Optional[str] = None,
    parameters: Optional[Union[str, Dict[str, Any]]] = None,
) -> Union[Callable, Callable[[Callable], Callable]]:
    """
    Decorator for creating tool spans.

    Args:
        wrapped_function: Function to wrap (when the decorator is used without parentheses)
        name (Optional[str]): Custom tool name (defaults to the function name)
        description (Optional[str]): Tool description (defaults to the docstring)
        parameters (Optional[Union[str, Dict[str, Any]]]): JSON schema or dict (auto-inferred from the signature)

    Returns:
        Decorated function that creates tool spans
    """
```

**Usage Example:**

```python
@tracer.tool
def calculate_area(length: float, width: float) -> float:
    """Calculate the area of a rectangle."""
    return length * width

@tracer.tool(
    name="web-search",
    description="Search the web for information",
    parameters={"query": {"type": "string", "description": "Search query"}},
)
def web_search(query: str) -> list:
    return perform_web_search(query)

# Parameters are automatically inferred from type hints
@tracer.tool
def get_weather(city: str, units: str = "celsius") -> dict:
    """Get weather information for a city."""
    return fetch_weather_data(city, units)
```
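
To build intuition for the automatic inference shown in the last example, here is a simplified, hypothetical sketch of how a JSON-schema-like `parameters` dict could be derived from a function signature. The names `infer_parameters` and `_TYPE_MAP` are illustrative only, not part of the library, and the real implementation is more thorough:

```python
import inspect
from typing import Any, Dict

# Map a few common Python annotations to JSON-schema type names.
_TYPE_MAP = {str: "string", int: "integer", float: "number",
             bool: "boolean", dict: "object", list: "array"}

def infer_parameters(fn) -> Dict[str, Any]:
    """Build a minimal JSON-schema-like dict from a function signature."""
    properties: Dict[str, Any] = {}
    required = []
    for name, param in inspect.signature(fn).parameters.items():
        # Fall back to "string" for annotations we don't recognize.
        properties[name] = {"type": _TYPE_MAP.get(param.annotation, "string")}
        # Parameters without defaults are required.
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {"type": "object", "properties": properties, "required": required}

def get_weather(city: str, units: str = "celsius") -> dict:
    """Get weather information for a city."""
    return {}

schema = infer_parameters(get_weather)
# "city" is required; "units" has a default, so it is optional
```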
### LLM Decorator

Decorator for creating LLM spans with support for generators and custom processing.

```python { .api }
def llm(
    self,
    wrapped_function: Optional[Callable] = None,
    /,
    *,
    name: Optional[str] = None,
    process_input: Optional[Callable] = None,
    process_output: Optional[Callable] = None,
) -> Union[Callable, Callable[[Callable], Callable]]:
    """
    Decorator for creating LLM spans.

    Args:
        wrapped_function: Function to wrap (when the decorator is used without parentheses)
        name (Optional[str]): Custom span name (defaults to the function name)
        process_input (Optional[Callable]): Custom input processing function
        process_output (Optional[Callable]): Custom output processing function

    Returns:
        Decorated function that creates LLM spans
    """
```

**Usage Example:**

```python
@tracer.llm
def simple_llm_call(prompt: str) -> str:
    return llm_client.generate(prompt)

# With custom processing
def process_llm_input(*args, **kwargs):
    return {"llm.model_name": "gpt-4", "custom.metric": len(args)}

def process_llm_output(output):
    return {"llm.token_count.total": count_tokens(output)}

@tracer.llm(
    name="advanced-llm",
    process_input=process_llm_input,
    process_output=process_llm_output,
)
def advanced_llm_call(messages: list) -> str:
    return llm_client.chat(messages)

# Supports generators for streaming
@tracer.llm
def streaming_llm(prompt: str):
    for chunk in llm_client.stream(prompt):
        yield chunk

# Supports async generators
@tracer.llm
async def async_streaming_llm(prompt: str):
    async for chunk in llm_client.async_stream(prompt):
        yield chunk
```

### OpenInferenceSpan

Enhanced span wrapper with OpenInference-specific methods.

```python { .api }
class OpenInferenceSpan:
    """
    OpenInference span wrapper with enhanced attribute handling.
    """

    def set_input(
        self,
        value: Any,
        *,
        mime_type: Optional[OpenInferenceMimeType] = None,
    ) -> None:
        """
        Set input attributes on the span.

        Args:
            value: Input value
            mime_type (Optional[OpenInferenceMimeType]): MIME type of the input
        """

    def set_output(
        self,
        value: Any,
        *,
        mime_type: Optional[OpenInferenceMimeType] = None,
    ) -> None:
        """
        Set output attributes on the span.

        Args:
            value: Output value
            mime_type (Optional[OpenInferenceMimeType]): MIME type of the output
        """

    def set_tool(
        self,
        *,
        name: str,
        description: Optional[str] = None,
        parameters: Union[str, Dict[str, Any]],
    ) -> None:
        """
        Set tool attributes on the span.

        Args:
            name (str): Tool name
            description (Optional[str]): Tool description
            parameters (Union[str, Dict[str, Any]]): Tool parameters schema
        """
```

**Usage Example:**

```python
with tracer.start_as_current_span("manual-span", openinference_span_kind="tool") as span:
    # Set input
    span.set_input({"query": "search term"}, mime_type="application/json")

    # Perform operation
    result = search_operation()

    # Set output
    span.set_output(result)

    # Set tool information
    span.set_tool(
        name="search-tool",
        description="Searches the knowledge base",
        parameters={"query": {"type": "string"}},
    )
```

## Decorator Features

All decorators support:

- **Synchronous functions**: Regular function decoration
- **Asynchronous functions**: `async def` function decoration
- **Generators**: Functions that `yield` values
- **Async generators**: `async def` functions that `yield`
- **Methods**: Instance and class method decoration
- **Automatic input/output capture**: Function arguments and return values
- **Exception handling**: Proper span status setting on exceptions
- **Custom naming**: Override default span names
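
As a rough illustration of the function-type dispatch and exception handling described above, the sketch below shows one way a decorator can distinguish plain functions from generator functions and record an error status when the wrapped code raises. `FakeSpan`, `SPANS`, and `traced` are stand-ins for demonstration only and do not reflect the library's internals:

```python
import functools
import inspect

class FakeSpan:
    """Stand-in span that records a status for demonstration."""
    def __init__(self, name):
        self.name = name
        self.status = "UNSET"

    def record_error(self, exc):
        self.status = f"ERROR: {exc}"

    def end(self):
        if self.status == "UNSET":
            self.status = "OK"

SPANS = []  # ended spans, in creation order

def traced(fn):
    """Simplified sketch: wrap sync functions and generators in a span."""
    if inspect.isgeneratorfunction(fn):
        @functools.wraps(fn)
        def gen_wrapper(*args, **kwargs):
            # In this sketch the span begins when iteration starts and
            # stays open while chunks are streamed to the caller.
            span = FakeSpan(fn.__name__)
            SPANS.append(span)
            try:
                yield from fn(*args, **kwargs)
            except Exception as exc:
                span.record_error(exc)
                raise
            finally:
                span.end()
        return gen_wrapper

    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        span = FakeSpan(fn.__name__)
        SPANS.append(span)
        try:
            return fn(*args, **kwargs)
        except Exception as exc:
            span.record_error(exc)  # status reflects the exception
            raise
        finally:
            span.end()
    return wrapper

@traced
def ok(x):
    return x * 2

@traced
def stream(n):
    for i in range(n):
        yield i

ok(3)            # span ends with status OK
list(stream(2))  # generator span ends after the stream is consumed
```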