This skill should be used when the user asks to "add LLMObs support", "create an LLMObs plugin", "instrument an LLM library", "add LLM Observability", "add llmobs", "add llm observability", "instrument chat completions", "instrument streaming", "instrument embeddings", "instrument agent runs", "instrument orchestration", "instrument LLM", "LLMObsPlugin", "LlmObsPlugin", "getLLMObsSpanRegisterOptions", "setLLMObsTags", "tagLLMIO", "tagEmbeddingIO", "tagRetrievalIO", "tagTextIO", "tagMetrics", "tagMetadata", "tagSpanTags", "tagPrompt", "LlmObsCategory", "LlmObsSpanKind", "span kind llm", "span kind workflow", "span kind agent", "span kind embedding", "span kind tool", "span kind retrieval", "openai llmobs", "anthropic llmobs", "genai llmobs", "google llmobs", "langchain llmobs", "langgraph llmobs", "ai-sdk llmobs", "llm span", "llmobs span event", "model provider", "model name", "CompositePlugin llmobs", "llmobs tracing", "VCR cassettes", or needs to build, modify, or debug an LLMObs plugin for any LLM library in dd-trace-js.
This skill helps you create LLMObs plugins that instrument LLM library operations and emit proper span events for LLM observability in dd-trace-js. Supported operation types include:

- Chat completions (including streaming)
- Embeddings
- Agent runs
- Orchestration/workflow execution
All LLMObs plugins extend the LLMObsPlugin base class, which provides the core instrumentation framework.
Key responsibilities:

- Registering spans with LLMObs
- Extracting and tagging LLM input/output data, metrics, and metadata
- Restoring parent span context when operations finish
Required methods to implement:
- `getLLMObsSpanRegisterOptions(ctx)` - Returns span registration options (modelProvider, modelName, kind, name)
- `setLLMObsTags(ctx)` - Extracts and tags LLM data (input/output messages, metrics, metadata)

Plugin lifecycle:

- `start(ctx)` - Registers span with LLMObs, captures context
- `asyncEnd(ctx)` - Calls `setLLMObsTags()` to extract and tag data
- `end(ctx)` - Restores parent context

See references/plugin-architecture.md for complete implementation details.
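A minimal sketch of the two required overrides, under stated assumptions: the base class here is a stand-in stub (the real `LLMObsPlugin` lives in dd-trace-js and wires up the lifecycle hooks), and the operation name, provider string, and ctx shape are illustrative.

```javascript
// Stand-in stub for the real LLMObsPlugin base class (assumption: the
// real one registers spans in start(ctx) and calls setLLMObsTags in
// asyncEnd(ctx); here it is empty so the sketch is self-contained).
class LLMObsPlugin {}

class ExampleLLMObsPlugin extends LLMObsPlugin {
  // Describe the span to register with LLMObs when the operation starts.
  getLLMObsSpanRegisterOptions (ctx) {
    const request = ctx.arguments?.[0] ?? {}
    return {
      kind: 'llm',
      name: 'example.createChatCompletion', // hypothetical operation name
      modelProvider: 'example', // hypothetical provider
      modelName: request.model
    }
  }

  // Extract and tag input/output data when the operation finishes.
  setLLMObsTags (ctx) {
    // Real plugins call this._tagger methods (tagLLMIO, tagMetrics, ...)
    // with data pulled from ctx.arguments and ctx.result.
  }
}
```

The register options are read once at span start, so model information must be available from the request arguments rather than the response.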
CRITICAL: Every integration must be classified into one category using the LlmObsCategory enum. This determines test strategy and implementation approach.
- `LlmObsCategory.LLM_CLIENT` - Direct API wrappers (openai, anthropic, genai)
- `LlmObsCategory.MULTI_PROVIDER` - Multi-provider frameworks (ai-sdk, langchain)
- `LlmObsCategory.ORCHESTRATION` - Workflow managers (langgraph)
- `LlmObsCategory.INFRASTRUCTURE` - Protocols/servers (MCP)
Answer these questions by reading the code:
1. Does the package make direct HTTP calls to LLM provider endpoints?
2. Does it support multiple LLM providers via configuration?
   - Yes → `LlmObsCategory.MULTI_PROVIDER`
   - No → `LlmObsCategory.LLM_CLIENT`
3. Does it implement workflow/graph orchestration with state management?
   - Yes → `LlmObsCategory.ORCHESTRATION`
   - No → `LlmObsCategory.INFRASTRUCTURE`

See references/category-detection.md for detailed heuristics and examples.
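The decision tree above can be sketched as a small helper. The enum member names come from this skill, but the string values and the boolean inputs are illustrative assumptions; the real `LlmObsCategory` enum lives in the llmobs models module.

```javascript
// Illustrative stand-in for the LlmObsCategory enum (string values are
// assumptions made for this sketch).
const LlmObsCategory = {
  LLM_CLIENT: 'llm_client',
  MULTI_PROVIDER: 'multi_provider',
  ORCHESTRATION: 'orchestration',
  INFRASTRUCTURE: 'infrastructure'
}

// Answers are gathered by reading the package's source code, mirroring
// the three questions in the decision tree above.
function detectCategory ({ callsProviderApi, multiProvider, orchestratesGraphs }) {
  if (multiProvider) return LlmObsCategory.MULTI_PROVIDER
  if (callsProviderApi) return LlmObsCategory.LLM_CLIENT
  if (orchestratesGraphs) return LlmObsCategory.ORCHESTRATION
  return LlmObsCategory.INFRASTRUCTURE
}
```

The ordering matters: a multi-provider framework also calls provider APIs, so that question must be checked first.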
Use the `LlmObsSpanKind` enum:

- `LlmObsSpanKind.LLM` - Chat completions, text generation
- `LlmObsSpanKind.WORKFLOW` - Graph/chain execution
- `LlmObsSpanKind.AGENT` - Agent runs
- `LlmObsSpanKind.TOOL` - Tool/function calls
- `LlmObsSpanKind.EMBEDDING` - Embedding generation
- `LlmObsSpanKind.RETRIEVAL` - Vector DB/RAG retrieval

Most common: use `'llm'` for chat completions/text generation in LLM_CLIENT and MULTI_PROVIDER categories.
All plugins must convert provider-specific message formats to the standard format:
Standard format: `[{content: string, role: string}]`

Common roles: `'user'`, `'assistant'`, `'system'`, `'tool'`
Provider-specific handling:
- OpenAI: handle `function_call` and `tool_calls` in messages
- Anthropic: map `role` values, flatten nested content arrays
- Google GenAI: handle `parts` arrays, map role names

See references/message-extraction.md for provider-specific patterns.
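As an example of this conversion, here is a sketch that flattens content-block arrays (the Anthropic-style `{ type, text }` block shape is assumed for illustration) into the standard `[{content, role}]` format:

```javascript
// Convert provider messages to the standard [{ content, role }] format.
// Handles both plain-string content and arrays of { type, text } blocks.
function normalizeMessages (messages) {
  return messages.map(({ role, content }) => ({
    role,
    content: Array.isArray(content)
      // Keep only text blocks; drop tool-use and other non-text blocks.
      ? content.filter(block => block.type === 'text').map(block => block.text).join('')
      : String(content ?? '')
  }))
}
```

A real extractor would also surface tool calls rather than dropping them; this sketch only shows the text-flattening step.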
1. Detect the package category (REQUIRED FIRST STEP)
2. Create the plugin file
   - Location: `packages/dd-trace/src/llmobs/plugins/{integration}/index.js`
   - Extend the `LLMObsPlugin` base class
3. Implement `getLLMObsSpanRegisterOptions(ctx)`
   - Return the modelProvider, modelName, kind (e.g., `'llm'`), and name
4. Implement `setLLMObsTags(ctx)`
   - Extract inputs from `ctx.arguments` and outputs from `ctx.result`
   - Tag data via the `this._tagger` methods
5. Handle edge cases

See references/plugin-architecture.md for a step-by-step implementation guide.
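Putting steps 3-5 together, a tagging body for an OpenAI-style chat completion might look like the following sketch. The response shape, the tagger's method signatures, and passing the tagger/span as parameters (instead of using `this._tagger` on the plugin instance) are simplifications made so the sketch is self-contained.

```javascript
// Extract inputs from ctx.arguments and outputs from ctx.result, then
// tag them via the tagger (a stand-in for this._tagger).
function setLLMObsTags (tagger, span, ctx) {
  const request = ctx.arguments?.[0] ?? {}
  const response = ctx.result ?? {}

  // Normalize request messages and response choices to { content, role }.
  const inputMessages = (request.messages ?? [])
    .map(m => ({ role: m.role, content: m.content }))
  const outputMessages = (response.choices ?? [])
    .map(c => ({ role: c.message.role, content: c.message.content }))

  tagger.tagLLMIO(span, inputMessages, outputMessages)

  // Edge case: usage can be absent (e.g., on errors or partial streams).
  if (response.usage) {
    tagger.tagMetrics(span, {
      inputTokens: response.usage.prompt_tokens,
      outputTokens: response.usage.completion_tokens
    })
  }
}
```

The defensive `?? {}` / `?? []` fallbacks keep the function safe when the call errored and `ctx.result` is missing.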
Based on category:

- LLM_CLIENT: extract output messages from `result.choices[0]` or the provider's equivalent
- ORCHESTRATION: use the `'workflow'` span kind instead of `'llm'`, and focus on lifecycle events

All plugins must export an array:
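For example, a module might look like this sketch (the class name is hypothetical, and the `static get` accessor style is an assumption mirroring how other dd-trace plugins expose these properties):

```javascript
// Hypothetical plugin module: export an array of plugin classes, each
// carrying the required static properties.
class ExampleChatLLMObsPlugin {
  static get integration () { return 'example' }
  static get id () { return 'llmobs_example' }
  static get prefix () { return 'tracing:apm:example:chat' }
}

// Always export an array, even for a single plugin, so integrations
// with multiple operations (chat, embeddings, ...) share one module.
module.exports = [ExampleChatLLMObsPlugin]
```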
Static properties required:
- `integration` - Integration name (e.g., `'openai'`)
- `id` - Unique plugin ID (e.g., `'llmobs_openai'`)
- `prefix` - Channel prefix (e.g., `'tracing:apm:openai:chat'`)

For detailed information, see:
- The `LlmObsCategory` and `LlmObsSpanKind` enums from the models module
- The standard `[{content, role}]` message format

50aa025