databricks-zerobus-ingest

Build Zerobus Ingest clients for near real-time data ingestion into Databricks Delta tables via gRPC. Use when creating producers that write directly to Unity Catalog tables without a message bus, working with the Zerobus Ingest SDK in Python/Java/Go/TypeScript/Rust, generating Protobuf schemas from UC tables, or implementing stream-based ingestion with ACK handling and retry logic.


Quality: 86% (Does it follow best practices?)

Impact: Pending (No eval scenarios have been run)

Security (by Snyk): Risky. Do not use without reviewing.


Quality

Discovery: 100%

Based on the skill's description, can an agent find and select it at the right time? Clear, specific descriptions lead to better discovery.

This is an excellent skill description that clearly defines a narrow, specific domain (Zerobus Ingest SDK for Databricks Delta table ingestion) with concrete actions and explicit trigger conditions. It uses third-person voice correctly, includes comprehensive natural trigger terms across multiple programming languages, and has virtually no conflict risk due to its highly specialized niche. The 'Use when...' clause effectively enumerates four distinct scenarios that would trigger this skill.

| Dimension | Reasoning | Score |
| --- | --- | --- |
| Specificity | Lists multiple specific concrete actions: building Zerobus Ingest clients, writing to Unity Catalog tables via gRPC, generating Protobuf schemas from UC tables, implementing stream-based ingestion with ACK handling and retry logic. Very detailed and actionable. | 3 / 3 |
| Completeness | Clearly answers both 'what' (build Zerobus Ingest clients for near real-time data ingestion into Databricks Delta tables via gRPC) and 'when' with an explicit 'Use when...' clause listing four distinct trigger scenarios. | 3 / 3 |
| Trigger Term Quality | Excellent coverage of natural terms a user would say: 'Zerobus Ingest', 'Databricks Delta tables', 'gRPC', 'Unity Catalog', 'Protobuf schemas', 'SDK', 'Python/Java/Go/TypeScript/Rust', 'stream-based ingestion', 'ACK handling', 'retry logic'. These are the exact terms a developer working in this domain would use. | 3 / 3 |
| Distinctiveness / Conflict Risk | Highly distinctive with a very clear niche: Zerobus Ingest is a specific product/SDK, combined with Databricks Delta tables, Unity Catalog, and gRPC. This is extremely unlikely to conflict with other skills due to the specificity of the technology stack. | 3 / 3 |
| Total | | 12 / 12 |

Passed

Implementation: 72%

Reviews the quality of instructions and guidance provided to agents. Good implementation is clear, handles edge cases, and produces reliable results.

This is a well-structured skill with strong progressive disclosure and actionable guidance, including executable code examples and clear routing to sub-documents. Its main weaknesses are redundancy (duplicated context reuse and failure handling sections, repeated documentation links) and a workflow that lacks explicit validation/verification checkpoints after ingestion. The 'Critical Learning' about timestamps feels like a patch rather than integrated guidance.
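The timestamp fix the review calls out is not reproduced on this page. A minimal sketch of what folding it into the ingestion code might look like, assuming (not confirmed by the source) that the target Delta column expects epoch microseconds, as Protobuf-encoded timestamp fields commonly do; the `event_time` field name is hypothetical:

```python
from datetime import datetime, timezone

def to_epoch_micros(dt: datetime) -> int:
    """Convert a datetime to epoch microseconds; naive values are treated as UTC."""
    if dt.tzinfo is None:
        dt = dt.replace(tzinfo=timezone.utc)
    return int(dt.timestamp() * 1_000_000)

# Populate the timestamp field at record-construction time, so the fix lives
# in the example itself rather than in a standalone "Critical Learning" note.
record = {"event_time": to_epoch_micros(datetime(2024, 1, 1, tzinfo=timezone.utc))}
```

Integrating the conversion this way means an agent copying the Python example cannot miss the format requirement.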

Suggestions

- Remove duplicate content: consolidate 'Context Reuse Pattern' and 'Handling Failures' into a single section, and remove the repeated documentation links at the bottom.
- Add an explicit validation step to the workflow (e.g., 'After ingestion, verify row count: SELECT COUNT(*) FROM target_table') to ensure successful data landing.
- Integrate the timestamp format fix into the Python example or the workflow steps rather than leaving it as a standalone 'Critical Learning' callout.
- Trim the 'What Is Zerobus Ingest?' and 'Key Concepts' sections; Claude doesn't need explanations of gRPC or at-least-once semantics; focus on Zerobus-specific constraints only.
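The retry and validation suggestions above can be sketched generically. This is not the real Zerobus SDK API; `flaky_send` and the in-memory `table` are stand-ins for a stream handle and a `SELECT COUNT(*)` query, used only to illustrate the ACK/retry/verify shape:

```python
import time

def send_with_retry(send, record, retries=3, base_delay=0.01):
    """Send one record, retrying with exponential backoff until the ACK arrives."""
    for attempt in range(retries + 1):
        try:
            return send(record)  # blocks until the server ACKs (at-least-once)
        except ConnectionError:
            if attempt == retries:
                raise
            time.sleep(base_delay * (2 ** attempt))

def verify_row_count(count_rows, expected_new, before):
    """Post-ingestion checkpoint: confirm the table grew by the expected rows."""
    after = count_rows()  # stand-in for: SELECT COUNT(*) FROM target_table
    assert after - before == expected_new, (
        f"expected {expected_new} new rows, got {after - before}"
    )

# Demo with an in-memory stand-in that fails once, then succeeds.
table = []
failures = {"n": 1}

def flaky_send(record):
    if failures["n"] > 0:
        failures["n"] -= 1
        raise ConnectionError("transient gRPC drop")
    table.append(record)
    return {"acked": True}

before = len(table)
for rec in [{"id": 1}, {"id": 2}]:
    send_with_retry(flaky_send, rec)
verify_row_count(lambda: len(table), expected_new=2, before=before)
```

The point is the workflow shape: every send either returns an ACK or exhausts its retries, and the run ends with an explicit row-count checkpoint rather than assuming success.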

| Dimension | Reasoning | Score |
| --- | --- | --- |
| Conciseness | The skill has some redundancy: the 'Context Reuse Pattern' and 'Handling Failures' sections repeat nearly identical information, the documentation links appear twice (top and bottom), and the 'What Is Zerobus Ingest?' section explains concepts Claude could infer. The 'Key Concepts' section includes some unnecessary explanation (e.g., what gRPC is). However, the decision table and troubleshooting table are efficient. | 2 / 3 |
| Actionability | The skill provides a fully executable minimal Python example, concrete pip install commands, specific SDK version requirements, and clear tool invocation patterns (execute_code with file_path, cluster_id, context_id parameters). The decision table maps scenarios directly to reference files with language and serialization specifics. | 3 / 3 |
| Workflow Clarity | The workflow section has numbered steps with a clear sequence including error recovery (step 5), but validation checkpoints are weak: there's no explicit validation step after ingestion (e.g., checking ACKs, verifying row counts in the target table). The prerequisite validation instruction ('never execute without confirming') is vague about how to confirm. The timestamp fix is flagged as critical but not integrated into the workflow. | 2 / 3 |
| Progressive Disclosure | Excellent structure with a concise overview, quick decision table routing to specific sub-files, and a detailed guides table with clear 'When to Read' guidance. References are one level deep and well-signaled. Content is appropriately split between the main skill and five sub-documents. | 3 / 3 |
| Total | | 10 / 12 |

Passed

Validation: 100%

Checks the skill against the spec for correct structure and formatting. All validation checks must pass before discovery and implementation can be scored.

Validation for skill structure: 11 / 11 checks passed. No warnings or errors.

Repository: databricks-solutions/ai-dev-kit (Reviewed)
