Upgrade Databricks runtime versions and migrate between features. Use when upgrading DBR versions, migrating to Unity Catalog, or updating deprecated APIs and features. Trigger with phrases like "databricks upgrade", "DBR upgrade", "databricks migration", "unity catalog migration", "hive to unity".
Overall quality: 83%. Impact: — (no eval scenarios have been run). Best practices: Passed, no known issues.

## Discovery: 89%

*Based on the skill's description, can an agent find and select it at the right time? Clear, specific descriptions lead to better discovery.*
This is a well-structured skill description with strong trigger terms and clear 'what/when' guidance. The explicit trigger phrases section is a notable strength that aids skill selection. The main weakness is that the capability description could be more specific about the concrete actions performed (e.g., updating specific API calls, modifying cluster configurations, rewriting Hive metastore references).
**Suggestions**

- Add more specific concrete actions to the first sentence, e.g., 'Updates cluster configurations, rewrites deprecated API calls, converts Hive metastore references to Unity Catalog namespaces' to improve specificity.
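To make the suggested "converts Hive metastore references to Unity Catalog namespaces" action concrete, a minimal sketch of such a conversion is shown below. The helper name and the `main` catalog default are illustrative assumptions, not part of the skill itself:

```python
def to_unity_catalog(table_ref: str, catalog: str = "main") -> str:
    """Rewrite a two-level Hive metastore reference (db.table) into a
    three-level Unity Catalog reference (catalog.db.table). References
    that are already three-level are returned unchanged."""
    parts = table_ref.split(".")
    if len(parts) == 3:
        return table_ref                      # already catalog-qualified
    if len(parts) == 2:
        return f"{catalog}.{table_ref}"
    raise ValueError(f"unexpected table reference: {table_ref!r}")
```

A real migration would resolve references with a SQL parser or Databricks' own `SYNC` tooling rather than string manipulation, but a helper of this shape conveys the kind of concrete action the description could name.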
| Dimension | Reasoning | Score |
|---|---|---|
| Specificity | Names the domain (Databricks runtime) and some actions (upgrade versions, migrate between features, update deprecated APIs), but doesn't list comprehensive concrete actions such as specific migration steps, code transformations, or configuration changes. | 2 / 3 |
| Completeness | Clearly answers both 'what' (upgrade Databricks runtime versions, migrate between features) and 'when' (an explicit 'Use when' clause covering DBR upgrades, Unity Catalog migration, and deprecated APIs, plus explicit trigger phrases). | 3 / 3 |
| Trigger Term Quality | Excellent coverage of natural trigger terms, including 'databricks upgrade', 'DBR upgrade', 'databricks migration', 'unity catalog migration', and 'hive to unity': phrases users would naturally say when they need this skill. | 3 / 3 |
| Distinctiveness / Conflict Risk | Highly distinctive, with a clear niche around Databricks runtime upgrades and Unity Catalog migration. Specific trigger terms like 'DBR upgrade' and 'hive to unity' are unlikely to conflict with other skills. | 3 / 3 |
| **Total** | | **11 / 12 (Passed)** |
## Implementation: 77%

*Reviews the quality of instructions and guidance provided to agents. Good implementation is clear, handles edge cases, and produces reliable results.*
This is a strong, highly actionable skill with executable code for every major step and good validation checkpoints throughout. Its main weakness is that it packs a lot of content into a single file without leveraging bundle files for progressive disclosure, making it token-heavy for the context window. Minor verbosity in framing sections (Prerequisites, Output) could be trimmed.
**Suggestions**

- Split detailed content (the bulk migration script, API migration examples, and version compatibility matrix) into separate bundle files referenced from the main SKILL.md to improve progressive disclosure and reduce token cost.
- Remove the 'Output' section (it restates what the steps already accomplish) and trim 'Prerequisites' to a single line, or remove it entirely, to improve conciseness.
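One possible shape for that bundle-file split, with hypothetical file names chosen for illustration:

```
databricks-upgrade/
├── SKILL.md                    # core four-step workflow, links to files below
├── references/
│   ├── version-matrix.md       # DBR version compatibility matrix
│   └── api-migrations.md       # old-vs-new API migration examples
└── scripts/
    └── bulk_migrate.py         # bulk table migration with row-count checks
```

The main SKILL.md then stays short in the context window, and the agent reads the reference files only when the relevant step is reached.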
| Dimension | Reasoning | Score |
|---|---|---|
| Conciseness | The skill is fairly efficient and avoids explaining basic concepts, but includes some unnecessary sections: 'Prerequisites' (Claude knows to check access) and the 'Output' section, which just restates what the steps already cover. The version compatibility matrix and error handling table earn their place, but overall it could be tightened. | 2 / 3 |
| Actionability | Excellent actionability throughout: executable Python scripts with the Databricks SDK, complete SQL statements for each migration option (SYNC / CTAS / DEEP CLONE), concrete API migration examples showing old vs. new patterns, and a bulk migration script with row-count verification. All code is copy-paste ready, with real SDK classes and methods. | 3 / 3 |
| Workflow Clarity | The four steps are clearly sequenced (runtime upgrade, UC migration, API migration, Delta protocol upgrade). Validation checkpoints are present: dry_run before applying cluster changes, row-count verification after table migration, DESCRIBE DETAIL before protocol upgrade, and explicit warnings about irreversible operations. The error handling table provides a feedback loop for common failure modes. | 3 / 3 |
| Progressive Disclosure | The content is well structured, with clear sections and a logical flow, but it is a long monolithic file (~200 lines of substantive content) with no bundle files to offload detail into. The version compatibility matrix, bulk migration script, and API migration examples could each be a separate referenced file. The 'Resources' section links to external docs and 'Next Steps' references another skill, which is good but doesn't compensate for the inline density. | 2 / 3 |
| **Total** | | **10 / 12 (Passed)** |
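The runtime-upgrade step scored above hinges on comparing DBR version strings. A minimal sketch of that comparison, assuming version strings of the form `13.3.x-scala2.12` (the helper functions are illustrative, not taken from the skill):

```python
def dbr_version(spark_version: str) -> tuple:
    """Extract the numeric (major, minor) pair from a Databricks runtime
    string such as '13.3.x-scala2.12' or '14.3.x-photon-scala2.12'."""
    release = spark_version.split("-")[0]       # e.g. '13.3.x'
    major, minor = release.split(".")[:2]
    return int(major), int(minor)

def needs_upgrade(current: str, target: str) -> bool:
    """True when the cluster's runtime is older than the target runtime."""
    return dbr_version(current) < dbr_version(target)
```

In the skill's workflow, a check like this would gate the cluster-edit call (ideally with the dry_run validation the review notes) so that clusters already on the target runtime are skipped.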
## Validation: 81% (9 / 11 Passed)

*Checks the skill against the spec for correct structure and formatting. All validation checks must pass before discovery and implementation can be scored.*

Validation for skill structure:
| Criteria | Description | Result |
|---|---|---|
| allowed_tools_field | 'allowed-tools' contains unusual tool name(s) | Warning |
| frontmatter_unknown_keys | Unknown frontmatter key(s) found; consider removing or moving to metadata | Warning |
| **Total** | | **9 / 11 Passed** |
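Both warnings point at the frontmatter. A hedged sketch of the shape the validator expects, with illustrative key values (the actual tool names and custom keys in this skill are not shown in the report):

```yaml
---
name: databricks-upgrade
description: Upgrade Databricks runtime versions and migrate between features. ...
allowed-tools: Bash, Read, Edit   # restrict to recognized tool names
metadata:
  author: example                 # unknown top-level keys moved under metadata
---
```

Renaming unrecognized tools and relocating custom keys under `metadata` would clear both warnings and bring the validation score to 11 / 11.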