Execute comprehensive platform migrations to Databricks from legacy systems. Use when migrating from on-premises Hadoop, other cloud platforms, or legacy data warehouses to Databricks. Trigger with phrases like "migrate to databricks", "hadoop migration", "snowflake to databricks", "legacy migration", "data warehouse migration".
- Overall score: 85
- Quality — does it follow best practices? 83%
- Impact: Pending — no eval scenarios have been run
- Known issues: Passed — no known issues
Discovery — 89%

Based on the skill's description, can an agent find and select it at the right time? Clear, specific descriptions lead to better discovery.
This is a well-structured skill description with strong trigger terms and explicit 'Use when' guidance that makes it easy for Claude to select appropriately. Its main weakness is the lack of specific concrete actions—it says 'execute comprehensive platform migrations' without detailing what that entails (e.g., schema conversion, query translation, pipeline migration, data transfer). Adding specific capabilities would elevate the description further.
Suggestions
Replace 'execute comprehensive platform migrations' with specific concrete actions like 'convert Hive/SQL queries to Spark SQL, migrate ETL pipelines, transfer schemas and data, reconfigure access controls' to improve specificity.
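Applied to this skill's frontmatter, that suggestion might read as follows; the revised description is a hypothetical illustration (the `name` value is assumed), not the skill's actual metadata:

```yaml
---
name: databricks-migration
description: >
  Migrate legacy platforms to Databricks: convert Hive and warehouse SQL to
  Spark SQL, migrate ETL pipelines, transfer schemas and data, and reconfigure
  access controls. Use when migrating from on-premises Hadoop, other cloud
  platforms, or legacy data warehouses. Trigger with phrases like "migrate to
  databricks", "hadoop migration", "snowflake to databricks", "legacy
  migration", "data warehouse migration".
---
```

This keeps the strong trigger terms intact while front-loading the concrete actions the review asks for.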
| Dimension | Reasoning | Score |
|---|---|---|
| Specificity | It names the domain (platform migrations to Databricks) and mentions source systems (Hadoop, cloud platforms, legacy data warehouses), but doesn't list specific concrete actions like 'convert Hive queries to Spark SQL', 'migrate ETL pipelines', or 'transfer data schemas'. The phrase 'execute comprehensive platform migrations' is somewhat vague about what concrete steps are involved. | 2 / 3 |
| Completeness | Clearly answers both 'what' (execute platform migrations to Databricks from legacy systems) and 'when' (explicit 'Use when' clause specifying migration scenarios, plus a 'Trigger with phrases' section listing specific trigger terms). Both components are explicitly stated. | 3 / 3 |
| Trigger Term Quality | Excellent coverage of natural trigger terms including 'migrate to databricks', 'hadoop migration', 'snowflake to databricks', 'legacy migration', 'data warehouse migration'. These are phrases users would naturally say, and the description also includes contextual terms like 'on-premises Hadoop', 'cloud platforms', and 'legacy data warehouses'. | 3 / 3 |
| Distinctiveness / Conflict Risk | Highly distinctive with a clear niche: Databricks-specific migrations from named source platforms. The specific mention of Databricks, Hadoop, Snowflake, and legacy data warehouses creates a well-defined scope that is unlikely to conflict with general data engineering or other platform-specific skills. | 3 / 3 |
| **Total** | | 11 / 12 — Passed |
Implementation — 77%

Reviews the quality of instructions and guidance provided to agents. Good implementation is clear, handles edge cases, and produces reliable results.
This is a strong, highly actionable migration skill with executable code for each step and good validation/rollback coverage. Its main weakness is length—it's a monolithic document that could benefit from splitting source-specific migration patterns into separate files. The workflow is well-sequenced with explicit validation checkpoints and error handling.
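The per-step code the review credits can be illustrated with the type-mapping step; the sketch below is a hypothetical stand-in, not the skill's actual code (the function name and mapping table are assumptions, and real mappings would preserve precision rather than defaulting it):

```python
# Hypothetical Snowflake -> Databricks (Delta) type mapping. A real skill
# would carry a fuller table covering semi-structured and temporal types.
SNOWFLAKE_TO_DELTA = {
    "NUMBER": "DECIMAL(38, 0)",   # simplification: drops declared precision
    "FLOAT": "DOUBLE",
    "VARCHAR": "STRING",
    "BOOLEAN": "BOOLEAN",
    "TIMESTAMP_NTZ": "TIMESTAMP",
    "VARIANT": "STRING",          # often parsed downstream with from_json
}

def map_type(source_type: str) -> str:
    """Translate a source column type to an assumed Delta Lake equivalent."""
    base = source_type.split("(")[0].strip().upper()
    if base not in SNOWFLAKE_TO_DELTA:
        raise ValueError(f"No mapping for source type: {source_type}")
    return SNOWFLAKE_TO_DELTA[base]
```

A lookup like this is what makes schema conversion copy-paste ready: each source column type resolves deterministically, and unmapped types fail loudly instead of migrating silently as strings.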
Suggestions
Split source-specific migration details (Snowflake, Redshift, JDBC/legacy) into separate referenced files to improve progressive disclosure and reduce the main skill's token footprint.
Remove the Overview paragraph which largely restates the YAML description and step headings—jump straight to Prerequisites or the Migration Patterns table.
| Dimension | Reasoning | Score |
|---|---|---|
| Conciseness | The skill is fairly comprehensive but includes some unnecessary verbosity—the overview section restates what the title and description already convey, and some code comments explain things Claude would already know. The migration patterns table and type mapping are efficient, but the overall length (~250 lines) could be tightened, especially the cutover planning section which uses a Python list of dicts where a simple table would suffice. | 2 / 3 |
| Actionability | The skill provides fully executable Python and SQL code for each migration step—discovery/assessment, schema conversion, data migration with multiple methods (SYNC, DEEP CLONE, CTAS, JDBC), Snowflake/Redshift-specific patterns, and bulk migration scripts. Code is copy-paste ready with concrete function signatures, type mappings, and validation logic. | 3 / 3 |
| Workflow Clarity | The 6-step workflow is clearly sequenced from discovery through cutover, with explicit validation at each stage (row count matching in Step 3, validation queries in Step 6). The cutover plan includes rollback procedures for each step, and the error handling table addresses common failure modes with specific solutions. The migrate_table function includes built-in validation with match/mismatch status. | 3 / 3 |
| Progressive Disclosure | The skill is a long monolithic document (~250+ lines of content) that could benefit from splitting detailed code (e.g., Snowflake/Redshift specifics, ETL conversion patterns) into separate referenced files. The Resources section links to external docs which is good, but the inline content is dense and would be better organized with a concise overview pointing to detailed sub-files for each migration source type. | 2 / 3 |
| **Total** | | 10 / 12 — Passed |
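The match/mismatch validation attributed to migrate_table above can be sketched without a Spark dependency; the names below are hypothetical stand-ins for how such a check is typically structured, not the skill's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class MigrationResult:
    """Row counts for one table, captured on both platforms after copy."""
    table: str
    source_rows: int
    target_rows: int

    @property
    def status(self) -> str:
        # "match" gates the table for cutover; "mismatch" flags it for re-run.
        return "match" if self.source_rows == self.target_rows else "mismatch"

def validate_migration(results: list[MigrationResult]) -> list[str]:
    """Return the names of tables whose row counts diverged after migration."""
    return [r.table for r in results if r.status == "mismatch"]
```

In a real migration, the two counts would come from `COUNT(*)` queries run against the source system and the Databricks target before cutover is approved.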
Validation — 81%

Checks the skill against the spec for correct structure and formatting. All validation checks must pass before discovery and implementation can be scored.
Validation — 9 / 11 Passed
Validation for skill structure
| Criteria | Description | Result |
|---|---|---|
| allowed_tools_field | `allowed-tools` contains unusual tool name(s) | Warning |
| frontmatter_unknown_keys | Unknown frontmatter key(s) found; consider removing or moving to `metadata` | Warning |
| **Total** | | 9 / 11 Passed |