
data-engineering

Transforms, validates, and loads data in ETL pipelines. Use when building scrapers, validating NDJSON feeds, or importing data into CMS/DB targets.

Score: 100

Quality: 100% (Does it follow best practices?)

Impact: Pending (No eval scenarios have been run)

Security by Snyk: Advisory (Suggest reviewing before use)


Quality

Discovery: 100%

Based on the skill's description, can an agent find and select it at the right time? Clear, specific descriptions lead to better discovery.

This is a strong, well-crafted description that concisely covers specific capabilities, includes an explicit 'Use when' clause with natural trigger terms, and carves out a distinct niche around ETL/data pipeline work. It uses proper third-person voice and avoids vague language or unnecessary verbosity.

Dimension | Reasoning | Score

Specificity | Lists multiple specific, concrete actions: 'transforms', 'validates', and 'loads data in ETL pipelines', plus mentions scrapers, NDJSON feeds, and CMS/DB targets as concrete use cases. | 3 / 3

Completeness | Clearly answers both 'what' (transforms, validates, loads data in ETL pipelines) and 'when' (explicit 'Use when' clause covering building scrapers, validating NDJSON feeds, or importing data into CMS/DB targets). | 3 / 3

Trigger Term Quality | Includes strong natural keywords users would say: 'ETL', 'scrapers', 'NDJSON', 'feeds', 'importing data', 'CMS', 'DB', 'validates', 'pipelines'. These cover common variations of how users would describe data pipeline work. | 3 / 3

Distinctiveness / Conflict Risk | The combination of ETL pipelines, NDJSON feeds, scrapers, and CMS/DB targets creates a clear niche that is unlikely to conflict with general data processing or database skills. The specificity of 'NDJSON' and 'ETL' strongly narrows the domain. | 3 / 3

Total | 12 / 12 | Passed

Implementation: 100%

Reviews the quality of instructions and guidance provided to agents. Good implementation is clear, handles edge cases, and produces reliable results.

This is an excellent skill file that is concise, highly actionable, and well-structured. The numbered workflow with explicit checkpoints and recovery steps is particularly strong for a data pipeline context involving destructive batch operations. The progressive disclosure to REFERENCE.md is clean and well-signaled.

Dimension | Reasoning | Score

Conciseness | Every section earns its place: the schema table is compact, workflow steps are dense with checkpoints, and code examples are minimal but functional. No unnecessary explanations of what NDJSON is or how browsers work; assumes Claude's competence throughout. | 3 / 3

Actionability | Provides executable bash pipeline commands, a complete inline Zod-based NDJSON validator, specific config values (retryLimit, timeout), and a concrete schema table. Guidance is copy-paste ready and specific. | 3 / 3
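The review credits the skill with an inline Zod-based NDJSON validator. As a rough, dependency-free sketch of what per-line validation looks like (the skill itself reportedly uses Zod; the field names `id` and `title` here are illustrative assumptions, not the skill's actual schema):

```typescript
// Minimal NDJSON validation sketch. Each line must be valid JSON and
// match the expected record shape; failures are collected per line so a
// feed can be partially accepted or rejected as a whole.
type Row = { id: string; title: string };

function parseLine(line: string, lineNo: number): Row {
  let obj: unknown;
  try {
    obj = JSON.parse(line);
  } catch {
    throw new Error(`line ${lineNo}: invalid JSON`);
  }
  const rec = obj as Record<string, unknown>;
  // Hand-rolled shape check standing in for a Zod schema's safeParse.
  if (typeof rec.id !== "string" || typeof rec.title !== "string") {
    throw new Error(`line ${lineNo}: missing or mistyped id/title`);
  }
  return { id: rec.id, title: rec.title };
}

function validateNdjson(text: string): { rows: Row[]; errors: string[] } {
  const rows: Row[] = [];
  const errors: string[] = [];
  text.split("\n").forEach((line, i) => {
    if (line.trim() === "") return; // blank lines are not records
    try {
      rows.push(parseLine(line, i + 1));
    } catch (e) {
      errors.push((e as Error).message);
    }
  });
  return { rows, errors };
}
```

Collecting errors rather than failing fast matches the batch/feed setting: the caller can decide whether any error should block the import.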

Workflow Clarity | The 5-step workflow includes explicit checkpoints at each stage, recovery actions for failures, a backup step before destructive import, and a revert-on-failure instruction. This is a textbook example of a validation-gated workflow with feedback loops for a batch/destructive operation. | 3 / 3
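The backup-then-revert pattern praised here can be sketched as follows (the step names and `backup`/`revert` helpers are hypothetical placeholders, not the skill's actual code):

```typescript
// Hedged sketch of a validation-gated destructive import:
// take a backup first, run steps with per-step checkpoints,
// and revert to the backup on any failure.
type Step = { name: string; run: () => void };

function runGatedImport(
  backup: () => void,
  steps: Step[],
  revert: () => void
): string[] {
  const log: string[] = [];
  backup(); // checkpoint before any destructive work
  log.push("backup taken");
  try {
    for (const step of steps) {
      step.run();
      log.push(`${step.name}: ok`); // explicit checkpoint per step
    }
  } catch (e) {
    revert(); // restore the pre-import state on failure
    log.push(`reverted: ${(e as Error).message}`);
  }
  return log;
}
```

The point of the structure is that no destructive step runs before the backup exists, and every failure path ends in a known-good state.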

Progressive Disclosure | SKILL.md serves as a concise overview with clear one-level-deep references to REFERENCE.md for full scraper code, extended validators, and project-specific schemas. Content is appropriately split between inline essentials and external detail. | 3 / 3
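The one-level-deep reference pattern described above might look like this hypothetical SKILL.md fragment (the section heading is an assumption; only REFERENCE.md is named in the review):

```markdown
## Scraper

Core pipeline commands are inline above. For the full scraper code,
extended validators, and project-specific schemas, see REFERENCE.md.
```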

Total | 12 / 12 | Passed

Validation: 100%

Checks the skill against the spec for correct structure and formatting. All validation checks must pass before discovery and implementation can be scored.

Validation for skill structure: 11 / 11 checks passed.

No warnings or errors.

Repository: monkilabs/opencastle (Reviewed)

