
etl-pipeline-design

This skill should be used when the user asks to "design an ETL pipeline", "build data ingestion", "set up data orchestration", "troubleshoot pipeline issues", "optimize data workflows", or mentions ELT, medallion architecture, batch vs streaming, or data transformation patterns.

Install with Tessl CLI

npx tessl i github:back1ply/LLM-Skills --skill etl-pipeline-design

Overall score: 67%


Discovery: 37%

Based on the skill's description, can an agent find and select it at the right time? Clear, specific descriptions lead to better discovery.

This description has the unusual problem of being all trigger terms with no capability description. While it excels at listing when to use the skill with comprehensive natural language triggers, it completely fails to explain what the skill actually does. A user or Claude cannot determine what actions or outputs this skill provides.

Suggestions

Add a capability statement at the beginning describing concrete actions (e.g., 'Designs and implements ETL/ELT pipelines, configures data orchestration tools, creates transformation logic, and troubleshoots data flow issues.')

Restructure to lead with 'what it does' followed by a 'Use when...' clause; the current format inverts the expected pattern

Include specific outputs or deliverables the skill produces (e.g., 'pipeline configurations', 'DAG definitions', 'transformation scripts')

| Dimension | Reasoning | Score |
| --- | --- | --- |
| Specificity | The description contains no concrete actions - it only lists trigger phrases without explaining what the skill actually does. There are no specific capabilities like 'extract data', 'transform schemas', or 'schedule jobs' mentioned. | 1 / 3 |
| Completeness | The description only addresses 'when' (trigger conditions) but completely omits 'what' - there is no explanation of what capabilities or actions this skill provides. This is the inverse of the typical problem. | 1 / 3 |
| Trigger Term Quality | Excellent coverage of natural terms users would say: 'design an ETL pipeline', 'build data ingestion', 'set up data orchestration', 'troubleshoot pipeline issues', 'optimize data workflows', plus technical terms like 'ELT', 'medallion architecture', 'batch vs streaming'. | 3 / 3 |
| Distinctiveness / Conflict Risk | The ETL/data pipeline domain is fairly specific, and terms like 'medallion architecture' and 'data orchestration' are distinctive. However, without knowing what the skill actually does, it's unclear how it would differentiate from other data-related skills. | 2 / 3 |
| Total | | 7 / 12 |

Passed

Implementation: 73%

Reviews the quality of instructions and guidance provided to agents. Good implementation is clear, handles edge cases, and produces reliable results.

This is a well-structured, comprehensive ETL pipeline design skill that excels in organization and token efficiency. The content provides excellent conceptual frameworks and decision criteria but could benefit from more executable code examples and explicit validation checkpoints in workflows. The progressive disclosure to reference files is exemplary.
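The "single UPSERT SQL example" mentioned above refers to the standard idempotent-load pattern. A minimal sketch of that pattern in Python with SQLite (the `dim_customer` table and its columns are hypothetical, chosen only for illustration):

```python
import sqlite3

# Hypothetical target table for an incremental load.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT, updated_at TEXT)"
)

def upsert(rows):
    # INSERT ... ON CONFLICT updates existing keys instead of failing,
    # so re-running the same load is safe (idempotent).
    conn.executemany(
        """INSERT INTO dim_customer (id, name, updated_at)
           VALUES (?, ?, ?)
           ON CONFLICT(id) DO UPDATE SET
               name = excluded.name,
               updated_at = excluded.updated_at""",
        rows,
    )
    conn.commit()

upsert([(1, "Ada", "2024-01-01"), (2, "Grace", "2024-01-01")])
upsert([(1, "Ada L.", "2024-01-02")])  # re-run updates row 1 in place
```

The second `upsert` call touches only row 1; row 2 is untouched and no duplicate rows are created.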

Suggestions

Add executable code examples for common ingestion patterns (e.g., Python code for API ingestion, database extraction) beyond the single UPSERT SQL example

Include explicit validation checkpoints in the Decision Framework, such as 'Validate schema before loading' or 'Test transformation logic on sample data before full run'
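Such a checkpoint can be as small as asserting expected columns and types before the load step; the column names below are illustrative only:

```python
# Hypothetical contract for the incoming records.
EXPECTED_SCHEMA = {"order_id": int, "amount": float, "placed_at": str}

def validate_schema(rows):
    # Fail fast before loading: every row must carry exactly the
    # expected columns, each with the expected Python type.
    for i, row in enumerate(rows):
        if set(row) != set(EXPECTED_SCHEMA):
            raise ValueError(f"row {i}: columns {sorted(row)} do not match expected schema")
        for col, typ in EXPECTED_SCHEMA.items():
            if not isinstance(row[col], typ):
                raise TypeError(
                    f"row {i}: {col} is {type(row[col]).__name__}, expected {typ.__name__}"
                )
    return rows  # unchanged; validation is a pass-through gate
```

Placing this gate between extract and load means a schema drift aborts the run before any partial write, which is exactly the checkpoint the suggestion asks for.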

Add a concrete troubleshooting workflow with validation steps: detect issue → diagnose using lineage → fix → verify fix → deploy
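One way to make that workflow concrete is a loop that re-verifies each stage after its fix is applied; the stage names and checks below are a toy illustration, not part of the reviewed skill:

```python
def run_checks(pipeline):
    # Each check returns True/False for its stage; detection is simply
    # the first failing stage in dependency order.
    return [(stage, check(pipeline)) for stage, check in pipeline["checks"]]

def troubleshoot(pipeline, fixes):
    # detect -> diagnose (failing stage) -> fix -> verify fix -> continue.
    for stage, passed in run_checks(pipeline):
        if not passed:
            fixes[stage](pipeline)                    # apply the stage-specific fix
            assert dict(run_checks(pipeline))[stage], f"{stage} still failing"
    return all(ok for _, ok in run_checks(pipeline))  # final verification pass

# Toy pipeline state: neither stage has succeeded yet.
pipeline = {"checks": [("extract", lambda p: p.get("extracted", False)),
                       ("load", lambda p: p.get("loaded", False))]}
fixes = {"extract": lambda p: p.update(extracted=True),
         "load": lambda p: p.update(loaded=True)}
print(troubleshoot(pipeline, fixes))
```

The assert after each fix is the "verify fix" step the suggestion calls for; a deploy step would follow only after the final verification pass returns `True`.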

| Dimension | Reasoning | Score |
| --- | --- | --- |
| Conciseness | The content is highly efficient with well-organized tables, minimal prose, and no unnecessary explanations of concepts Claude would already know. Every section delivers actionable information without padding. | 3 / 3 |
| Actionability | Provides good conceptual frameworks, checklists, and one executable SQL example, but most guidance is descriptive rather than executable. Decision trees use text diagrams rather than concrete code, and many patterns lack copy-paste ready implementations. | 2 / 3 |
| Workflow Clarity | The Decision Framework section provides a clear 8-step sequence, but lacks explicit validation checkpoints or feedback loops for error recovery. For pipeline design involving potentially destructive operations, missing validation steps cap this at 2. | 2 / 3 |
| Progressive Disclosure | Excellent structure with a clear overview, well-organized sections, and explicit one-level-deep references to detailed pattern files at the end. Navigation is easy and content is appropriately split between overview and detailed reference files. | 3 / 3 |
| Total | | 10 / 12 |

Passed

Validation: 100%

Checks the skill against the spec for correct structure and formatting. All validation checks must pass before discovery and implementation can be scored.

Validation: 11 / 11 passed

Validation for skill structure

No warnings or errors.

Reviewed

