
authoring-dags

Workflow and best practices for writing Apache Airflow DAGs. Use when the user wants to create a new DAG, write pipeline code, or asks about DAG patterns and conventions. For testing and debugging DAGs, see the testing-dags skill.

89

Quality: 87% (Does it follow best practices?)

Impact: Pending (No eval scenarios have been run)

Security (by Snyk): Passed (No known issues)


Quality

Discovery

89%

Based on the skill's description, can an agent find and select it at the right time? Clear, specific descriptions lead to better discovery.

This is a solid skill description that clearly identifies its domain (Airflow DAG writing), provides explicit trigger guidance via a 'Use when' clause, and helpfully delineates its boundary from a related testing skill. The main weakness is that the 'what' portion could be more specific about the concrete actions and capabilities covered, such as defining operators, setting schedules, or configuring task dependencies.

Suggestions

Add more specific concrete actions to the capability list, e.g., 'Covers defining operators, setting schedules, configuring task dependencies, and structuring DAG files.'

| Dimension | Reasoning | Score |
| --- | --- | --- |
| Specificity | Names the domain (Apache Airflow DAGs) and mentions some actions ('create a new DAG', 'write pipeline code', 'DAG patterns and conventions'), but doesn't list specific concrete actions like defining operators, setting schedules, configuring dependencies, or handling retries. | 2 / 3 |
| Completeness | Clearly answers both 'what' (workflow and best practices for writing Airflow DAGs) and 'when' (explicit 'Use when' clause covering creating DAGs, writing pipeline code, or asking about patterns). Also helpfully distinguishes scope by pointing to a separate testing-dags skill. | 3 / 3 |
| Trigger Term Quality | Includes strong natural keywords users would say: 'Airflow', 'DAG', 'DAGs', 'pipeline code', 'DAG patterns', 'conventions', 'create a new DAG'. These cover the most common ways users would phrase requests about writing Airflow DAGs. | 3 / 3 |
| Distinctiveness / Conflict Risk | Clearly scoped to Apache Airflow DAG authoring specifically, with explicit boundary drawn against the testing-dags skill. The combination of 'Airflow' + 'DAG' + 'writing/creating' makes this highly distinctive and unlikely to conflict with other skills. | 3 / 3 |
| Total | | 11 / 12 |

Passed

Implementation

85%

Reviews the quality of instructions and guidance provided to agents. Good implementation is clear, handles edge cases, and produces reliable results.

This is a well-structured skill with strong actionability and workflow clarity. The phased approach with explicit validation checkpoints and clear CLI command references makes it highly usable. The main weakness is moderate verbosity—the ASCII diagram and some thin phases (Plan, Implement) could be tightened to save tokens without losing clarity.

Suggestions

Replace the ASCII workflow diagram with a compact numbered list (e.g., '1. Discover → 2. Plan → 3. Implement → 4. Validate → 5. Test → 6. Iterate') to save ~15 lines of tokens.

Condense Phase 2 (Plan) and Phase 3 (Implement) into a single brief section since they contain minimal unique actionable content beyond 'propose structure and get approval' and 'write the DAG'.

| Dimension | Reasoning | Score |
| --- | --- | --- |
| Conciseness | The content is mostly efficient but includes some unnecessary elements: the large ASCII workflow diagram could be replaced with a simple numbered list, and some phases (Plan, Implement) are thin on unique content while still taking up space. The discovery example questions are a nice touch but slightly verbose. | 2 / 3 |
| Actionability | Provides concrete, executable CLI commands throughout (`af dags errors`, `af dags get`, `af runs trigger-wait`), specific glob patterns for discovery, clear tables mapping commands to purposes, and copy-paste ready bash examples. Each phase has specific actions to take. | 3 / 3 |
| Workflow Clarity | The 6-phase workflow is clearly sequenced with explicit validation checkpoints in Phase 4 (check errors → verify DAG exists → check warnings → explore structure), a feedback loop in Phase 6 (fix → re-validate → re-test), and conditional branching (if errors appear → fix and retry). The validation phase is particularly well-structured with ordered steps. | 3 / 3 |
| Progressive Disclosure | Excellent progressive disclosure: the skill provides a clear overview with well-signaled one-level-deep references to testing-dags, debugging-dags, deploying-airflow, migrating-airflow-2-to-3, and reference/best-practices.md. Testing is appropriately delegated to the testing-dags skill rather than duplicated inline. The Related Skills section provides clear navigation. | 3 / 3 |
| Total | | 11 / 12 |

Passed
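The validate-then-test loop described above can be sketched as a command sequence. The command names are taken from the review; the ordering, arguments, and the `example_etl` DAG id are assumptions added here for illustration:

```
af dags errors                     # 1. surface import/parse errors
af dags get example_etl            # 2. verify the DAG exists and inspect its structure
af runs trigger-wait example_etl   # 3. trigger a run and wait for the result
# if errors appear at any step: fix the DAG file, then re-run from step 1
```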

Validation

90%

Checks the skill against the spec for correct structure and formatting. All validation checks must pass before discovery and implementation can be scored.

Validation: 10 / 11 Passed

Validation for skill structure

| Criteria | Description | Result |
| --- | --- | --- |
| frontmatter_unknown_keys | Unknown frontmatter key(s) found; consider removing or moving to metadata | Warning |
| Total | | 10 / 11 |

Passed

Repository: astronomer/agents (Reviewed)
