coding-agent-helpers/compact-handoff

Use when work needs to be handed off to another agent or another human. Produce a continuation-ready brief with the objective, completed work, assumptions, unresolved issues, and next action instead of a generic summary. Good triggers include "prepare a handoff", "make this resumable", and "summarize this for another agent".

Overall: 92 (1.41x)

Quality: 100% (does it follow best practices?)

Impact: 89% (1.41x), average score across 8 eval scenarios

Security (by Snyk): Passed, no known issues


evals/scenario-4/task.md

ETL Pipeline — Shift Change Brief

A data engineering team is building a nightly ETL pipeline that ingests raw sales data from an S3 bucket, transforms it, and loads it into a data warehouse. The engineer who built the first stages is rotating off the project and needs to hand over to someone new.

Current state

What is working

  • Extraction step: reads Parquet files from s3://sales-data-raw/daily/ using the Spark job in jobs/extract.py; runs successfully in the test environment
  • Basic schema validation: checks for required columns and rejects files with >5% nulls in the order_id field; logic is in jobs/validate.py
  • The two jobs are orchestrated by Airflow; the DAG definition is in dags/sales_pipeline.py and has been deployed to the staging Airflow instance
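
The null-rate check above can be sketched in plain Python. This is a simplified, dependency-free stand-in for the logic in jobs/validate.py (which runs on Spark DataFrames); the 5% threshold and the order_id field come from the brief, while the function name and record format are illustrative:

```python
# Simplified stand-in for the check in jobs/validate.py: reject a batch
# when more than 5% of records have a null order_id.
NULL_THRESHOLD = 0.05  # from the brief: >5% nulls in order_id is a rejection

def passes_null_check(records, field="order_id", threshold=NULL_THRESHOLD):
    """Return True if the fraction of records with a null `field` is within threshold."""
    if not records:
        return True  # an empty batch has no nulls to reject
    null_count = sum(1 for r in records if r.get(field) is None)
    return null_count / len(records) <= threshold
```

For example, a batch with 1 null order_id out of 10 records (10% nulls) would be rejected, while 1 out of 100 (1%) would pass.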

What is not yet done

  • Transformation step (jobs/transform.py) is stubbed out — currency normalisation and deduplication logic are not implemented
  • Load step (jobs/load.py) does not exist yet
  • No error alerting is configured; runs that fail are still marked as successful in Airflow
  • Performance of the extraction job has not been profiled at production data volumes (estimated 10× larger than test data)
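
To make the transform gap concrete, here is a minimal sketch of the two missing steps: currency normalisation and deduplication. The rate table, field names (amount, currency, updated_at), and the keep-latest rule are assumptions for illustration, not confirmed requirements, and the real jobs/transform.py would do this on Spark DataFrames rather than Python dicts:

```python
# Illustrative sketch of the two unimplemented transform steps.
RATES_TO_USD = {"USD": 1.0, "EUR": 1.1, "GBP": 1.3}  # placeholder rates

def normalise_currency(record, rates=RATES_TO_USD):
    """Add an amount_usd field converted at a fixed rate (rates are placeholders)."""
    out = dict(record)
    out["amount_usd"] = record["amount"] * rates[record["currency"]]
    return out

def deduplicate(records):
    """Keep the latest record per order_id (assumes an updated_at field exists)."""
    latest = {}
    for r in records:
        key = r["order_id"]
        if key not in latest or r["updated_at"] > latest[key]["updated_at"]:
            latest[key] = r
    return list(latest.values())
```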

Open decisions

  • Should deduplication be done in Spark (during transform) or in the warehouse using a merge strategy? The tech lead has a preference but it has not been confirmed in writing
  • The warehouse target table schema has a draft but was not signed off by the analytics team
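
For the deduplication decision, the warehouse-side option boils down to upsert-by-key semantics. The sketch below mimics what a SQL MERGE keyed on order_id would do, in plain Python; a real implementation would be a MERGE statement in the warehouse, and the function name and last-write-wins rule here are illustrative:

```python
# In-memory illustration of the warehouse MERGE alternative: update
# matching rows by order_id, insert new ones (upsert, last write wins).
def merge_upsert(target, incoming, key="order_id"):
    """Return target with incoming rows merged in by `key`."""
    merged = {row[key]: row for row in target}
    for row in incoming:
        merged[row[key]] = row  # matched rows updated, unmatched rows inserted
    return list(merged.values())
```

The trade-off in the open question is roughly: Spark-side dedup keeps the warehouse load append-only but reprocesses history, while a warehouse MERGE keeps the pipeline simpler at the cost of a heavier load step.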

Relevant commands

  • Run tests: pytest jobs/tests/ -q
  • Trigger manual Airflow run: airflow dags trigger sales_pipeline
  • Airflow UI (staging): http://airflow-staging.internal:8080

Write a handoff document for the engineer taking over the pipeline build.
