
tessl-labs/audit-logs

Collect and normalize agent logs, discover installed verifiers, and dispatch LLM judges to evaluate adherence. Produces per-session verdicts and aggregated reports.

Quality: 90% (does it follow best practices?)
Impact: 96%, 3.09x average score across 3 eval scenarios
Security (by Snyk): Passed, no known issues

evals/scenario-3/task.md

Add Compliance Verifiers to the Data Pipeline Tile

Problem Description

Your team maintains a local tile at tiles/data-pipeline/ that helps agents manage ETL workflows. Recently the team lead noticed that agents aren't consistently following the skill's instructions — they sometimes use the wrong library, skip required steps, or produce logs in the wrong format. The tile is installed locally (its tessl.json shows "source": "file:tiles/data-pipeline/").

You've been asked to create verifiers for the ingest skill so the audit pipeline can automatically check whether agents are following the skill's rules. The skill has clear preferences around which tools to use, how to log progress, and what to avoid.

Output Specification

Create all verifier files in the correct location for a locally-sourced tile. The verifiers should cover every instruction in the skill below: tool preferences, workflow requirements, naming rules, and prohibitions. Include an activation verifier to track whether the skill is loaded.

Produce a verifier-summary.md file listing the verifiers you created and where they live.

Input Files

The following files are provided as inputs. Extract them before beginning.

=============== FILE: tiles/data-pipeline/tile.json ===============

{
  "name": "acme/data-pipeline",
  "version": "0.3.1",
  "skills": {
    "ingest": {
      "path": "skills/ingest/SKILL.md"
    }
  }
}

=============== FILE: tiles/data-pipeline/skills/ingest/SKILL.md ===============

---
name: ingest
description: "Ingest data from external sources into the warehouse. Use when loading CSVs, API responses, or database exports into the pipeline."
---

Ingest Skill

Load data from external sources into the data warehouse pipeline.

Library

Always use pandas for data loading and transformation. Do not use polars, dask, or raw Python file I/O for tabular data.

Validation

Before writing any data to the warehouse, call validate_schema(df, schema) from pipeline.validation. Never skip schema validation even for small files.
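Taken together, the Library and Validation rules imply an ingest step shaped like the sketch below. The `validate_schema` body here is a hypothetical stand-in (the real helper is imported from `pipeline.validation`), and the example DataFrame and schema are invented for illustration:

```python
import pandas as pd

# Hypothetical stand-in for the real helper; a compliant agent would use:
#   from pipeline.validation import validate_schema
def validate_schema(df: pd.DataFrame, schema: list) -> None:
    missing = [col for col in schema if col not in df.columns]
    if missing:
        raise ValueError(f"schema validation failed, missing: {missing}")

# Load tabular data with pandas (never polars, dask, or raw file I/O)...
df = pd.DataFrame({"id": [1, 2], "amount": [9.5, 3.2]})

# ...and validate before any write to the warehouse, even for small frames.
validate_schema(df, ["id", "amount"])
```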

Logging

Log every ingest operation using structlog. Include both source and row_count fields in every log call. Do not use Python's built-in logging module.

File Naming

Output parquet files must follow the pattern {source}_{YYYYMMDD}.parquet. Never use underscores in the source portion — use hyphens instead (e.g. user-events_20240315.parquet).
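The naming rule is mechanical enough to check with a regular expression. A sketch of such a check follows; restricting the source portion to lowercase alphanumerics is our assumption, not something the skill states:

```python
import re

# {source}_{YYYYMMDD}.parquet, with hyphens (never underscores) inside
# the source portion. Lowercase-alphanumeric source is an assumption.
NAME_PATTERN = re.compile(r"^[a-z0-9]+(?:-[a-z0-9]+)*_\d{8}\.parquet$")

def valid_parquet_name(filename: str) -> bool:
    return NAME_PATTERN.fullmatch(filename) is not None

print(valid_parquet_name("user-events_20240315.parquet"))  # True
print(valid_parquet_name("user_events_20240315.parquet"))  # False
```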

Error Handling

Wrap all external API calls in a try/except block. On failure, log the error with log.error() and raise a PipelineError, never silently swallow exceptions.
