
# interval-difference-analyzer

Analyze differences in program intervals between two versions of a program (old and new) to identify added, removed, or modified intervals. Use when comparing program versions, analyzing variable ranges, detecting behavioral changes in numeric computations, validating refactorings, or assessing migration impacts. Supports optional test suite integration to validate interval changes. Generates comprehensive reports highlighting intervals requiring further testing or verification.
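The workflow this description outlines — extract intervals from each version, diff them, and emit a report — can be sketched in a few lines. This is a minimal illustration, not the skill's actual scripts: the field names `summary`, `differences`, and `recommendations` come from the eval criteria below, while the `{variable: (low, high)}` interval representation is an assumption.

```python
# Hypothetical sketch of the interval-diff step. Intervals are assumed to be
# {variable_name: (low, high)} maps extracted from each program version.

def compare_intervals(old, new):
    report = {"summary": {}, "differences": [], "recommendations": []}
    for name in sorted(set(old) | set(new)):
        if name not in new:
            report["differences"].append({"name": name, "type": "removed"})
        elif name not in old:
            report["differences"].append({"name": name, "type": "added"})
        elif old[name] != new[name]:
            (olo, ohi), (nlo, nhi) = old[name], new[name]
            # "widened" if the new interval contains the old one, else "modified"
            kind = "widened" if nlo <= olo and nhi >= ohi else "modified"
            report["differences"].append(
                {"name": name, "type": kind, "old": old[name], "new": new[name]}
            )
            report["recommendations"].append(
                f"Add boundary tests for {name}: old {old[name]} vs new {new[name]}"
            )
    counts = {}
    for d in report["differences"]:
        counts[d["type"]] = counts.get(d["type"], 0) + 1
    report["summary"] = counts
    return report

report = compare_intervals(
    {"balance": (0, 1000), "rate": (0, 5)},
    {"balance": (0, 2000), "fee": (1, 10)},
)
print(report["summary"])  # per-change-type counts, as the criteria require
```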

**Score: 81 · 1.85x**

- **Quality: 71%** — Does it follow best practices?
- **Impact: 100% (1.85x)** — Average score across 3 eval scenarios
- **Security (by Snyk): Passed** — No known issues

Optimize this skill with Tessl:

```
npx tessl skill review --optimize ./skills/interval-difference-analyzer/SKILL.md
```

## Evaluation results

### Validate a Financial Calculator Refactoring — 100% · 83%

Core interval comparison workflow

| Criteria | Without context | With context |
| --- | --- | --- |
| Analyzer used for old version | 0% | 100% |
| Analyzer used for new version | 0% | 100% |
| Comparator script invoked | 0% | 100% |
| Workflow order correct | 0% | 100% |
| Report has summary field | 100% | 100% |
| Report has differences field | 0% | 100% |
| Report has recommendations field | 0% | 100% |
| Differences include type field | 20% | 100% |
| Modified intervals detected | 50% | 100% |
| Summary counts present | 0% | 100% |

### Assess Risk in a Buffer Processing Refactor — 100% · 45%

Severity classification and behavioral change detection

| Criteria | Without context | With context |
| --- | --- | --- |
| Negative index flagged critical | 0% | 100% |
| Overflow risk flagged critical | 70% | 100% |
| Negative value implication present | 100% | 100% |
| Overflow implication present | 100% | 100% |
| Critical testing priority assigned | 30% | 100% |
| Critical recommendation present | 50% | 100% |
| Change type widened labeled | 0% | 100% |
| Both scripts used | 0% | 100% |
| Severity field on all differences | 100% | 100% |
| Risk summary references critical findings | 100% | 100% |
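The severity rules this scenario checks — flag an interval that newly admits negative indices or integer overflow as critical, and label containing intervals as widened — can be sketched as follows. This is a hypothetical classifier under assumed thresholds and labels, not the skill's actual logic:

```python
# Hypothetical severity classification for an interval change (old vs new
# bounds). The 32-bit overflow threshold and the label names are assumptions.

INT32_MAX = 2**31 - 1

def classify_severity(name, old_iv, new_iv):
    olo, ohi = old_iv
    nlo, nhi = new_iv
    if nlo < 0 <= olo:
        # Previously non-negative variable may now go negative: index risk
        return "critical", f"{name} may now take negative values"
    if nhi > INT32_MAX >= ohi:
        # Upper bound now exceeds a 32-bit integer: overflow risk
        return "critical", f"{name} may now overflow a 32-bit integer"
    if nlo <= olo and nhi >= ohi:
        return "medium", f"{name} widened from {old_iv} to {new_iv}"
    return "low", f"{name} changed from {old_iv} to {new_iv}"

print(classify_severity("offset", (0, 255), (-1, 255)))   # critical: negative
print(classify_severity("count", (0, 100), (0, 2**31)))   # critical: overflow
```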

### Generate Validation Tests After an API Limit Migration — 100% · 9%

Boundary test case generation from interval diff

| Criteria | Without context | With context |
| --- | --- | --- |
| Analyzer invoked for both versions | 50% | 100% |
| Comparator invoked | 50% | 100% |
| Tests target old upper boundary | 100% | 100% |
| Tests target new upper boundary | 100% | 100% |
| Tests include values in the gap | 100% | 100% |
| Tests organized by interval | 100% | 100% |
| Comments link tests to interval changes | 100% | 100% |
| Tests exceed old boundary by 1 | 100% | 100% |
| Valid Python test file | 100% | 100% |
| Report-driven test values | 93% | 100% |
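The boundary criteria above — hit the old upper bound, exceed it by 1, probe the gap, and cover the new bound — amount to a small value-generation rule. A hedged sketch (the function and the example limits are illustrative, not from the skill):

```python
# Hypothetical boundary test-value generation when an upper limit is raised
# from old_hi to new_hi, matching the criteria: old bound, old bound + 1,
# a value in the gap, the new bound, and one value past it.

def boundary_test_values(old_hi, new_hi):
    """Values worth testing when an upper limit moves from old_hi to new_hi."""
    assert new_hi > old_hi
    gap_mid = (old_hi + 1 + new_hi) // 2
    return [
        old_hi,      # old upper boundary, still accepted
        old_hi + 1,  # first value beyond the old limit
        gap_mid,     # representative value inside the gap
        new_hi,      # new upper boundary
        new_hi + 1,  # first rejected value under the new limit
    ]

# e.g. an API page-size limit raised from 100 to 500
print(boundary_test_values(100, 500))  # [100, 101, 300, 500, 501]
```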

- Repository: ArabelaTso/Skills-4-SE
- Evaluated agent: Claude Code
- Model: Claude Sonnet 4.6
