
giuseppe-trisciuoglio/developer-kit

Comprehensive developer toolkit providing reusable skills for Java/Spring Boot, TypeScript/NestJS/React/Next.js, Python, PHP, AWS CloudFormation, AI/RAG, DevOps, and more.


plugins/developer-kit-specs/commands/specs.task-tdd.md

---
description: Provides Test-Driven Development (TDD) test generation for the RED phase before implementation. Use when you need to generate failing tests from task specifications. Generates test files based on task requirements and verifies the RED phase before implementation begins.
argument-hint: [ --lang=java|spring|typescript|nestjs|react|python|php|general ] [ --task="docs/specs/XXX-feature/tasks/TASK-XXX.md" ]
allowed-tools: Task, Read, Write, Edit, Bash, Grep, Glob, TodoWrite, AskUserQuestion
model: inherit
---

TDD Test Generation

Generate failing tests (RED phase) from task specifications before implementation, following Test-Driven Development principles within the Specification-Driven Development workflow.

Overview

This command implements the RED phase of TDD by generating test files that reflect task requirements and verifying they fail before implementation begins. This ensures tests drive implementation rather than the reverse.

Workflow Position:

brainstorm → spec-to-tasks → task-tdd (RED) → task-implementation (GREEN) → task-review → code-cleanup → spec-sync-with-code
                         ↑                    ↑
                    (this command)      (implementation)

Input: Task file generated by /specs:spec-to-tasks
Output: Generated test files that fail (RED phase verification)

What This Command Does

  1. Read Task Specification: Parse task file to extract requirements, acceptance criteria, and technical context
  2. Generate Test Skeleton: Create appropriate test file structure for the specified language/framework
  3. Write Failing Tests: Generate test cases based on task acceptance criteria that will fail
  4. Verify RED Phase: Execute tests to confirm they fail as expected (RED phase confirmation)
  5. Update Task File: Record test references, RED-phase status, and a generation summary in the task file
  6. Prepare for Implementation: Provide clear handoff to /specs:task-implementation with test files ready

The command delegates each step to a dedicated hook in plugins/developer-kit-specs/hooks/:

  • specs-task-tdd-parser.py: handles task parsing and validation; validates task frontmatter and extracts the Test Instructions section before generation proceeds.
  • specs-task-tdd-generator.py: handles shared RED-phase skeleton rendering, naming, and file location logic.
  • specs-task-tdd-red-phase.py: handles executable RED verification; selects the language-appropriate runner, executes the generated test file, and reports red-confirmed vs unexpected-pass.
  • specs-task-tdd-updater.py: handles task file updates; preserves the task body, merges testReferences metadata into the frontmatter, and writes a replaceable RED-phase summary block.
  • specs-task-tdd-handoff.py: prepares the implementation handoff; creates _drift/tdd-handoff-*.md in the spec folder with the generated test summary, RED status, preserved task context, and the exact /specs:task-implementation command for the GREEN phase.
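The hook sequence can be pictured as a chain of subprocess invocations. This is a minimal sketch, assuming every hook accepts the same `--task`, `--lang`, and `--project-root` flags that this document shows for the verifier, updater, and handoff hooks; the helper functions themselves are illustrative, not part of the kit:

```python
import subprocess

HOOK_DIR = "plugins/developer-kit-specs/hooks"
# Stage order mirrors the numbered steps above: parse, generate, verify RED,
# update the task file, then prepare the GREEN-phase handoff.
PIPELINE = ["parser", "generator", "red-phase", "updater", "handoff"]

def build_hook_command(stage, task, lang, project_root="."):
    """Build the command line for one TDD hook stage (illustrative helper)."""
    return [
        "python3",
        f"{HOOK_DIR}/specs-task-tdd-{stage}.py",
        f"--task={task}",
        f"--lang={lang}",
        f"--project-root={project_root}",
    ]

def run_pipeline(task, lang):
    """Run all five hooks in order, stopping on the first failure."""
    for stage in PIPELINE:
        subprocess.run(build_hook_command(stage, task, lang), check=True)
```

The `check=True` choice means an earlier stage failing (for example, invalid task frontmatter) prevents later stages from running on a bad input.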

Usage

# Basic usage - generate tests for a specific task
/specs:task-tdd --task="docs/specs/001-hotel-search/tasks/TASK-003.md"

# With language specification
/specs:task-tdd --lang=spring --task="docs/specs/001-user-auth/tasks/TASK-005.md"
/specs:task-tdd --lang=typescript --task="docs/specs/001-api-integration/tasks/TASK-012.md"
/specs:task-tdd --lang=nestjs --task="docs/specs/001-user-auth/tasks/TASK-008.md"
/specs:task-tdd --lang=react --task="docs/specs/001-ui-components/tasks/TASK-015.md"
/specs:task-tdd --lang=python --task="docs/specs/001-data-processing/tasks/TASK-007.md"
/specs:task-tdd --lang=php --task="docs/specs/001-wordpress-plugin/tasks/TASK-009.md"

# General language (template structure, language-agnostic)
/specs:task-tdd --lang=general --task="docs/specs/002-tdd-command/tasks/TASK-001.md"

Arguments

| Argument | Description | Required |
|----------|-------------|----------|
| `--task` | Path to task file (from spec-to-tasks) | Yes |
| `--lang` | Programming language/framework | Yes |

Supported Languages

| Language | `--lang` Value | Test Framework | File Location |
|----------|----------------|----------------|---------------|
| Java/Spring | `spring` or `java` | JUnit 5, Mockito | `src/test/java/` |
| TypeScript/Node.js | `typescript` or `ts` | Jest, Mocha | `__tests__/`, `*.test.ts` |
| NestJS | `nestjs` | Jest | `*.spec.ts` |
| React | `react` | Jest, React Testing Library | `*.test.tsx`, `*.spec.tsx` |
| Python | `python` or `py` | pytest | `tests/`, `test_*.py` |
| PHP | `php` | PHPUnit | `tests/`, `*Test.php` |
| General | `general` | Template structure | Language-agnostic |
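The table above amounts to a small lookup from `--lang` value to test framework and file location. A sketch of how a caller might validate the flag before generation (the function name is illustrative):

```python
# Mirror of the Supported Languages table: --lang value -> (framework, location).
LANG_CONFIG = {
    "spring":     ("JUnit 5, Mockito", "src/test/java/"),
    "java":       ("JUnit 5, Mockito", "src/test/java/"),
    "typescript": ("Jest, Mocha", "__tests__/"),
    "ts":         ("Jest, Mocha", "__tests__/"),
    "nestjs":     ("Jest", "*.spec.ts"),
    "react":      ("Jest, React Testing Library", "*.test.tsx"),
    "python":     ("pytest", "tests/"),
    "py":         ("pytest", "tests/"),
    "php":        ("PHPUnit", "tests/"),
    "general":    ("Template structure", "language-agnostic"),
}

def resolve_lang(value):
    """Return (framework, location) for a --lang value, or reject it with the valid set."""
    try:
        return LANG_CONFIG[value]
    except KeyError:
        raise ValueError(
            f"unsupported --lang={value}; choose one of {sorted(LANG_CONFIG)}"
        )
```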

TDD Workflow Integration

This command bridges the gap between specification and implementation:

  1. After spec-to-tasks: Task files are generated with acceptance criteria
  2. Run task-tdd (this command): Generate failing tests from acceptance criteria
  3. Verify RED Phase: Tests execute and fail (expected behavior)
  4. Run task-implementation: Implement code to make tests pass (GREEN phase)
  5. Run task-review: Verify implementation meets requirements

Examples by Language

Java/Spring Boot

# Generate tests for a Spring service task
/specs:task-tdd --lang=spring --task="docs/specs/001-user-service/tasks/TASK-004.md"

# Output:
# - src/test/java/com/hotels/user/UserServiceTest.java (failing tests)
# - RED phase verification: Tests fail (expected)

TypeScript/Node.js

# Generate tests for an API endpoint task
/specs:task-tdd --lang=typescript --task="docs/specs/001-api-gateway/tasks/TASK-007.md"

# Output:
# - __tests__/api-gateway.test.ts (failing tests)
# - RED phase verification: Tests fail (expected)

NestJS

# Generate tests for a NestJS controller task
/specs:task-tdd --lang=nestjs --task="docs/specs/001-payment-service/tasks/TASK-006.md"

# Output:
# - src/payment/payment.controller.spec.ts (failing tests)
# - RED phase verification: Tests fail (expected)

React

# Generate tests for a React component task
/specs:task-tdd --lang=react --task="docs/specs/001-search-ui/tasks/TASK-011.md"

# Output:
# - src/components/SearchBar.test.tsx (failing tests)
# - RED phase verification: Tests fail (expected)

Python

# Generate tests for a Python service task
/specs:task-tdd --lang=python --task="docs/specs/001-data-processor/tasks/TASK-008.md"

# Output:
# - tests/test_data_processor.py (failing tests)
# - RED phase verification: Tests fail (expected)

PHP

# Generate tests for a WordPress plugin task
/specs:task-tdd --lang=php --task="docs/specs/001-wp-plugin/tasks/TASK-009.md"

# Output:
# - tests/PluginTest.php (failing tests)
# - RED phase verification: Tests fail (expected)

RED Phase Verification

This command verifies RED status through the dedicated verifier hook:

# After test generation, run:
python3 plugins/developer-kit-specs/hooks/specs-task-tdd-red-phase.py \
  --task="docs/specs/.../tasks/TASK-XXX.md" \
  --lang=python \
  --project-root=.

# Result statuses:
# - red-confirmed: generated tests fail as expected
# - unexpected-pass: generated tests passed unexpectedly (A4 flow)
# - execution-timeout / error JSON: RED verification could not complete

If tests accidentally pass (should not happen), the command will:

  • Warn that RED phase is not achieved
  • Suggest reviewing test assertions
  • Ask whether to proceed with implementation anyway
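Assuming the verifier hook emits a JSON object with a `status` field (the exact schema is not documented here), a caller could branch on the statuses listed above like this hypothetical dispatcher:

```python
def handle_red_result(result):
    """Map a RED-phase verifier status to a next action (illustrative)."""
    status = result.get("status")
    if status == "red-confirmed":
        return "proceed"        # tests fail as expected; hand off to GREEN phase
    if status == "unexpected-pass":
        return "review-tests"   # warn, suggest reviewing assertions, ask to continue
    if status == "execution-timeout":
        return "retry"          # RED verification could not complete
    return "abort"              # error JSON or unknown status
```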

Handoff to Implementation

After RED phase verification, this command provides a clear handoff:

# Next step: Implement code to make tests pass
/specs:task-implementation --lang=spring --task="docs/specs/001-feature/tasks/TASK-004.md"

The implementation command will:

  1. Read the generated test files
  2. Implement code to make tests pass (GREEN phase)
  3. Verify tests pass after implementation

Before that handoff, run the updater hook so the task file records the TDD results:

python3 plugins/developer-kit-specs/hooks/specs-task-tdd-updater.py \
  --task="docs/specs/.../tasks/TASK-XXX.md" \
  --lang=python \
  --project-root=.

This updates the task file with:

  • generated test file path and test type
  • RED-phase verification result and runner metadata
  • a concise summary section that preserves the original Test Instructions block for traceability
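The updater's "preserve the body, amend only the metadata" behavior can be pictured as splitting the task file on its `---` frontmatter delimiters. This is a simplified sketch (the real hook's merge logic and key names may differ):

```python
def merge_test_references(task_text, test_path, test_type):
    """Append a testReferences entry to YAML frontmatter, preserving the body.

    Assumes the task file starts with a ----delimited frontmatter block;
    the testReferences key shape here is illustrative.
    """
    head, front, body = task_text.split("---", 2)
    front = front.rstrip("\n") + (
        f"\ntestReferences:\n  - path: {test_path}\n    type: {test_type}\n"
    )
    return f"{head}---{front}---{body}"
```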

Then prepare the implementation handoff artifact:

python3 plugins/developer-kit-specs/hooks/specs-task-tdd-handoff.py \
  --task="docs/specs/.../tasks/TASK-XXX.md" \
  --lang=python \
  --project-root=.

This creates _drift/tdd-handoff-task-xxx.md with:

  • a summary of the generated failing tests
  • documented test locations and types
  • RED-phase confirmation status
  • preserved task/spec context for GREEN phase
  • the recommended /specs:task-implementation command to run next

Integration with SDD Workflow

This command enhances the existing SDD workflow by adding explicit TDD support:

Without TDD Command:

brainstorm → spec-to-tasks → task-implementation → task-review
                         (implementation first, tests later)

With TDD Command:

brainstorm → spec-to-tasks → task-tdd → task-implementation → task-review
                         (tests first, implementation second)

Both workflows are valid; the TDD command is optional, and teams can choose whether to adopt TDD practices.

Test Generation Principles

Generated tests follow these principles:

  1. Specification-Derived: Tests are based on task acceptance criteria, not implementation details
  2. Failing by Design: Tests will fail because implementation doesn't exist yet (RED phase)
  3. Language-Appropriate: Test structure matches framework conventions for the specified language
  4. Readable: Test names and assertions clearly express requirements
  5. Maintainable: Tests serve as living documentation of expected behavior

Troubleshooting

Tests Don't Fail After Generation

If generated tests pass instead of fail:

# Possible causes:
# 1. Implementation already exists (tests are testing existing code)
# 2. Test assertions are incorrect (tests aren't properly failing)
# 3. Mock/stub setup is incorrect (tests mock non-existent dependencies)

# Solution: Review test file and adjust assertions

Language Not Supported

If your language/framework isn't listed:

# Use --lang=general for template structure
/specs:task-tdd --lang=general --task="docs/specs/001-feature/tasks/TASK-001.md"

# Output: Language-agnostic test template that you can adapt

Task File Not Found

# Error: Task file not found
# Solution: Ensure you've run spec-to-tasks first
/specs:spec-to-tasks docs/specs/001-feature/

# Then run task-tdd
/specs:task-tdd --lang=spring --task="docs/specs/001-feature/tasks/TASK-001.md"

Related Commands

  • /specs:brainstorm - Create functional specification from idea
  • /specs:spec-to-tasks - Convert specification into executable tasks
  • /specs:task-implementation - Implement a specific task (GREEN phase)
  • /specs:task-review - Verify implementation meets requirements
  • /developer-kit-specs:specs-code-cleanup - Final cleanup after review approval
