Marketplace

TDD Process

Strict test-driven development state machine with red-green-refactor cycles. Enforces test-first development, meaningful failures, minimum implementations, and full verification. Activates when user requests: 'use a TDD approach', 'start TDD', 'test-drive this'.

$ Install

git clone https://github.com/NTCoding/claude-skillz /tmp/claude-skillz && cp -r /tmp/claude-skillz/tdd-process ~/.claude/skills/tdd-process

// tip: Run this command in your terminal to install the skill


name: TDD Process
description: "Strict test-driven development state machine with red-green-refactor cycles. Enforces test-first development, meaningful failures, minimum implementations, and full verification. Activates when user requests: 'use a TDD approach', 'start TDD', 'test-drive this'."
version: 1.0.0

In Plan Mode: Plans should be test specifications, not implementation designs. Include key insights, architectural constraints, and suggestions, but never the full implementation of production code.

🚨 CRITICAL: TDD STATE MACHINE GOVERNANCE 🚨

EVERY SINGLE MESSAGE MUST START WITH YOUR CURRENT TDD STATE

Format:

🔴 TDD: RED
🟢 TDD: GREEN
🔵 TDD: REFACTOR
⚪ TDD: PLANNING
🟡 TDD: VERIFY
⚠️ TDD: BLOCKED

NOT JUST THE FIRST MESSAGE. EVERY. SINGLE. MESSAGE.

When you read a file → prefix with TDD state
When you run tests → prefix with TDD state
When you explain results → prefix with TDD state
When you ask a question → prefix with TDD state

Example:

⚪ TDD: PLANNING
Writing test for negative price validation...

⚪ TDD: PLANNING
Running npm test to see it fail...

⚪ TDD: PLANNING
Test output shows: Expected CannotHaveNegativePrice error but received -50
Test fails correctly. Transitioning to RED.

🔴 TDD: RED
Test IS failing. Addressing what the error message demands...

🚨 FAILURE TO ANNOUNCE TDD STATE = SEVERE VIOLATION 🚨


<meta_governance> 🚨 STRICT STATE MACHINE GOVERNANCE 🚨

  • CANNOT skip states or assume completion without evidence
  • MUST announce state on EVERY message
  • MUST validate post-conditions before transitioning

Before each response: Verify your claimed state matches your tool call evidence. If mismatch: 🔥 STATE VIOLATION DETECTED → announce correct state → recover.

State announcement: Every message starts with 🔴 TDD: RED (or current state). Forgot prefix? Announce violation immediately, then continue. </meta_governance>

<state_machine>

                  user request
                       ↓
                 ┌──────────┐
            ┌────│ PLANNING │────┐
            │    └─────┬────┘    │
            │          │         │
            │  test fails        │
            │  correctly         │
  unclear   │          ↓         │ blocker
            │    ┌──────────┐    │
            └────│   RED    │    │
                 │          │    │
                 │ Test IS  │    │
                 │ failing  │    │
                 └────┬─────┘    │
                      │          │
              test    │          │
              passes  │          │
                      ↓          │
                 ┌──────────┐    │
                 │  GREEN   │    │
                 │          │    │
                 │ Test IS  │    │
                 │ passing  │    │
                 └────┬─────┘────┘
                      │
          refactoring │
          needed      │
                      ↓
                 ┌──────────┐
            ┌────│ REFACTOR │
            │    │          │
            │    │ Improve  │
            │    │ design   │
            │    └────┬─────┘
            │         │
            │    done │
            │         │
            │         ↓
            │    ┌──────────┐
            │    │  VERIFY  │
            │    │          │
            │    │ Run full │
  fail      │    │ suite +  │
            │    │ lint +   │
            └────│ build    │
                 └────┬─────┘
                      │
                 pass │
                      │
                      ↓
                 [COMPLETE]

</state_machine>

<state name="PLANNING">
  <prefix>⚪ TDD: PLANNING</prefix>
  <purpose>Understand the requirement and write a test that fails for the right reason.</purpose>
  <pre_conditions>
    ✓ User has provided a task/requirement/bug report
    ✓ No other TDD cycle in progress
  </pre_conditions>

  <actions>
    1. Analyze requirement/bug
    2. Ask clarifying questions if needed
    3. Determine what behavior needs testing
    4. Identify edge cases
    5. Write test for specific behavior
    6. Run test (use Bash tool to execute test command)
    7. VERIFY test fails correctly
    8. Show exact failure message to user (copy/paste verbatim output)
    9. Justify why failure message proves test is correct
    10. If failure is "method doesn't exist" - implement empty/dummy method and re-run from step 6
    11. Repeat until you get a "meaningful" failure
    12. If the failure message does not give a precise reason for the failure, ask the user whether to improve the code to produce a more explicit error message.
    13. Transition to RED
  </actions>

  <post_conditions>
    ✓ Test written and executed
    ✓ Test FAILED correctly (red bar achieved)
    ✓ Failure message shown to user verbatim
    ✓ Failure reason justified (proves test is correct)
    ✓ Failure is "meaningful" (not setup/syntax error)
  </post_conditions>

  <validation_before_transition>
    BEFORE transitioning to RED, announce:
    "Pre-transition validation:
    ✓ Test written: [yes]
    ✓ Test executed: [yes]
    ✓ Test failed correctly: [yes]
    ✓ Failure message shown: [yes - output above]
    ✓ Meaningful failure: [yes - justification]

    Transitioning to RED - test is now failing for the right reason."
  </validation_before_transition>

  <transitions>
    - PLANNING → RED (when test fails correctly - red milestone achieved)
    - PLANNING → BLOCKED (when cannot write valid test)
  </transitions>
</state>
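Steps 10-11 in the PLANNING actions above can be sketched like this (`Cart` and `total` are hypothetical names):

```typescript
// Hypothetical Cart: the dummy method exists only so the next test
// run fails on behavior rather than on a missing method
// ("cart.total is not a function").
class Cart {
  total(): number {
    throw new Error("Not implemented");
  }
}

try {
  new Cart().total();
} catch (e) {
  console.log((e as Error).message); // "Not implemented"
}
// "Not implemented" is still not a meaningful failure: it says nothing
// about the expected behavior. Step 12 would push toward a failure such
// as "expected total 0 but received undefined" before transitioning.
```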

<state name="RED">
  <prefix>🔴 TDD: RED</prefix>
  <purpose>Test IS failing for the right reason. Implement ONLY what the error message demands.</purpose>

  🚨 CRITICAL: You are in RED state - test IS CURRENTLY FAILING. You MUST implement code and see test PASS, code COMPILE, code LINT before transitioning to GREEN.
  DO NOT transition to GREEN until you have:
  1. Implemented ONLY what the error message demands
  2. Executed the test with Bash tool
  3. Seen the SUCCESS output (green bar)
  4. Executed compile check and seen SUCCESS
  5. Executed lint check and seen PASS
  6. Shown all success outputs to the user

  <pre_conditions>
    ✓ Test written and executed (from PLANNING)
    ✓ Test IS FAILING correctly (red bar visible)
    ✓ Failure message shown and justified
    ✓ Failure is "meaningful" (not setup/syntax error)
  </pre_conditions>

  <actions>
    1. Read the error message - what does it literally ask for?
    2. 🚨 MANDATORY SELF-CHECK - announce before implementing:
       "Minimal implementation check:
       - Error demands: [what the error literally says]
       - Could hardcoded value work? [yes/no]
       - If yes: [what hardcoded value]
       - If no: [why real logic is required]"

       Guidelines:
       - If test asserts `x === 5` → return `5`
       - If test asserts `count === 0` → return object with `count: 0`
       - If test asserts type → return minimal stub of that type
       - Only add logic when tests FORCE you to (multiple cases, different inputs)
    3. Implement ONLY what that error message demands (hardcoded if possible)
    4. Do NOT anticipate future errors - address THIS error only
    5. Run test (use Bash tool to execute test command)
    6. VERIFY test PASSES (green bar)
    7. Show exact success message to user (copy/paste verbatim output)
    8. Run quick compilation check (e.g., tsc --noEmit, or project-specific compile command)
    9. Run lint on changed code
    10. If compile/lint fails: Fix issues and return to step 5 (re-run test)
    11. Show compile/lint success output to user
    12. Justify why implementation is minimum
    13. ONLY AFTER completing steps 5-12: Announce post-condition validation
    14. ONLY AFTER validation passes: Transition to GREEN

    🚨 YOU CANNOT TRANSITION TO GREEN UNTIL TEST PASSES, CODE COMPILES, AND CODE LINTS 🚨
  </actions>

  <post_conditions>
    ✓ Implemented ONLY what error message demanded
    ✓ Test executed
    ✓ Test PASSES (green bar - not red)
    ✓ Success message shown to user verbatim
    ✓ Code compiles (no compilation errors)
    ✓ Code lints (no linting errors)
    ✓ Compile/lint output shown to user
    ✓ Implementation addresses ONLY what error message demanded (justified)
  </post_conditions>

  <validation_before_transition>
    🚨 BEFORE transitioning to GREEN, verify ALL with evidence from tool history:
    ✓ Test PASSES (green bar) - show verbatim output
    ✓ Code compiles - show output
    ✓ Code lints - show output
    ✓ Implementation addresses ONLY what error demanded - justify

    If ANY evidence missing: "⚠️ CANNOT TRANSITION - Missing: [what]" → stay in RED.
  </validation_before_transition>

  <critical_rules>
    🚨 NEVER transition to GREEN without test PASS + compile SUCCESS + lint PASS
    🚨 IMPLEMENT ONLY WHAT THE ERROR MESSAGE DEMANDS - no anticipating future errors
    🚨 DON'T CHANGE TEST TO MATCH IMPLEMENTATION - fix the code, not the test
  </critical_rules>

  <transitions>
    - RED → GREEN (when test PASSES, code COMPILES, code LINTS - green milestone achieved)
    - RED → BLOCKED (when cannot make test pass or resolve compile/lint errors)
    - RED → PLANNING (when test failure reveals requirement was misunderstood)
  </transitions>
</state>
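The self-check in step 2 can be made concrete with a sketch (the `parse`/`ParseResult` names are hypothetical): if the only failing assertion is `count === 0`, a hardcoded value is the correct RED-state implementation.

```typescript
interface ParseResult {
  count: number;
}

// Error demands: "expected count 0". Could a hardcoded value work? Yes.
// So RED implements exactly that and nothing more; real parsing logic
// waits until another test forces it.
function parse(_input: string): ParseResult {
  return { count: 0 }; // hardcoded until a second test demands real logic
}

console.log(parse("a -> b").count); // 0
```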

<state name="GREEN">
  <prefix>🟢 TDD: GREEN</prefix>
  <purpose>Test IS passing for the right reason. Assess code quality and decide next step.</purpose>

  <pre_conditions>
    ✓ Test exists and PASSES (from RED)
    ✓ Test IS PASSING for the right reason (green bar visible)
    ✓ Code compiles (no compilation errors)
    ✓ Code lints (no linting errors)
    ✓ Pass output was shown and implementation justified as minimum
  </pre_conditions>

  <actions>
    1. Review the implementation that made test pass
    2. Check code quality against object calisthenics
    3. Check for feature envy
    4. Check for dependency inversion opportunities
    5. Check naming conventions
    6. Decide: Does code need refactoring?
    7a. If YES refactoring needed → Transition to REFACTOR
    7b. If NO refactoring needed → Transition to VERIFY
  </actions>

  <post_conditions>
    ✓ Test IS PASSING (green bar)
    ✓ Code quality assessed
    ✓ Decision made: refactor or verify
  </post_conditions>

  <validation_before_transition>
    BEFORE transitioning to REFACTOR or VERIFY, announce:
    "Post-condition validation:
    ✓ Test IS PASSING: [yes - green bar visible]
    ✓ Code quality assessed: [yes]
    ✓ Decision: [REFACTOR needed / NO refactoring needed, go to VERIFY]

    All post-conditions satisfied. Transitioning to [REFACTOR/VERIFY]."

    IF any post-condition NOT satisfied:
    "โš ๏ธ CANNOT TRANSITION - Post-condition failed: [which one]
    Staying in GREEN state to address: [issue]"
  </validation_before_transition>

  <critical_rules>
    🚨 GREEN state means test IS PASSING, code COMPILES, code LINTS - if any fail, you're back to RED
    🚨 NEVER skip code quality assessment
    🚨 NEVER transition if test is not passing
    🚨 NEVER transition if code doesn't compile or lint
    🚨 ALWAYS assess whether refactoring is needed
    🚨 Go to REFACTOR if improvements needed, VERIFY if code is already clean
  </critical_rules>

  <transitions>
    - GREEN → REFACTOR (when refactoring needed - improvements identified)
    - GREEN → VERIFY (when code quality satisfactory - no refactoring needed)
    - GREEN → RED (if test starts failing - regression detected, need new failing test)
  </transitions>
</state>
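The feature-envy check in step 3 typically looks like this (hypothetical `Invoice`/`Money` example): a method that mostly manipulates another object's data should move to that object.

```typescript
// Money formats itself; before this refactor, Invoice reached into
// Money's fields to build the string -- the feature envy that the
// GREEN quality assessment is meant to catch.
class Money {
  constructor(readonly amount: number, readonly currency: string) {}
  format(): string {
    return `${this.amount.toFixed(2)} ${this.currency}`;
  }
}

class Invoice {
  constructor(readonly total: Money) {}
  summary(): string {
    return `Total: ${this.total.format()}`;
  }
}

console.log(new Invoice(new Money(9.5, "EUR")).summary()); // Total: 9.50 EUR
```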

<state name="REFACTOR">
  <prefix>🔵 TDD: REFACTOR</prefix>
  <purpose>Tests ARE passing. Improving code quality while maintaining green bar.</purpose>

  <pre_conditions>
    ✓ Tests ARE PASSING (from GREEN)
    ✓ Code compiles (no compilation errors)
    ✓ Code lints (no linting errors)
    ✓ Refactoring needs identified
    ✓ Pass output was shown
  </pre_conditions>

  <actions>
    1. Analyze code for design improvements
    2. Check against project conventions
    3. Check naming conventions
    4. If improvements needed:
       a. Explain refactoring
       b. Apply refactoring
       c. Run test to verify behavior preserved
       d. Show test still passes
    5. Repeat until no more improvements
    6. Check tests for improvement opportunities - e.g. combine tests with it.each, check against project testing conventions (e.g. `/docs/conventions/testing.md`)
    7. Transition to VERIFY
  </actions>

  <post_conditions>
    ✓ Code reviewed for quality
    ✓ Object calisthenics applied
    ✓ No feature envy
    ✓ Dependencies inverted
    ✓ Names are intention-revealing
    ✓ Tests still pass after each refactor
    ✓ Test output shown after each refactor
  </post_conditions>

  <validation_before_transition>
    BEFORE transitioning to VERIFY, announce:
    "Post-condition validation:
    ✓ Object calisthenics: [applied/verified]
    ✓ Feature envy: [none detected]
    ✓ Dependencies: [properly inverted]
    ✓ Naming: [intention-revealing]
    ✓ Tests pass: [yes - output shown]

    All post-conditions satisfied. Transitioning to VERIFY."
  </validation_before_transition>

  <critical_rules>
    🚨 NEVER refactor without running tests after
    🚨 NEVER use generic names (data, utils, helpers)
    🚨 ALWAYS verify tests pass after refactor
  </critical_rules>

  <transitions>
    - REFACTOR → VERIFY (when code quality satisfactory)
    - REFACTOR → RED (if refactor broke test - write new test for edge case)
    - REFACTOR → BLOCKED (if cannot refactor due to constraints)
  </transitions>
</state>
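Step 6's suggestion to combine tests can be sketched as a table-driven test. With Jest or Vitest the table would feed `it.each`; here a plain loop keeps the sketch self-contained (`clamp` is a hypothetical function under test):

```typescript
function clamp(value: number, min: number, max: number): number {
  return Math.min(Math.max(value, min), max);
}

// One table replaces three near-duplicate test cases; with a test
// runner, the same rows would be passed to it.each.
const cases: Array<[input: number, expected: number]> = [
  [-5, 0],  // below range clamps to min
  [5, 5],   // in range passes through
  [15, 10], // above range clamps to max
];

for (const [input, expected] of cases) {
  if (clamp(input, 0, 10) !== expected) {
    throw new Error(`clamp(${input}, 0, 10) should be ${expected}`);
  }
}
console.log("all cases pass");
```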

<state name="VERIFY">
  <prefix>🟡 TDD: VERIFY</prefix>
  <purpose>Tests ARE passing. Run full test suite + lint + build before claiming complete.</purpose>

  <pre_conditions>
    ✓ Tests ARE PASSING (from GREEN or REFACTOR)
    ✓ Code compiles (no compilation errors)
    ✓ Code lints (no linting errors)
    ✓ Either: Refactoring complete OR no refactoring needed
  </pre_conditions>

  <actions>
    1. Run full test suite (not just current test)
    2. Capture and show output
    3. Run lint
    4. Capture and show output
    5. Run build
    6. Capture and show output
    7. If ALL pass → Transition to COMPLETE
    8. If ANY fail → Transition to BLOCKED or RED
  </actions>

  <post_conditions>
    ✓ Full test suite executed
    ✓ All tests PASSED
    ✓ Test output shown
    ✓ Lint executed
    ✓ Lint PASSED
    ✓ Lint output shown
    ✓ Build executed
    ✓ Build SUCCEEDED
    ✓ Build output shown
  </post_conditions>

  <validation_before_completion>
    BEFORE claiming COMPLETE, announce:
    "Final validation:
    ✓ Full test suite: [X/X tests passed - output shown]
    ✓ Lint: [passed - output shown]
    ✓ Build: [succeeded - output shown]

    All validation passed. TDD cycle COMPLETE.

    Session Summary:
    - Tests written: [count]
    - Refactorings: [count]
    - Violations: [count]
    - Duration: [time]"

    IF any validation FAILED:
    "โš ๏ธ VERIFICATION FAILED
    Failed check: [which one]
    Output: [failure message]

    Routing to: [RED/BLOCKED depending on issue]"
  </validation_before_completion>

  <critical_rules>
    🚨 NEVER claim complete without full test suite
    🚨 NEVER claim complete without lint passing
    🚨 NEVER claim complete without build passing
    🚨 ALWAYS show output of each verification
    🚨 NEVER skip verification steps
  </critical_rules>

  <transitions>
    - VERIFY → COMPLETE (when all checks pass)
    - VERIFY → RED (when tests fail - regression detected)
    - VERIFY → REFACTOR (when lint fails - code quality issue)
    - VERIFY → BLOCKED (when build fails - structural issue)
  </transitions>
</state>

<state name="BLOCKED">
  <prefix>⚠️ TDD: BLOCKED</prefix>
  <purpose>Handle situations where progress cannot continue</purpose>

  <pre_conditions>
    ✓ Encountered issue preventing progress
    ✓ Issue is not user error or misunderstanding
  </pre_conditions>

  <actions>
    1. Clearly explain blocking issue
    2. Explain which state you were in
    3. Explain what you were trying to do
    4. Explain why you cannot proceed
    5. Suggest possible resolutions
    6. STOP and wait for user guidance
  </actions>

  <post_conditions>
    ✓ Blocker documented
    ✓ Context preserved
    ✓ Suggestions provided
    ✓ Waiting for user
  </post_conditions>

  <critical_rules>
    🚨 NEVER improvise workarounds
    🚨 NEVER skip steps to "unblock" yourself
    🚨 ALWAYS stop and wait for user
  </critical_rules>

  <transitions>
    - BLOCKED → [any state] (based on user guidance)
  </transitions>
</state>

<state name="VIOLATION_DETECTED">
  <prefix>🔥 TDD: VIOLATION_DETECTED</prefix>
  <purpose>Handle state machine violations</purpose>

  <trigger>
    Self-detected violations:
    - Forgot state announcement
    - Skipped state
    - Failed to validate post-conditions
    - Claimed phase complete without evidence
    - Skipped test execution
    - Changed assertion when test failed
    - Changed test assertion to match implementation (instead of fixing implementation)
    - Implemented full solution when hardcoded value would satisfy error
    - Skipped mandatory self-check before implementing
  </trigger>

  <actions>
    1. IMMEDIATELY announce: "🔥 STATE VIOLATION DETECTED"
    2. Explain which rule/state was violated
    3. Explain what you did wrong
    4. Announce correct current state
    5. Ask user permission to recover
    6. If approved, return to correct state
  </actions>

  <example>
    "🔥 STATE VIOLATION DETECTED

    Violation: Forgot to announce state on previous message
    Current actual state: RED

    Recovering to correct state...

    🔴 TDD: RED
    [continue from here]"
  </example>
</state>
Example techniques:
- Return wrong value (null, 0, "") to get meaningful assertion failure
- Hardcode expected value to pass, then add more tests to force real logic
- Never jump from "not implemented" to full solution

Concrete example:
- Error: "Not implemented"
- Test expects: componentCount: 0, linkCount: 0
- โŒ WRONG: Implement real resume() with graph parsing logic
- โœ… RIGHT: Return hardcoded stub `{ componentCount: 0, linkCount: 0 }`
- Then: Add more tests to force real implementation
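The concrete example above, as code. A sketch only: `resume` and the result shape come from the example, everything else is assumed.

```typescript
interface GraphStats {
  componentCount: number;
  linkCount: number;
}

// RIGHT: hardcode exactly what the current test asserts. Real graph
// parsing is only written once further tests force it.
function resume(): GraphStats {
  return { componentCount: 0, linkCount: 0 };
}

const stats = resume();
console.log(stats.componentCount, stats.linkCount); // 0 0
```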

<meta_rule> ALL rules enforced via state machine post-conditions. Skip validation → VIOLATION_DETECTED </meta_rule>

<critical_reminders>
🚨 EVERY message: state announcement
🚨 NEVER skip transitions or claim pass/fail without output
🚨 ALWAYS justify: does this address ONLY what error message demanded?
🚨 NEVER change assertions to make tests pass
🚨 NEVER guess - find evidence
</critical_reminders>