
qa-testing-methodology

QA test design patterns (equivalence partitioning, boundary analysis, accessibility). Auto-loads when designing test cases, planning test coverage, or writing test procedures.

Install with Tessl CLI

npx tessl i github:jpoutrin/product-forge --skill qa-testing-methodology

QA Testing Methodology Skill

Apply proven test design patterns for comprehensive test coverage.

Test Case Design Order

Always design tests in this order:

  1. Happy Path - Main success scenarios
  2. Validation Tests - Required fields, format validation
  3. Edge Cases - Boundary conditions, limits
  4. Error Scenarios - Invalid inputs, system errors
  5. Permission Tests - Access control, authorization
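The five-step order can be walked through against a small validator. The `signup` function and its rules below are invented purely for illustration:

```python
def signup(email, age, role="user"):
    """Hypothetical signup validator used only to illustrate the test order."""
    if not email:
        raise ValueError("email is required")        # validation
    if "@" not in email:
        raise ValueError("email format is invalid")  # validation
    if not 18 <= age <= 65:
        raise ValueError("age must be 18-65")        # edge case
    if role not in ("user", "admin"):
        raise PermissionError("role not permitted")  # permission
    return "created"

def raises(exc, fn, *args, **kwargs):
    """Helper: True if calling fn raises exc."""
    try:
        fn(*args, **kwargs)
    except exc:
        return True
    return False

# 1. Happy path
assert signup("test.user@example.com", 30) == "created"
# 2. Validation tests
assert raises(ValueError, signup, "", 30)
# 3. Edge cases (boundaries)
assert signup("test.user@example.com", 18) == "created"
assert raises(ValueError, signup, "test.user@example.com", 17)
# 4. Error scenarios
assert raises(ValueError, signup, "not-an-email", 30)
# 5. Permission tests
assert raises(PermissionError, signup, "test.user@example.com", 30, role="banned")
```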

Equivalence Partitioning

Divide input data into partitions where all values should behave identically.

Example: Age Input Field (18-65)

| Partition | Values | Expected Behavior |
|---|---|---|
| Below minimum | 0-17 | Reject with "Must be 18+" |
| Valid range | 18-65 | Accept |
| Above maximum | 66+ | Reject with "Maximum age is 65" |
| Invalid | -1, "abc", empty | Reject with validation error |

Test Strategy: Test ONE value from each partition, not every value.
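A minimal sketch of the partitions above, using a hypothetical `validate_age` function (the exact messages are taken from the table; the implementation is invented):

```python
def validate_age(value):
    """Hypothetical validator for the 18-65 age field above (sketch)."""
    try:
        age = int(value)
    except (TypeError, ValueError):
        return "Reject: validation error"
    if age < 0:
        return "Reject: validation error"
    if age < 18:
        return "Reject: Must be 18+"
    if age > 65:
        return "Reject: Maximum age is 65"
    return "Accept"

# One representative value per partition -- not every value
representatives = {
    "below minimum": 10,
    "valid range": 40,
    "above maximum": 70,
    "invalid": "abc",
}
assert validate_age(representatives["below minimum"]) == "Reject: Must be 18+"
assert validate_age(representatives["valid range"]) == "Accept"
assert validate_age(representatives["above maximum"]) == "Reject: Maximum age is 65"
assert validate_age(representatives["invalid"]) == "Reject: validation error"
```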

Boundary Value Analysis

Focus testing on boundaries where behavior changes.

Example: Password (8-20 characters)

| Boundary | Test Values | Expected |
|---|---|---|
| Just below minimum | 7 chars | Reject |
| At minimum | 8 chars | Accept |
| Just above minimum | 9 chars | Accept |
| Just below maximum | 19 chars | Accept |
| At maximum | 20 chars | Accept |
| Just above maximum | 21 chars | Reject |
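The six boundary cases from the table map directly onto a length check (the `password_length_ok` function is a hypothetical sketch of the 8-20 rule):

```python
def password_length_ok(password):
    """Hypothetical length rule for the 8-20 character password above."""
    return 8 <= len(password) <= 20

# The six boundary values from the table, as (length, expected) pairs
cases = [(7, False), (8, True), (9, True), (19, True), (20, True), (21, False)]
for length, expected in cases:
    assert password_length_ok("x" * length) is expected
```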

Test Prioritization Matrix

Prioritize tests based on risk and frequency:

| Priority | Risk | User Impact | Test Frequency |
|---|---|---|---|
| Critical | Data loss, security breach | All users blocked | Every build |
| High | Feature broken | Major workflow impacted | Every release |
| Medium | Inconvenient | Workaround available | Weekly |
| Low | Minor annoyance | Cosmetic issues | Monthly |

Accessibility Testing Checklist

Include in every test procedure:

Keyboard Navigation

  • All interactive elements focusable with Tab
  • Focus order is logical (left-to-right, top-to-bottom)
  • Focus indicator is visible
  • No keyboard traps

Screen Reader

  • All images have alt text
  • Form fields have labels
  • Error messages announced
  • Headings structured correctly (h1 → h2 → h3)

Visual

  • Color contrast meets WCAG AA (4.5:1 for text)
  • Information not conveyed by color alone
  • Text resizable to 200% without loss
  • No content flashes more than 3 times/second
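The 4.5:1 contrast criterion is computable rather than eyeballed. This sketch follows the WCAG relative-luminance formula for sRGB channels in 0-255:

```python
def _linearize(channel):
    """sRGB channel (0-255) to linear light, per the WCAG definition."""
    c = channel / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    r, g, b = (_linearize(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a, color_b):
    """WCAG contrast ratio: (lighter + 0.05) / (darker + 0.05)."""
    lighter, darker = sorted(
        (relative_luminance(color_a), relative_luminance(color_b)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

def meets_wcag_aa_text(fg, bg):
    return contrast_ratio(fg, bg) >= 4.5

# Black on white is the maximum possible ratio, 21:1
assert round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1) == 21.0
```

Mid-grey `#777777` on white comes out just under 4.5:1, which is why it fails AA for normal-size text.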

Performance Considerations

Note performance during manual testing:

| Metric | Acceptable | Needs Investigation |
|---|---|---|
| Page load | < 3 seconds | > 3 seconds |
| Button response | < 100ms | > 300ms |
| Form submission | < 2 seconds | > 5 seconds |
| Search results | < 1 second | > 2 seconds |
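The thresholds above can be encoded so a measurement is classified consistently. The metric names and the "borderline" label for readings between the two thresholds are assumptions of this sketch:

```python
# Thresholds in milliseconds, taken from the table above:
# (acceptable below, needs investigation above)
THRESHOLDS_MS = {
    "page_load":       (3000, 3000),
    "button_response": (100, 300),
    "form_submission": (2000, 5000),
    "search_results":  (1000, 2000),
}

def classify(metric, elapsed_ms):
    """Classify a timing: 'acceptable' below the first threshold,
    'needs investigation' above the second, 'borderline' in between."""
    acceptable, investigate = THRESHOLDS_MS[metric]
    if elapsed_ms < acceptable:
        return "acceptable"
    if elapsed_ms > investigate:
        return "needs investigation"
    return "borderline"

assert classify("page_load", 1500) == "acceptable"
assert classify("button_response", 400) == "needs investigation"
```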

Test Data Guidelines

DO

  • Use realistic but fake data
  • Document exact test data in test cases
  • Use consistent test accounts
  • Reset test data between runs when needed

DON'T

  • Use production data
  • Use generic placeholders ("enter something")
  • Share test credentials in plain text
  • Assume data from previous tests exists

Test Data Examples

Email: test.user@example.com
Password: Test@1234! (in password manager)
Phone: +1-555-0100 (test range)
Credit Card: 4111-1111-1111-1111 (test card)
Address: 123 Test Street, Test City, TS 12345
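The 4111-1111-1111-1111 test card is useful precisely because it passes the Luhn checksum that real card-number validators apply, while never charging anything. A sketch of that check:

```python
def luhn_valid(number):
    """Luhn checksum over the digits of a card number (separators ignored)."""
    digits = [int(ch) for ch in number if ch.isdigit()]
    total = 0
    # Double every second digit, counting from the right; subtract 9 if > 9
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

assert luhn_valid("4111-1111-1111-1111")
```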

Regression Testing

When to run regression tests:

| Trigger | Regression Scope |
|---|---|
| Bug fix | Related feature + integration points |
| New feature | All features that share data/UI |
| Dependency update | Full regression |
| Release candidate | Critical + High priority tests |

State-Based Testing

Test all valid state transitions:

Example: Order Status
┌─────────┐     ┌─────────┐     ┌─────────┐     ┌───────────┐
│ Created │ ──▶ │ Pending │ ──▶ │ Shipped │ ──▶ │ Delivered │
└─────────┘     └─────────┘     └─────────┘     └───────────┘
                     │                               │
                     ▼                               ▼
               ┌───────────┐                  ┌────────────┐
               │ Cancelled │                  │  Returned  │
               └───────────┘                  └────────────┘

Test:

  • Valid transitions (Created → Pending)
  • Invalid transitions (Delivered → Created)
  • Edge cases (Cancel while shipping)
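The diagram above can be captured as a transition table, which makes valid, invalid, and edge-case transitions mechanically checkable (a sketch; the "cancel while shipping" rule here simply reflects that the diagram draws no such edge):

```python
# Transition table matching the order-status diagram above
VALID_TRANSITIONS = {
    "Created":   {"Pending"},
    "Pending":   {"Shipped", "Cancelled"},
    "Shipped":   {"Delivered"},
    "Delivered": {"Returned"},
    "Cancelled": set(),  # terminal state
    "Returned":  set(),  # terminal state
}

def can_transition(current, target):
    return target in VALID_TRANSITIONS.get(current, set())

# Valid transition
assert can_transition("Created", "Pending")
# Invalid transition
assert not can_transition("Delivered", "Created")
# Edge case: cancel while shipping is not an edge in the diagram
assert not can_transition("Shipped", "Cancelled")
```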

Error Message Verification

Check error messages for:

  1. Clarity: User understands what went wrong
  2. Actionability: User knows how to fix it
  3. Tone: Professional, not blaming
  4. Security: No sensitive information exposed

Good vs Bad Error Messages

| Bad | Good |
|---|---|
| "Error 500" | "Something went wrong. Please try again." |
| "Invalid input" | "Email must be in format: name@example.com" |
| "User not found in database" | "No account found with this email" |
| "Password must match regex..." | "Password needs 8+ characters with a number" |
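Part of this check can even be automated with a rough lint over error strings. The patterns and word-count cutoff below are heuristic assumptions, not a standard:

```python
import re

# Heuristic patterns for technical/internal detail that should not reach users
TECHNICAL = re.compile(
    r"(error\s*\d{3}|regex|database|stack trace|sql|exception)", re.I
)

def lint_error_message(message):
    """Return a list of problems found (heuristic sketch, not exhaustive)."""
    issues = []
    if TECHNICAL.search(message):
        issues.append("exposes technical or internal detail")
    if len(message.split()) < 3:
        issues.append("too terse to be actionable")
    return issues

assert lint_error_message("Error 500") != []
assert lint_error_message("No account found with this email") == []
```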

Cross-Browser Testing Matrix

Minimum browser coverage:

| Browser | Desktop | Mobile |
|---|---|---|
| Chrome | Latest, Latest-1 | Android |
| Safari | Latest | iOS |
| Firefox | Latest | - |
| Edge | Latest | - |

Test Documentation Patterns

When to Screenshot

  • Initial state before test
  • After critical actions
  • Error states
  • Final/success state
  • Any unexpected behavior

Writing Clear Steps

Bad: Click the button
Good: Click the blue "Submit" button in the bottom-right of the form

Bad: Enter your details
Good: Enter "test@example.com" in the Email field

Bad: Verify it works
Good: Verify success message "Order placed successfully" appears

Mobile Testing Considerations

  • Test touch targets (minimum 44x44 pixels)
  • Test swipe gestures where applicable
  • Test orientation changes (portrait ↔ landscape)
  • Test with on-screen keyboard visible
  • Test with poor network (3G simulation)
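The touch-target minimum is easy to assert against measured element sizes (a trivial sketch; `touch_target_ok` is an invented helper name):

```python
MIN_TOUCH_TARGET_PX = 44  # minimum from the checklist above

def touch_target_ok(width_px, height_px):
    """Both dimensions must meet the 44px minimum."""
    return width_px >= MIN_TOUCH_TARGET_PX and height_px >= MIN_TOUCH_TARGET_PX

assert touch_target_ok(48, 48)
assert not touch_target_ok(44, 32)  # width passes, height fails
```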
Repository: jpoutrin/product-forge