Preflight
Run a pre-submission checklist before the contributor hits "Create Pull Request." This applies only to contributions to external open source projects; skip it for internal or personal projects. The skill assumes recon has been done and the venue decision has been made. If a PR is not the right venue, this skill should not be running: go back to propose.
Check 1: AI policy compliance
Based on recon findings:
- If the project requires AI disclosure, verify it is present in the correct format (commit tag, PR description section, checkbox, etc.).
- If the project bans AI contributions, stop. Do not proceed.
- If the project has no AI policy, STILL add voluntary disclosure to the PR description. This is not optional — transparency builds trust and prevents the appearance of hiding AI involvement. Draft a disclosure section for the contributor.
Disclosure should be specific: which tool, what it was used for, what was human-written vs. AI-assisted.
| Quality | Example |
|---|---|
| ❌ Insufficient | "I used AI to help write this." |
| ✅ Acceptable | "Code drafted with Claude Code, reviewed and modified by me, tests written manually." |
Research basis: Finding 5
Check 2: Diff size and focus
Compare the PR's diff against the project's typical PR size from recon:
- Flag if the diff exceeds 2x that baseline (or 500 lines if no baseline is available).
- Flag if the PR touches files unrelated to the stated issue.
- Flag if the PR bundles multiple unrelated changes.
- If oversized, help the contributor decompose into smaller PRs.
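The size check can be automated with a small sketch. The `main` base branch and the 500-line fallback threshold below are assumptions; substitute the project's default branch and the baseline from recon.

```shell
# count_diff_lines: sum insertions + deletions from `git diff --shortstat` output.
count_diff_lines() {
  grep -oE '[0-9]+ (insertion|deletion)' | awk '{ s += $1 } END { print s + 0 }'
}

BASE=${BASE:-main}            # assumption: the project's default branch
THRESHOLD=${THRESHOLD:-500}   # or 2x the typical PR size from recon
LINES=$(git diff --shortstat "$BASE"...HEAD 2>/dev/null | count_diff_lines)
if [ "${LINES:-0}" -gt "$THRESHOLD" ]; then
  echo "FLAG: diff is $LINES lines (threshold $THRESHOLD); consider splitting"
fi
```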
Research basis: Finding 4, Finding 16
Check 3: PR template compliance
If the project has a PR template:
- Verify every section is filled. Do not delete template sections.
- Verify the linked issue reference is present.
- Verify any required checkboxes (AI disclosure, testing confirmation, CLA) are addressed.
If no template exists, verify the PR description at minimum contains:
- What changed and why
- Reference to the issue being addressed
- How to test the change
Research basis: Finding 1
Check 4: Style and convention compliance
Verify the contribution matches project conventions discovered during recon:
- Code formatting matches .editorconfig, linter, and formatter configs.
- Commit messages match the project's convention (Conventional Commits, imperative mood, signed-off, etc.).
- Import ordering, naming conventions, and comment style match existing code.
- If no config files exist, check 3-5 existing files in the same directory for: indentation (spaces vs tabs, 2 vs 4), line length, comment style, naming conventions (camelCase vs snake_case). Match the majority pattern.
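The majority-pattern check above can be sketched as a quick census. This is a heuristic, not a parser; the `*.py` glob is an example only, and a file can match more than one bucket.

```shell
# indent_census: tally indentation styles across existing files so new code
# can match the majority. Heuristic: greps for leading tab / 4-space / 2-space.
indent_census() {
  dir=$1; glob=$2; tab=$(printf '\t')
  nt=$(grep -rlE "^${tab}" "$dir" --include="$glob" 2>/dev/null | wc -l)
  n4=$(grep -rlE '^    [^ ]' "$dir" --include="$glob" 2>/dev/null | wc -l)
  n2=$(grep -rlE '^  [^ ]' "$dir" --include="$glob" 2>/dev/null | wc -l)
  printf 'tab-indented files: %d\n4-space files: %d\n2-space files: %d\n' "$nt" "$n4" "$n2"
}

indent_census . '*.py'   # '*.py' is a placeholder glob
```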
Research basis: Finding 9
Check 5: Tests and CI
- Verify tests are included if the project expects them.
- Verify tests match the project's testing patterns (framework, file naming, assertion style, fixtures).
- Verify the contributor has run the project's CI checks locally (linters, test suite, build) and they pass.
- Do not rely on CI to catch problems — that shifts the burden to maintainers.
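A minimal runner for the local checks, stopping at the first failure. The commented usage line shows placeholder commands; the real lint/test/build commands come from CONTRIBUTING.md or the project's CI config.

```shell
# run_checks: run each check command in order, stop at the first failure.
run_checks() {
  for cmd in "$@"; do
    echo ">> $cmd"
    if ! sh -c "$cmd"; then
      echo "FAILED: $cmd (fix locally before opening the PR)"
      return 1
    fi
  done
  echo "All local checks passed"
}

# run_checks "make lint" "make test" "make build"   # placeholder commands
```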
Research basis: Finding 4
Check 5.5: Changelog and metadata
- If CHANGELOG.md exists and CONTRIBUTING.md mentions updating it, verify the contributor has added an entry under [Unreleased] for the change.
- Check for other metadata requirements: AUTHORS file updates, version bumps, etc.
- This is one of the most commonly missed steps — agents fix the code but forget the housekeeping.
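A sketch of the changelog check. The `## [Unreleased]` heading assumes Keep-a-Changelog conventions, and `main` as the base branch is a placeholder; both vary by project.

```shell
# has_unreleased: does a changelog contain a '## [Unreleased]' heading?
has_unreleased() { grep -q '^## \[Unreleased\]' "$1" 2>/dev/null; }

BASE=${BASE:-main}   # assumption: the project's default branch
if [ -f CHANGELOG.md ]; then
  git diff --name-only "$BASE"...HEAD 2>/dev/null | grep -qx 'CHANGELOG.md' \
    || echo "WARN: CHANGELOG.md exists but this branch does not touch it"
  has_unreleased CHANGELOG.md || echo "NOTE: no '## [Unreleased]' heading found"
fi
```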
Check 6: Legal requirements
- If DCO sign-off is required, verify the contributor (not the agent) has added Signed-off-by: to commits.
- If CLA is required, remind the contributor they will need to sign it.
- If the license has compatibility concerns with AI tool terms, flag them.
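The DCO check above can be scripted. A sketch that lists commits missing the trailer; `main..HEAD` in the usage line is a placeholder range for the branch under review.

```shell
# missing_signoffs: print commits in a range that lack a Signed-off-by trailer.
# Any output is a blocker for DCO projects.
missing_signoffs() {
  for c in $(git rev-list "$1" 2>/dev/null); do
    git log -1 --format=%B "$c" | grep -q '^Signed-off-by:' || echo "$c"
  done
}

# Usage: missing_signoffs main..HEAD   ('main' is a placeholder base branch)
```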
Research basis: Finding 10
Check 7: Agent artifact cleanup
Verify the contribution does not include:
- .claude/, .cursor/, .aider/, or other agent tool directories
- AI-generated comments that describe intent rather than reality (e.g., "This function efficiently handles..." when it doesn't)
- Characteristic AI verbosity in code comments, commit messages, or PR description
- Hallucinated package names or dependencies that don't exist
To detect leaked agent directories, run:

```shell
find . -type d \( -name '.claude' -o -name '.cursor' -o -name '.aider' -o -name '.continue' \) ! -path './.git/*'
```

Any output from this command is a blocker; remove those directories before submission.
Research basis: Finding 13, Finding 15
Check 8: Slop detector patterns
Verify the contribution does not exhibit patterns flagged by automated slop detectors:
- Multiple PRs opened in rapid succession across different repos
- Large unfocused diffs with AI formatting tells (excessive blank lines, verbose comments, placeholder TODOs, unused imports, code that doesn't match surrounding style)
- PR submitted by a contributor with no prior engagement in the project
- Fire-and-forget pattern (warn the contributor they MUST stay engaged through review)
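A grep over the diff can surface some of these formatting tells. The patterns below are heuristic examples, not an exhaustive detector; matches are prompts for human review, not proof of slop.

```shell
# slop_grep: flag added diff lines matching common placeholder/boilerplate tells.
slop_grep() {
  grep -E '^\+.*(TODO: implement|TODO: add|placeholder|This function efficiently|In this function, we)'
}

# Usage: git diff main...HEAD | slop_grep   ('main' is a placeholder base branch)
```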
Research basis: Finding 12
Check 9: Human ownership verification
Before submitting, confirm with the contributor:
- Can you explain every change in this PR without AI assistance?
- Can you respond to reviewer questions about design decisions personally?
- Are you committed to iterating on review feedback until the PR converges or is withdrawn?
If the answer to any of these is no, the PR is not ready.
Research basis: Finding 6, Finding 7
Produce the preflight report
Summarize as a checklist with pass/fail/warning for each check. For any failures, explain what needs to change before submission. For warnings, explain the risk and let the contributor decide.
Do not submit the PR. That is the contributor's action.