Repository: https://github.com/Rebooted-Dev/Workflow-Scripts
This directory contains structured workflow instructions for common development tasks in the host project. These workflows provide consistent, repeatable processes for planning, reviewing, implementing, debugging, and documenting code changes.
Sharing Workflows Across Projects: See SHARING_AND_SYNC.md for the supported sync models. The recommended approach is a multi-repo setup: this directory lives inside each host project as a local clone of its own git repository (e.g. `Workflow-Scripts/` or `workflows/`). The host project then has multiple repositories (the main app repo plus this workflows repo), and the main project's `.gitignore` must list this directory so the main repo does not track it.
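The multi-repo wiring above can be sketched in a few shell commands. This is a minimal illustration, not the supported setup script: the clone step is shown as a comment (network access and the exact repo layout vary), and `workflows` is just one of the suggested directory names.

```shell
set -eu
cd "$(mktemp -d)"            # stand-in for the host project root
git init -q .
# In a real setup, clone the shared workflows repo here:
#   git clone https://github.com/Rebooted-Dev/Workflow-Scripts workflows
mkdir workflows              # placeholder for the clone in this sketch
# The host repo must ignore the nested clone so it stays untracked:
grep -qx 'workflows/' .gitignore 2>/dev/null || echo 'workflows/' >> .gitignore
git check-ignore -q workflows && echo "workflows/ is ignored by the host repo"
```

`git check-ignore` confirms the nested clone is invisible to the host repo, which is the property the multi-repo model depends on.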
Why Workflows?
Workflows ensure that:
- Tasks are completed systematically and thoroughly
- Quality standards are consistently applied
- Priority and severity are assessed uniformly
- Documentation stays accurate and up-to-date
- Team members can follow the same processes
The workflows are organized into ten categories:
- **Orchestrator** (`00-Meta-Workflow/00-orchestrator/`) - NEW: Launch non-interactive OpenCode processes to delegate workflows to different models
- **Initial Setup** (`00-project-setup/`) - Set up new projects with dual repository management and a troubleshooting system
- **Planning** (`01-Planning & Organizing/`) - Create structured implementation plans
- **Build/Code** (`02-code-build/`) - Execute implementation with verification
- **Debugging** (`03-debugging/`) - Systematically identify and fix bugs
- **Documentation** (`04-documentation/`) - Keep documentation in sync with code
- **Review/Audit** (`05-review/`) - Review code and plans for quality, correctness, and security
- **Security** (`06-security/`) - Security reviews and vulnerability fixes
- **Meta** (`00-Meta-Workflow/00-meta/`) - Templates, rubrics, and analysis documents about these workflows
- **Docs** (`00-Meta-Workflow/00-docs/`) - Code review reports, archived reviews, and analysis documents generated by workflows
Note: The 00-meta/ directory contains templates, rubrics, and analysis/review documents about the workflows themselves (e.g., sync summary template, severity-priority rubric, parallel agent usage reviews, filename reviews). These are not workflow instructions but rather supporting documents for workflow design and execution.
| Task | Workflow | Location |
|---|---|---|
| Automated plan review (different model) | Orchestrator Review | 00-Meta-Workflow/00-orchestrator/orchestrator-review.sh ← DELEGATED |
| Setting up new project | Project Setup | 00-project-setup/01-setup-project.md |
| Researching & creating plan | Research & Plan | 01-Planning & Organizing/00-research-and-plan.md ← START HERE |
| Starting a new feature | Implementation Plan | 01-Planning & Organizing/02-finalise-plan.md |
| Reviewing a plan | Plan Review | 01-Planning & Organizing/01-plan-review.md |
| Optimizing code performance | Code Optimization | 05-review/02-code-optimization.md |
| Refactoring code | Code Refactoring | 05-review/03-code-refactoring.md |
| Security audit | Security Review | 06-security/01-security-review.md |
| Implementing changes | Execution | 02-code-build/01-execution.md |
| Fixing a bug | Bug Fix | 03-debugging/02-bug-fix-workflow.md |
| Fixing security issues | Security Fix | 06-security/02-security-fix.md |
| Updating docs | Sync Documentation | 04-documentation/02-sync-documentation.md |
Code Review vs Security Review:
- Use Code Review for routine pre-merge checks and general code quality. Includes basic security scanning.
- Use Security Review for dedicated security audits, quarterly assessments, or security-critical changes.
- Timing: Run Code Review before every merge. Run Security Review quarterly, after auth/data changes, or before releases handling sensitive data.
1. Planning → Create implementation plan
2. Review → Review the plan for issues
3. Planning → Refine plan based on feedback
4. Development → Implement changes
5. Review → Code review before merge
6. Documentation → Update docs if needed
Purpose: Launch non-interactive OpenCode processes to perform plan reviews using a different model, capture output to files, and manage the workflow from an orchestrator.
When to use:
- You want to use a different model for plan review than your main orchestrator
- Need to run reviews in batch or automated mode
- Want to parallelize reviews across multiple models
- Need CI/CD integration for automated plan validation
How to use:
- Use the shell script: `./00-Meta-Workflow/00-orchestrator/orchestrator-review.sh plans/my-plan.md -m openai/gpt-4o`
- The script launches OpenCode non-interactively with the specified model
- Review output is captured to the `plans/reviews/` directory
- The orchestrator manages the results and next steps
Key benefits:
- Use lightweight models for quick scans, powerful models for deep analysis
- Run multiple reviews in parallel (security, architecture, general)
- Integrate with CI/CD pipelines
- Capture structured output for further processing
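The parallel-review pattern above can be sketched as one background process per model, each writing its output to `plans/reviews/`. Since the real CLI invocation may differ from this sketch, `opencode` is stubbed with a shell function here so the example is self-contained and runnable; the plan path and model names are placeholders.

```shell
set -eu
cd "$(mktemp -d)"
opencode() { echo "stub review of $2 by model $4"; }   # stand-in for the real CLI
mkdir -p plans/reviews
plan="plans/my-plan.md"
for model in openai/gpt-4o anthropic/claude-sonnet; do
  out="plans/reviews/$(basename "$plan" .md)-${model##*/}.md"
  opencode run "$plan" -m "$model" > "$out" &          # one background job per model
done
wait                                                   # block until all reviews finish
ls plans/reviews
```

Each model's output lands in its own dated-or-named file, which is what lets the orchestrator compare reviews or feed them into later workflow steps.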
Purpose: Conduct deep research and create a comprehensive initial implementation plan from a goal or problem statement.
When to use:
- You have a goal but no detailed plan yet
- Starting a new feature, refactor, or significant change
- Need to research approaches before committing to a solution
How to use:
- Provide the goal or problem statement
- The workflow will research the codebase and external options
- Creates an implementation plan in the `plans/` directory
- Then proceed to plan review
Purpose: Generate a consolidated, priority-ordered implementation plan from requirements and feedback.
When to use:
- Starting a new feature or refactoring
- Consolidating multiple planning documents
- Creating a roadmap for complex changes
How to use:
- Provide the primary plan document path
- Include any feedback or review comments
- The workflow will generate a priority-ordered plan (P0 → P3)
- Plan is saved to `plans/` (project root) with a dated filename
Example:
User: "Create an implementation plan for adding user authentication.
Review the existing plan at plans/auth-plan.md and incorporate
feedback from the team."
Workflow will:
- Read plans/auth-plan.md
- Analyze codebase for feasibility
- Generate priority-ordered phases
- Save to plans/implementation-plan-auth-YYMMDD-HHMM-{model}.md
Key Features:
- Priority ordering (P0 = blockers, P3 = backlog)
- Dependency mapping
- Effort estimation (Small/Medium/Large)
- Clear exit criteria per phase
- Uses the shared rubric from `00-Meta-Workflow/00-meta/severity-priority-rubric.md`
Purpose: Perform structured code review identifying defects, risks, and refactoring opportunities.
When to use:
- Before merging code changes
- Periodic code quality audits
- After major refactoring
How to use:
- Specify repository root (or specific focus areas)
- Workflow scans codebase using parallel agents
- Findings are scored with severity (S0-S3) and priority (P0-P3)
- Report saved to `plans/` (project root) with a dated filename
Example:
User: "Perform a code review focusing on security and error handling
in the services/ directory."
Workflow will:
- Scan services/ directory
- Identify security issues, bugs, and risks
- Score each finding (S0-S3, P0-P3)
- Generate report: plans/code-review-YYMMDD-HHMM-{model}.md
Output Format:
- Summary with top P0/P1 risks
- Findings ordered by priority, then severity
- Each finding includes:
- File path and line reference
- Observed behavior and impact
- Severity/priority with rationale
- Suggested fix
- Verification steps
Purpose: Perform structured analysis to identify performance bottlenecks, resource inefficiencies, and optimization opportunities.
When to use:
- When performance issues are reported or suspected
- Before scaling or handling increased load
- Periodic performance audits
- When bundle size or resource usage is a concern
How to use:
- Specify repository root (or specific focus areas)
- Workflow scans codebase using parallel agents focused on performance
- Findings are scored with severity (S0-S3) and priority (P0-P3)
- Report saved to `plans/` (project root) with a dated filename
Example:
User: "Analyze performance bottlenecks in the data processing pipeline."
Workflow will:
- Scan data processing code
- Identify performance issues, resource inefficiencies
- Score each finding (S0-S3, P0-P3)
- Generate report: plans/code-optimization-YYMMDD-HHMM-{model}.md
Output Format:
- Summary with top P0/P1 performance risks
- Findings ordered by priority, then severity
- Each finding includes:
- File path and line reference
- Current performance characteristics
- Observed impact (latency, throughput, resource usage)
- Severity/priority with rationale
- Suggested optimization with expected improvement
- Verification steps to measure improvement
Focus Areas:
- Algorithm complexity and efficiency
- Database query optimization
- Network request batching and caching
- Memory usage and leaks
- Bundle size and code splitting
- Rendering performance
- Concurrent operations
Purpose: Perform structured analysis to identify code quality issues, technical debt, and refactoring opportunities.
When to use:
- When code becomes difficult to maintain or extend
- Before adding new features to complex areas
- Periodic code quality audits
- When technical debt is accumulating
How to use:
- Specify repository root (or specific focus areas)
- Workflow scans codebase using parallel agents focused on code quality
- Findings are scored with severity (S0-S3) and priority (P0-P3)
- Report saved to `plans/` (project root) with a dated filename
Example:
User: "Identify refactoring opportunities in the authentication module."
Workflow will:
- Scan authentication code
- Identify code duplication, complexity, maintainability issues
- Score each finding (S0-S3, P0-P3)
- Generate report: plans/code-refactoring-YYMMDD-HHMM-{model}.md
Output Format:
- Summary with top P0/P1 refactoring priorities
- Findings ordered by priority, then severity
- Each finding includes:
- File path and line reference
- Current code structure and issue description
- Impact on maintainability, readability, extensibility
- Severity/priority with rationale
- Suggested refactoring approach with rationale
- Verification steps to ensure behavior is preserved
Focus Areas:
- Code duplication and DRY violations
- Long functions and files (complexity)
- Poor naming and unclear abstractions
- Tight coupling and low cohesion
- Missing or inappropriate design patterns
- Inconsistent code style
- Dead code and unused dependencies
Purpose: Review implementation plans for correctness, risk, feasibility, and completeness.
When to use:
- Before starting implementation
- When a plan seems incomplete or risky
- After receiving a plan from another team member
How to use:
- Provide the plan document path
- Workflow analyzes the plan against the codebase
- Feedback is appended to the plan document
- Findings are priority-ordered (P0 → P3)
Example:
User: "Review the plan at plans/feature-x-260118-1430-claude.md"
Workflow will:
- Read and analyze the plan
- Validate technical feasibility
- Identify design flaws, risks, missing steps
- Append feedback section to the plan file
Output Format:
- Addendum appended to plan with dated header
- Sections: P0, P1, P2, P3
- Each item includes:
- Severity and priority
- Rationale with evidence
- Actionable fix or alternative
- File/line references when applicable
Purpose: Execute implementation in phases with verification and documentation.
When to use:
- Implementing features from a plan
- Making code changes that need verification
- Ensuring changes are properly documented
How to use:
- Confirm goal and acceptance criteria
- Check repository state (`git status`)
- Break work into phases with exit criteria
- For each phase:
  - Implement the smallest change
  - Verify (`npm run build`, `npm run dev`)
  - Update the task list and changelog
- Final verification before completion
Example:
User: "Implement the user authentication feature from the plan.
Start with Phase 1: API integration."
Workflow will:
- Read the implementation plan
- Implement Phase 1 changes
- Run npm run build
- Test in dev server
- Update the changelog (`docs/CHANGELOG.md` preferred)
- Update task list with checkboxes (`- [✅]` for completed, `- [ ]` for pending)
- Proceed to next phase
Phase Structure:
- Phase Definition: Scope, out-of-scope, exit criteria
- Implement: Smallest change that satisfies scope
- Verify: Build, test, fix if needed
- Report: Update changelog, troubleshooting log, task list
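The Verify step in each phase acts as a gate: the phase is not complete until the build passes. A minimal sketch, with `npm` stubbed out so the example is self-contained; in a real project the calls run the host's actual scripts.

```shell
set -eu
npm() { echo "npm $*: ok"; }          # stub; a real project invokes the actual CLI
verify_phase() {
  # Gate: the phase only completes when the build succeeds.
  npm run build || { echo "build failed for $1; fix before proceeding"; return 1; }
  echo "phase verified: $1"
}
verify_phase "Phase 1: API integration"
```

Treating verification as a function that can fail (rather than an optional checklist item) is what makes "don't skip verification steps" enforceable.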
Task List Format:
- `- [✅]` Completed items (green check mark; mark immediately after completion)
- `- [ ]` Pending items (not yet started or in progress)
Documentation Updates:
- Changelog: update `docs/CHANGELOG.md` (preferred) or `CHANGELOG.md` with `- YYYY-MM-DD: Description`
- Troubleshooting: add an entry under `troubleshooting/` and update `troubleshooting/index.md`
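Appending a changelog entry in the `- YYYY-MM-DD: Description` format is a one-liner; this sketch uses a throwaway directory and an example description.

```shell
set -eu
cd "$(mktemp -d)"                    # throwaway stand-in for the project root
mkdir -p docs
# date +%F expands to YYYY-MM-DD, matching the convention above
echo "- $(date +%F): Added user authentication (Phase 1)" >> docs/CHANGELOG.md
tail -n 1 docs/CHANGELOG.md
```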
Purpose: Systematically identify and fix bugs using hypothesis-driven investigation.
When to use:
- When a bug is reported or discovered
- When tests are failing
- When unexpected behavior occurs
How to use:
- Gather information: logs, screenshots, reproduction steps
- Formulate hypotheses about root cause
- Investigate using parallel agents
- Identify problem and root cause
- Create implementation and testing plan
- Implement fix
- Verify with tests
- Update changelog and troubleshooting log
Example:
User: "The image generation is failing with error 'API key invalid'.
Here are the logs: [logs]"
Workflow will:
- Analyze logs and error message
- Form hypotheses (key format, env var, service config)
- Investigate codebase with parallel agents
- Identify root cause
- Create fix plan
- Implement fix
- Test verification
- Add a `troubleshooting/` entry and update `troubleshooting/index.md`
Process:
- Intake: Gather logs, screenshots, reproduction steps
- Hypothesis: Formulate likely causes
- Investigation: Use parallel agents to test hypotheses
- Identification: Determine root cause
- Planning: Create fix and test plan
- Implementation: Apply fix
- Verification: Test until bug is resolved
- Documentation: Update logs
Purpose: Perform a structured security review identifying vulnerabilities, security risks, and compliance issues.
When to use:
- Before releases or deployments
- After major code changes
- Periodic security audits
- When security requirements change
- After security incidents
How to use:
- Specify repository root (or focus areas)
- Workflow scans codebase using 6 parallel agents focused on different security domains
- Findings are scored with severity (S0-S3) and priority (P0-P3)
- Report saved to `plans/` (project root) with a dated filename
Example:
User: "Perform a security review focusing on authentication and API endpoints."
Workflow will:
- Scan auth files, API endpoints, and related code
- Identify vulnerabilities (injection, XSS, auth bypass, etc.)
- Score each finding (S0-S3, P0-P3)
- Generate report: plans/security-review-YYMMDD-HHMM-{model}.md
Security Focus Areas:
- Authentication and session management
- Authorization and access control
- Input validation and injection risks
- Sensitive data exposure
- Dependency vulnerabilities
- Cryptographic issues
- Security misconfigurations
- And more (OWASP Top 10 coverage)
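For illustration only, here is one kind of check a sensitive-data scan covers: a crude pattern search for hardcoded secrets. The patterns and sample file are assumptions for the sketch, not the workflow's actual implementation.

```shell
set -eu
cd "$(mktemp -d)"
mkdir -p src
cat > src/config.js <<'EOF'
const apiKey = "sk-test-1234567890abcdef"; // hardcoded secret: should be an env var
EOF
# Case-insensitive scan for common secret-bearing identifiers being assigned
grep -rniE '(api[_-]?key|secret|password)[[:space:]]*[:=]' src/
```

A real review would go well beyond this (entropy checks, dependency audits, auth-flow tracing), but even a grep pass like this surfaces the most blatant findings.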
Output Format:
- Summary with top P0/P1 security risks
- Findings ordered by priority, then severity
- Each finding includes:
- Vulnerability type and classification
- Attack vector and exploitability
- Security impact assessment
- Suggested fix with security best practices
- Verification steps
Purpose: Systematically identify, fix, and verify security vulnerabilities.
When to use:
- When a security vulnerability is discovered
- After receiving a security review report
- When addressing security advisories
- When fixing reported security issues
How to use:
- Provide security review report or vulnerability description
- Workflow investigates using parallel agents
- Implements fix following security best practices
- Verifies fix and tests for regression
- Updates documentation
Example:
User: "Fix the SQL injection vulnerability identified in the security review
at plans/security-review-260118-1400-claude.md, issue #3."
Workflow will:
- Read security review report
- Investigate the vulnerability
- Implement fix (parameterized queries, input validation)
- Add security tests
- Verify fix works and doesn't break functionality
- Add a `troubleshooting/` entry and update `troubleshooting/index.md`
Process:
- Intake: Read security report, understand vulnerability
- Investigation: Use parallel agents to trace vulnerability
- Root Cause: Identify exact cause and attack vector
- Planning: Create security fix plan with defense-in-depth
- Implementation: Fix with multiple agents (fix, tests, validation)
- Verification: Test fix, check for regression, verify no new issues
- Documentation: Update CHANGELOG and TROUBLESHOOTING
Security Fix Best Practices:
- Strong authentication and authorization
- Input validation and output encoding
- Secure secrets management
- Updated dependencies
- Defense in depth measures
Purpose: Review code and update documentation to match the codebase accurately.
When to use:
- After major code changes
- When documentation seems outdated
- Periodic documentation maintenance
- Before releases
How to use:
- Workflow scans codebase to understand current behavior
- Inventories existing docs and tags issues (P0-P3)
- Fixes in priority order:
- P0: Incorrect docs causing wrong usage
- P1: Missing critical docs
- P2: Reorganization and consolidation
- P3: Diagrams and polish
- Reorganizes `docs/` if needed
- Adds file maps and diagrams
Example:
User: "Sync documentation after the authentication feature was added."
Workflow will:
- Scan codebase for auth-related code
- Check docs/ for auth documentation
- Identify missing/incorrect docs
- Update or create docs in priority order
- Add file maps if needed
- Cross-link related docs
Priority Buckets:
- P0: Incorrect docs causing wrong usage, broken setup, unsafe behavior
- P1: Missing docs for critical flows (setup, run, build, architecture)
- P2: Reorganization, consolidation, cross-links, reference completeness
- P3: Diagrams, polish, deep examples
Output:
- Organized `docs/` directory
- Accurate, up-to-date documentation
- File maps for navigation
- Cross-linked related docs
Purpose: Shared standard for scoring issues across all workflows.
When to use:
- Referenced automatically by all review and planning workflows
- Use when manually prioritizing issues
- Use when creating reports or plans
Severity Levels:
- S0 Critical: Security breach, data loss, total outage
- S1 High: Major functionality broken, wide user impact
- S2 Medium: Partial failure, workaround exists
- S3 Low: Minor UX, cosmetic, maintainability
Priority Mapping:
- P0 Blocker: High impact + Likely/Possible → Fix before merge
- P1 Urgent: High impact + Rare, or Medium + Likely → Fix before release
- P2 Soon: Medium + Possible/Rare, or Low + Likely → Fix next sprint
- P3 Backlog: Low + Possible/Rare → Track and defer
Ordering Rule:
- Present items: P0 → P1 → P2 → P3
- Within same priority: S0 → S1 → S2 → S3
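The ordering rule is mechanical enough to express as a plain sort: priority field first, then severity. The finding lines below are made-up examples in an assumed `P<p> S<s> <title>` layout.

```shell
set -eu
cd "$(mktemp -d)"
cat > findings.txt <<'EOF'
P2 S1 Missing input validation on search form
P0 S2 Build fails on clean checkout
P1 S1 Race condition in session refresh
P0 S0 Hardcoded API key in client bundle
EOF
# Sort by priority (field 1), then severity (field 2): P0->P3, S0->S3 within each
sort -k1,1 -k2,2 findings.txt
```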
Purpose: Quick reference for common workflow terminology and conventions.
When to use:
- Unfamiliar with workflow terminology (P0-P3, S0-S3, multi-repo, etc.)
- Need to understand task marking conventions (`- [✅]` vs `- [ ]`)
- Looking up common placeholder meanings
- Understanding workflow category purposes
Covers:
- Priority and severity level definitions
- Workflow category purposes
- Task marking conventions
- Common placeholders
- Agent concepts
Step 1: Planning
→ Use "Implementation Plan" workflow
→ Input: Feature requirements
→ Output: plans/feature-auth-260118-1430-claude.md
Step 2: Review
→ Use "Plan Review" workflow
→ Input: plans/feature-auth-260118-1430-claude.md
→ Output: Feedback appended to plan
Step 3: Refine Plan
→ Use "Implementation Plan" workflow again
→ Input: Original plan + review feedback
→ Output: Updated plan
Step 4: Development
→ Use "Execution" workflow
→ Input: Refined plan
→ Output: Implemented code, updated `docs/CHANGELOG.md` (preferred)
Step 5: Code Review
→ Use "Code Review" workflow
→ Input: Repository root
→ Output: plans/code-review-260118-1600-claude.md
Step 6: Documentation
→ Use "Sync Documentation" workflow
→ Input: Repository root
→ Output: Updated docs/
Step 1: Debug
→ Use "Debug" workflow
→ Input: Bug report, logs, screenshots
→ Output: Fixed code, added a `troubleshooting/` entry
Step 2: Code Review
→ Use "Code Review" workflow
→ Input: Repository root (focus on fix area)
→ Output: Verification that fix is correct
Step 3: Documentation (if needed)
→ Use "Sync Documentation" workflow
→ Input: Repository root
→ Output: Updated docs if bug affected user-facing behavior
Step 1: Code Review
→ Use "Code Review" workflow
→ Input: Repository root
→ Output: plans/code-review-260118-1000-claude.md
Step 2: Planning
→ Use "Implementation Plan" workflow
→ Input: Code review findings
→ Output: plans/tech-debt-260118-1100-claude.md
Step 3: Development
→ Use "Execution" workflow
→ Input: Tech debt plan
→ Output: Refactored code
Step 4: Documentation
→ Use "Sync Documentation" workflow
→ Input: Repository root
→ Output: Updated docs/
Step 1: Security Review
→ Use "Security Review" workflow
→ Input: Repository root
→ Output: plans/security-review-260118-1400-claude.md
Step 2: Security Fix (for critical issues)
→ Use "Security Fix" workflow
→ Input: Security review report, P0/S0 vulnerabilities
→ Output: Fixed code, added a `troubleshooting/` entry
Step 3: Code Review
→ Use "Code Review" workflow
→ Input: Repository root (focus on security fixes)
→ Output: Verification that fixes are correct
Step 4: Documentation (if needed)
→ Use "Sync Documentation" workflow
→ Input: Repository root
→ Output: Updated security documentation
Step 1: Code Optimization Review
→ Use "Code Optimization" workflow
→ Input: Repository root (or specific focus area)
→ Output: plans/code-optimization-260118-1500-claude.md
Step 2: Planning
→ Use "Implementation Plan" workflow
→ Input: Optimization report findings
→ Output: plans/optimization-plan-260118-1530-claude.md
Step 3: Implementation
→ Use "Execution" workflow
→ Input: Optimization plan
→ Output: Optimized code, updated `docs/CHANGELOG.md` (preferred)
Step 4: Code Refactoring Review
→ Use "Code Refactoring" workflow
→ Input: Repository root (or specific focus area)
→ Output: plans/code-refactoring-260118-1600-claude.md
Step 5: Planning
→ Use "Implementation Plan" workflow
→ Input: Refactoring report findings
→ Output: plans/refactoring-plan-260118-1630-claude.md
Step 6: Implementation
→ Use "Execution" workflow
→ Input: Refactoring plan
→ Output: Refactored code, updated `docs/CHANGELOG.md` (preferred)
Step 7: Code Review
→ Use "Code Review" workflow
→ Input: Repository root (focus on optimized/refactored areas)
→ Output: Verification that changes are correct and maintain functionality
- All workflows use P0 → P3 priority ordering
- Focus on P0/P1 items first
- Defer P3 items unless they unblock higher priorities
- Use `npm run build` to verify changes
- Test in the dev server (`npm run dev`) when applicable
- Don't skip verification steps
- Update the changelog after each phase (`docs/CHANGELOG.md` preferred)
- Add a `troubleshooting/` entry for bug fixes and update `troubleshooting/index.md`
- Update the implementation plan after each code build or bug fix: use the single source of truth for marking, completion markers, and archiving completed plans: `04-documentation/03-mark-completed.md`
- Keep documentation in sync with code
All completion marking rules are centralized in `04-documentation/03-mark-completed.md` — ✅ checkboxes, completion markers, and archiving completed plans into the project changelog system.
- Many workflows support parallel agents with a flexible pattern
- Workflows provide suggested agent roles, but you should spawn additional agents as needed
- Adapt agent count and roles based on task complexity and discovered concerns
- Use parallel batch reading (read multiple files concurrently) to maximize speed
- Verify findings directly before acting
- Prefer smallest viable change
- Break large features into phases
- Each phase should have clear exit criteria
- Always include evidence for findings
- Cite file paths and line numbers
- Avoid unverified claims or assumptions
- Avoid over-engineering
- Push speculative refactors to P3
- Focus on concrete, measurable goals
Workflow files use a mixed naming convention that balances clarity and organization:
Numbered Prefixes (e.g., 01-, 02-):
- Used when files have a clear sequence or workflow order
- Example: `01-plan-review.md` → `02-finalise-plan.md` (review before finalizing)
- Example: `01-execution.md` → `02-confirm-execution.md` (execute then confirm)
- Note: In some directories (like `02-code-build/`), numbers indicate workflow sequence or documentation depth - see directory READMEs for clarification
Descriptive Names:
- Used when files are standalone or don't have a clear sequence
- Example: `02-sync-documentation.md`, `bug-fix-workflow.md`
- Makes purpose immediately clear from the filename
When in doubt:
- Check the directory's README.md for workflow sequence guidance
- Look for "When to Use" sections in workflow files
- Use the main README's decision trees and quick start guides
When workflows generate reports or analysis documents, follow the convention defined in 00-Meta-Workflow/00-meta/naming-conventions.md.
Quick Reference:
- Format: `{report-type}-YYMMDD-HHMM-{model}.md`
- YYMMDD: 2-digit year, month, day
- HHMM: 24-hour time
- {model}: AI model name (e.g., `claude`, `gpt4`, `gemini`)
Examples:
- `plans/code-review-260404-1430-claude.md`
- `plans/security-audit-260403-0920-gpt4.md`
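Filenames in this convention can be generated mechanically from `date`; the report type and model tag below are example values, not fixed choices.

```shell
set -eu
model="claude"                      # example model tag
report_type="code-review"          # example report type
stamp=$(date +%y%m%d-%H%M)         # YYMMDD-HHMM: 2-digit year, 24-hour time
fname="plans/${report_type}-${stamp}-${model}.md"
echo "$fname"
```

Generating the stamp in one place keeps reports sortable by name and avoids hand-typed dates drifting from the convention.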
workflows/
├── README.md (this file)
├── SHARING_AND_SYNC.md (guide for sharing workflows across projects)
├── update-workflows.sh (helper for maintainers to commit/push workflow changes)
├── pull-workflows.sh (helper script for pulling workflow updates)
├── 00-Meta-Workflow/
│ ├── 00-docs/
│ │ ├── CODE-REVIEW-*.md (code review reports)
│ │ ├── implementation-plan-*.md (implementation plans)
│ │ └── old-reviews/ (archived historical reviews)
│ ├── 00-meta/
│ │ ├── README.md (directory index - active vs historical files)
│ │ ├── severity-priority-rubric.md (shared rubric)
│ │ ├── sync-summary-template.md (template)
│ │ ├── parallel-agents-review.md (historical analysis)
│ │ └── filename-review.md (historical analysis)
│ ├── 00-orchestrator/
│ │ ├── README.md (directory index)
│ │ ├── orchestrator-plan-review.md (delegated plan review workflow)
│ │ └── orchestrator-review.sh (shell script for non-interactive reviews)
│ ├── 00-plans/
│ │ └── index.md (active plans index)
│ └── 00-plans-completed/
│ └── index.md (completed plans index)
├── 00-project-setup/
│ ├── README.md (directory index)
│ ├── 01-setup-project.md
│ ├── 02-optimize-workflow-scripts.md
│ ├── 03-sync-workflow-scripts.md
│ ├── 04-track-repos-and-agent-map.md
│ ├── 05-mcp-and-config-setup.md
│ ├── 06-skills-setup.md
│ └── 07-migrate-project-structure.md
├── 01-Planning & Organizing/
│ ├── README.md (directory index)
│ ├── 00-research-and-plan.md
│ ├── 01-plan-review.md
│ └── 02-finalise-plan.md
├── 02-code-build/
│ ├── README.md (directory index)
│ ├── 01-execution.md
│ ├── 02-confirm-execution.md
│ └── 03-execute-and-confirm.md
├── 03-debugging/
│ ├── README.md (directory index)
│ ├── 01-bug-description.md
│ └── 02-bug-fix-workflow.md
├── 04-documentation/
│ ├── README.md (directory index)
│ └── 02-sync-documentation.md
├── 05-review/
│ ├── README.md (directory index)
│ ├── 01-code-review.md
│ ├── 02-code-optimization.md
│ └── 03-code-refactoring.md
├── 06-security/
│ ├── README.md (directory index)
│ ├── 01-security-review.md
│ └── 02-security-fix.md
├── 07-deployment/
│   ├── README.md (deployment guide index with decision tree)
│   └── ... (deployment guides)
└── 08-API-Integration/
    ├── README.md (API integration index)
    └── ... (integration guides)
Note: Each directory now has a README.md with navigation guidance. Files in 00-meta/ are templates, rubrics, and analysis/review documents about the workflows. Files in 00-docs/ are generated reports, archived reviews, and analysis documents.
If you're unsure which workflow to use:
- Starting new work? → Use Planning workflow
- Reviewing something? → Use Review workflows
- Security audit needed? → Use Security Review workflow
- Writing code? → Use Execution workflow
- Fixing a bug? → Use Bug Fix workflow
- Fixing security issues? → Use Security Fix workflow
- Docs out of date? → Use Documentation workflow
- All workflows reference the shared rubric in `00-Meta-Workflow/00-meta/severity-priority-rubric.md` for consistent priority and severity scoring.
- Workflows are designed to be used with AI agents that can execute them
- Each workflow is self-contained but designed to work together
- Priority ordering (P0-P3) is consistent across all workflows
- All workflows produce dated outputs in `plans/` (project root) or update existing files
- Documentation updates go to `docs/` and the changelog (`docs/CHANGELOG.md` preferred)
