Definition of Done

Clear, measurable criteria that must be met before work is considered complete and ready for release.

Overview

The Definition of Done (DoD) is a shared understanding of what "complete" means for the team. It ensures consistent quality, reduces rework, and creates a clear contract between developers, QA, and product owners.

Purpose

DoD prevents the "90% done syndrome" where work appears complete but lacks critical quality elements.


Universal Definition of Done

These criteria apply to all user stories, regardless of type or complexity.

1. Code Quality

  • Code compiles/builds without errors or warnings

    • Zero compiler warnings
    • Zero linting errors
    • Build pipeline passes
  • Code follows team style guidelines

    • Formatted using Spotless (Java) or Prettier (TypeScript)
    • Passes Sonar quality gate
    • No code smells or technical debt introduced
  • Code is peer-reviewed and approved

    • Minimum 1 reviewer for feature PRs
    • Minimum 2 reviewers for release PRs
    • Code owner approval obtained
    • All review comments addressed or discussed
  • No commented-out code or TODOs without tickets

    • Remove dead code
    • Link TODOs to Jira tickets: // TODO(JIRA-1234): Fix this
    • Remove debug logging

2. Testing

  • Unit tests written and passing

    • New code has unit tests
    • Code coverage ≥85% for new code
    • Edge cases covered
    • Error scenarios tested
  • Mutation testing thresholds met

    • Java: Mutation coverage ≥80% (PITest)
    • TypeScript/JavaScript: Mutation coverage ≥75% (Stryker)
    • Weak tests identified and strengthened
    • Equivalent mutants documented
  • Integration tests written and passing

    • API endpoints tested
    • Database interactions verified
    • External service integrations tested (using TestContainers/WireMock)
    • Happy path and error paths covered
    • Contract tests validate API contracts (Pact or OpenAPI)
  • Existing tests still pass

    • No regressions introduced
    • Flaky tests fixed or removed
    • Test suite runs in <10 minutes
    • Mutation tests included in CI pipeline
  • Manual testing completed

    • Developer has tested locally
    • Happy path verified
    • Error handling verified
    • Edge cases checked
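
The "edge cases covered" and "error scenarios tested" bullets can be made concrete with a small sketch. The applyDiscount function and its rules here are hypothetical; in a real codebase these checks would live in Jest or Vitest test cases rather than bare assertions:

```typescript
// Hypothetical function under test: applies a percentage discount to a price.
function applyDiscount(price: number, percent: number): number {
  if (!Number.isFinite(price) || price < 0) throw new RangeError("price must be >= 0");
  if (percent < 0 || percent > 100) throw new RangeError("percent must be 0..100");
  return Math.round(price * (1 - percent / 100) * 100) / 100;
}

// Happy path
if (applyDiscount(100, 20) !== 80) throw new Error("happy path failed");
// Edge cases: boundaries of the valid input range
if (applyDiscount(100, 0) !== 100) throw new Error("0% boundary failed");
if (applyDiscount(100, 100) !== 0) throw new Error("100% boundary failed");
// Error scenario: invalid input must throw, not return garbage
let threw = false;
try { applyDiscount(-1, 10); } catch { threw = true; }
if (!threw) throw new Error("negative price should throw");
```

A mutation-testing tool such as Stryker then verifies that assertions like these actually fail when the implementation is mutated.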

3. Documentation

  • Code is self-documenting

    • Clear variable and function names
    • Complex logic has explanatory comments
    • Public APIs have JavaDoc/TSDoc
    • Purpose and "why" are clear
  • Technical documentation updated

    • README updated if setup changes
    • Architecture diagrams updated if design changes
    • OpenAPI/Swagger spec updated for API changes
    • Postman collection updated
  • User-facing documentation updated

    • User guide updated for new features
    • Release notes drafted
    • Known issues documented
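
As one illustration of "public APIs have JavaDoc/TSDoc" with the "why" made clear, here is a hypothetical helper with a TSDoc comment (the function and its rationale are examples, not a mandated convention):

```typescript
/**
 * Converts an amount in minor units (cents) to a display string.
 *
 * Why: money is stored as integer cents to avoid floating-point
 * rounding errors; formatting happens only at the presentation edge.
 *
 * @param cents - amount in minor units, e.g. 1999 for 19.99
 * @returns the formatted amount, e.g. "19.99"
 */
function formatCents(cents: number): string {
  const sign = cents < 0 ? "-" : "";
  const abs = Math.abs(cents);
  return `${sign}${Math.floor(abs / 100)}.${String(abs % 100).padStart(2, "0")}`;
}
```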

4. Security

  • Security scan passed

    • SAST (Static Analysis) scan clean
    • Dependency vulnerability scan passed
    • No critical or high-severity issues
  • Secure coding practices followed

    • Input validation on all external inputs
    • No hardcoded secrets or credentials
    • Sensitive data not logged
    • Authorization checks in place
    • OWASP Top 10 considerations addressed
  • Security review completed (if applicable)

    • Authentication changes reviewed by security team
    • Payment processing reviewed by security team
    • PII handling reviewed by security team
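
The "sensitive data not logged" rule is commonly enforced with a small redaction helper applied before anything reaches the logger. This is an illustrative sketch; the field names are assumptions, not a mandated list:

```typescript
// Fields that must never appear in log output (illustrative list).
const SENSITIVE_FIELDS = new Set(["password", "cardNumber", "ssn", "apiKey"]);

// Returns a copy of the object with sensitive values masked.
function redactForLogging(data: Record<string, unknown>): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(data)) {
    out[key] = SENSITIVE_FIELDS.has(key) ? "***REDACTED***" : value;
  }
  return out;
}

const safe = redactForLogging({ userId: "u-42", cardNumber: "4111111111111111" });
// safe.cardNumber is masked; safe.userId passes through unchanged.
```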

5. Performance

  • Performance requirements met

    • API response times <500ms (P95)
    • Page load times <2s
    • No N+1 query problems
    • Appropriate indexing for database queries
  • Performance testing completed (if applicable)

    • Load testing for high-traffic endpoints
    • Stress testing for critical paths
    • Memory profiling for resource-intensive operations

6. Deployment

  • Deployable to all environments

    • Works in dev, test, staging environments
    • Configuration externalized (not hardcoded)
    • Environment-specific settings in config files
    • Feature flags implemented if gradual rollout needed
  • Database migrations tested

    • Migration scripts run successfully
    • Migrations are reversible
    • Zero-downtime deployment verified
    • Data integrity maintained
  • Rollback plan documented

    • Steps to rollback clearly defined
    • Rollback tested in lower environment
    • Data migration rollback considered
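
"Configuration externalized (not hardcoded)" typically means every environment-specific value is read from the environment with an explicit default. A minimal sketch (the setting names are illustrative; in a real service the map would be process.env):

```typescript
// Looks up a setting in an environment map, falling back to a default.
function getConfig(env: Record<string, string | undefined>, name: string, fallback: string): string {
  const value = env[name];
  return value !== undefined && value !== "" ? value : fallback;
}

// Example: environment-specific settings, never hardcoded in code paths.
const env = { PAYMENT_SERVICE_URL: "https://staging.example.com", FEATURE_REALTIME_STATUS: "true" };
const paymentServiceUrl = getConfig(env, "PAYMENT_SERVICE_URL", "http://localhost:8080");
// A simple boolean feature flag, enabling the gradual rollout mentioned above.
const realtimeStatusEnabled = getConfig(env, "FEATURE_REALTIME_STATUS", "false") === "true";
```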

7. Acceptance Criteria

  • All acceptance criteria met

    • Every AC in the user story satisfied
    • Product Owner can verify functionality
    • Demo-ready
  • Edge cases handled

    • Boundary conditions tested
    • Error states have proper UX
    • Empty states designed
    • Loading states implemented

Feature-Specific Definition of Done

Additional criteria based on the type of work being done.

Backend API Development

  • OpenAPI specification complete

    • OpenAPI 3.x specification created/updated
    • All endpoints documented with descriptions
    • Request/response schemas defined with validation rules
    • Error responses documented (RFC 7807 Problem Details)
    • Examples provided for all operations
    • Rate limiting and authentication documented
    • Tags and operation IDs properly set
  • API contract validation

    • OpenAPI spec validated (swagger-cli validate)
    • Backend validates requests/responses against OpenAPI spec
    • Frontend generates types from OpenAPI spec
    • Contract breaking changes detected by CI pipeline
    • API versioning strategy followed
  • Contract tests implemented

    • Provider-side: API validates against OpenAPI spec (RestAssured + OpenAPI validator)
    • Consumer-side: Generated types used in frontend
    • Pact consumer-driven contracts (if using Pact)
    • Contract tests run in CI pipeline
    • Breaking changes flagged before merge
  • API versioning considered

    • Backwards compatibility maintained OR
    • Version incremented appropriately (semantic versioning)
    • Deprecation warnings added to OpenAPI spec
    • Deprecated endpoints marked with deprecated: true
    • Migration guide provided for breaking changes
  • Logging and monitoring

    • Appropriate log levels used
    • Structured logging implemented (JSON format)
    • Correlation IDs tracked across services
    • Metrics/counters added (request count, latency, errors)
    • Alerts configured for error rates and latency
    • OpenAPI spec published to API documentation portal
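
The RFC 7807 Problem Details format referenced above is a small, fixed JSON shape served with Content-Type application/problem+json. A sketch of the shape and a builder (the type URI is a placeholder, not a real endpoint):

```typescript
// RFC 7807 Problem Details for HTTP APIs.
interface ProblemDetails {
  type: string;      // URI identifying the problem type
  title: string;     // short, human-readable summary
  status: number;    // HTTP status code
  detail?: string;   // instance-specific explanation
  instance?: string; // URI of the specific occurrence
}

function problem(status: number, title: string, detail?: string, instance?: string): ProblemDetails {
  return { type: "https://example.com/problems/" + status, title, status, detail, instance };
}

const notFound = problem(404, "Payment not found", "No payment with id 123", "/api/v1/payments/123");
```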

Frontend Development

  • Cross-browser testing

    • Tested in Chrome, Firefox, Safari
    • Tested on mobile viewports
    • Responsive design verified
  • Accessibility (a11y)

    • Keyboard navigation works
    • Screen reader compatible
    • Adequate color contrast
    • Alt text for images
    • ARIA labels where appropriate
  • UX/UI review

    • Design approved by designer
    • Matches mockups/wireframes
    • Loading states implemented
    • Error messages are user-friendly
  • Frontend performance

    • Lighthouse score >90
    • Bundle size monitored
    • Lazy loading implemented where appropriate
    • Images optimized

Database Changes

  • Migration scripts reviewed

    • Peer-reviewed by another developer
    • Tested on production-like data volume
    • Backward compatible if possible
  • Database performance verified

    • Query execution plans reviewed
    • Indexes added where necessary
    • No table locks during migration
    • Migration time <5 minutes
  • Data integrity maintained

    • Foreign key constraints respected
    • No orphaned data
    • Data validation rules enforced
    • Audit trail maintained

Mobile Development

  • Platform testing

    • iOS testing on minimum supported version
    • Android testing on minimum supported version
    • Tablet layouts verified
  • App store guidelines

    • Follows Apple App Store guidelines
    • Follows Google Play Store guidelines
    • Privacy policy updated if needed
    • Permissions justified and documented
  • Offline support

    • Graceful degradation when offline
    • Data synchronization on reconnection
    • Offline state clearly communicated

High-Compliance System-Specific DoD

Additional criteria for highly regulated systems or those handling sensitive/critical data.

Critical Transaction Handling

  • Transaction integrity verified

    • ACID properties maintained where required
    • Idempotency implemented for critical operations
    • No possibility of duplicate processing
    • Reconciliation possible
  • Audit logging complete

    • All critical operations logged
    • User actions tracked
    • Immutable audit trail
    • Compliance with audit requirements
  • Abuse/Fraud detection considered

    • Anomaly detection monitoring
    • Rate limiting on sensitive operations
    • Appropriate access controls implemented
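
"Idempotency implemented for critical operations" usually means keying each request and replaying (rather than re-running) duplicates. A minimal in-memory sketch; a production system would back this store with a database or cache with TTLs:

```typescript
// Remembers results of already-processed requests, keyed by an
// Idempotency-Key supplied by the client.
class IdempotencyGuard<T> {
  private results = new Map<string, T>();

  // Runs `operation` once per key; repeated calls with the same key
  // return the stored result instead of processing again.
  execute(key: string, operation: () => T): T {
    const existing = this.results.get(key);
    if (existing !== undefined) return existing; // duplicate: replay, don't re-run
    const result = operation();
    this.results.set(key, result);
    return result;
  }
}

const guard = new IdempotencyGuard<string>();
let charges = 0;
const first = guard.execute("key-1", () => { charges++; return "charged"; });
const second = guard.execute("key-1", () => { charges++; return "charged"; });
// charges stays at 1: the payment was processed exactly once.
if (first !== second) throw new Error("replay mismatch");
```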

Compliance and Regulatory

  • Regulatory compliance verified

    • Industry-specific requirements met (where applicable)
    • Data privacy compliance (GDPR, CCPA, etc.)
    • Security controls followed (SOC2, ISO 27001, etc.)
    • Data retention policies respected
  • Data protection

    • Sensitive data encrypted at rest
    • Sensitive data encrypted in transit
    • Personal data handling compliant
    • Data masking in logs

High Availability

  • Resilience patterns implemented

    • Retry logic with exponential backoff
    • Circuit breakers for external dependencies
    • Timeout configurations appropriate
    • Graceful degradation
  • Disaster recovery tested

    • Backup and restore procedures verified
    • Failover tested
    • RTO/RPO requirements met
    • Data replication working
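
"Retry logic with exponential backoff" reduces to a pure delay schedule plus a bounded loop. The sketch below shows the common capped-doubling pattern; the base and cap values are illustrative, and a real implementation would be asynchronous and actually sleep between attempts:

```typescript
// Delay before retry `attempt` (0-based): base * 2^attempt, capped.
function backoffDelayMs(attempt: number, baseMs = 100, capMs = 5000): number {
  return Math.min(capMs, baseMs * 2 ** attempt);
}

// Retries a synchronous operation up to `maxAttempts` times.
function withRetry<T>(operation: () => T, maxAttempts = 3): T {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return operation();
    } catch (err) {
      lastError = err; // a real implementation would wait backoffDelayMs(attempt) here
    }
  }
  throw lastError;
}

let calls = 0;
const result = withRetry(() => {
  calls++;
  if (calls < 3) throw new Error("transient failure");
  return "ok";
});
// Succeeds on the third attempt; the waits would have been 100ms, then 200ms.
```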

Definition of Done Checklist

Quick Checklist for Developers

## Development Complete

- [ ] Feature implemented per acceptance criteria
- [ ] Code follows style guidelines (automated)
- [ ] Self-reviewed the code
- [ ] Commented complex logic

## Testing Complete

- [ ] Unit tests added (≥85% coverage)
- [ ] Mutation tests pass (≥80% Java, ≥75% TS/JS)
- [ ] Integration tests added
- [ ] Contract tests added (if API changes)
- [ ] Manual testing completed
- [ ] Edge cases tested

## Quality Checks Passed

- [ ] Build passes (automated)
- [ ] Linting passes (automated)
- [ ] Sonar quality gate passed (automated)
- [ ] Security scan passed (automated)

## Documentation Updated

- [ ] Code comments added
- [ ] README updated (if needed)
- [ ] OpenAPI spec updated (if API changed)
- [ ] API docs published (if API changed)
- [ ] Release notes drafted

## Review and Approval

- [ ] PR created with comprehensive description
- [ ] Evidence provided (screenshots, logs, test results)
- [ ] Peer review completed (1-2 reviewers)
- [ ] All comments addressed

## Deployment Ready

- [ ] Tested in dev/test environments
- [ ] Database migrations tested
- [ ] Rollback plan documented
- [ ] Feature flags configured (if needed)
- [ ] Monitoring/alerts configured

## Acceptance

- [ ] All acceptance criteria met
- [ ] Demo completed (if user-facing)
- [ ] Product Owner accepts

When DoD is Not Met

Options

  1. Complete the work - Extend the sprint if minor items remain
  2. Create follow-up tickets - Only for non-critical items (documentation improvements, minor refactoring)
  3. Move to next sprint - If significant work remains
  4. Reject the story - If fundamental acceptance criteria are not met

Never Skip DoD Items

Do not skip DoD items to "make the sprint". Technical debt compounds quickly.

Exceptions

Rare exceptions may be made for:

  • Production incidents requiring hotfixes
  • Critical business deadlines
  • Regulatory deadlines

Process for exceptions:

  1. Document what DoD items are skipped
  2. Create tickets to complete missing items
  3. Get explicit approval from Tech Lead and Product Owner
  4. Complete missing items in next sprint

Evolving the Definition of Done

When to Update DoD

  • New quality issues emerge repeatedly
  • Team adopts new tools or practices
  • Regulatory requirements change
  • Incident retrospectives identify gaps

How to Update DoD

  1. Discuss in team retrospective
  2. Propose changes to the team
  3. Reach consensus
  4. Update this document
  5. Communicate changes clearly
  6. Review effectiveness in 2-3 sprints

DoD Review Cadence

  • Quarterly: Review comprehensiveness
  • After incidents: Add preventive measures
  • New regulations: Update compliance requirements
  • Technology changes: Update tool-specific items

Comparison: Definition of Ready vs. Definition of Done

| Definition of Ready (DoR) | Definition of Done (DoD) |
| --- | --- |
| Applies before development starts | Applies after development completes |
| Ensures the story is ready to be worked on | Ensures the work is ready to be released |
| Criteria for the Product Owner | Criteria for the Development Team |
| User story is clear, estimated, testable | Feature is coded, tested, documented, deployable |
| Prevents starting poorly defined work | Prevents releasing incomplete work |

Both are essential for predictable, high-quality delivery.


Examples

Example 1: Simple Bug Fix

User Story: Fix incorrect tax calculation on payment summary page

DoD Checklist:

  • ✅ Bug fixed in TaxCalculator.java
  • ✅ Unit test added: TaxCalculatorTest.shouldCalculateTaxCorrectly()
  • ✅ Integration test added: PaymentIntegrationTest.shouldShowCorrectTax()
  • ✅ Manual testing: Verified on dev environment with test data
  • ✅ Code review approved by @teammate
  • ✅ No linting errors
  • ✅ Sonar quality gate passed
  • ✅ Security scan passed
  • ✅ Regression testing: Other tax calculations still work
  • ✅ Documentation: Updated tax calculation logic comments
  • ✅ Audit logging: Tax calculation logged appropriately
  • ✅ All acceptance criteria met: Tax now calculates correctly for all tested scenarios

Example 2: New API Endpoint

User Story: As an API consumer, I want to retrieve payment history so that I can display transaction history to users

DoD Checklist:

  • ✅ API endpoint implemented: GET /api/v1/payments/history
  • ✅ OpenAPI 3.x spec updated with complete endpoint documentation

    • Request parameters documented (page, size, status filter)
    • Response schema defined with validation rules
    • Error responses documented (400, 401, 403, 404, 500)
    • Example requests and responses provided
    • Authentication requirements documented
  • ✅ OpenAPI spec validation: swagger-cli validate openapi.yaml passes
  • ✅ Request validation: Query parameters validated using Jakarta Bean Validation
  • ✅ Response model: PaymentHistoryResponse created and documented
  • ✅ Service layer: PaymentHistoryService implemented
  • ✅ Repository layer: Pagination and filtering implemented with JPA Specification
  • ✅ Unit tests: PaymentHistoryServiceTest (code coverage: 94%, mutation coverage: 83%)
  • ✅ Integration tests: PaymentHistoryIntegrationTest with TestContainers
  • ✅ Contract tests:

    • Backend validates against OpenAPI spec (RestAssured + OpenAPI validator)
    • Frontend generates TypeScript types from OpenAPI spec
    • Pact consumer contract defined and verified
  • ✅ Performance testing: Load tested with 1000 concurrent users (P95: 320ms)
  • ✅ Security: Authorization checks ensure users only see their own payments
  • ✅ Logging: Structured JSON logs with correlation IDs
  • ✅ Metrics: Micrometer counters for request count, latency, errors
  • ✅ Error handling: RFC 7807 Problem Details for all error responses
  • ✅ Code review: Approved by 2 reviewers (backend lead + team member)
  • ✅ Documentation:

    • OpenAPI spec published to API docs portal
    • Postman collection generated from OpenAPI spec
    • Migration guide created (no breaking changes)
  • ✅ Demo: Demonstrated to Product Owner with Swagger UI

Example 3: UI Feature

User Story: As a user, I want to see payment status updates in real-time so that I know when my payment completes

DoD Checklist:

  • ✅ WebSocket connection implemented
  • ✅ Payment status updates pushed to client
  • ✅ UI updates without page refresh
  • ✅ Loading states shown during payment processing
  • ✅ Success/failure states clearly communicated
  • ✅ Unit tests: React component tests with React Testing Library
  • ✅ Integration tests: WebSocket connection tested
  • ✅ Cross-browser testing: Chrome, Firefox, Safari, Edge
  • ✅ Mobile responsive: Tested on iOS and Android
  • ✅ Accessibility: Keyboard navigation works, screen reader compatible
  • ✅ UX review: Designer approved final implementation
  • ✅ Error handling: Graceful degradation if WebSocket fails
  • ✅ Code review: Approved by frontend team lead
  • ✅ Lighthouse score: 92 (performance, accessibility, best practices)
  • ✅ Documentation: User guide updated with new feature


Summary

Key Takeaways:

  1. DoD ensures consistent quality across all team deliverables
  2. All items must be complete before work is considered done
  3. Automated checks reduce the manual verification burden
  4. Mutation testing (≥80% Java, ≥75% TS/JS) ensures test quality
  5. OpenAPI specifications required for all API changes (schema-first development)
  6. Contract testing validates API contracts between frontend and backend
  7. High-compliance criteria address regulatory and audit needs
  8. DoD is a living document that evolves with the team
  9. Never skip DoD items to hit deadlines - technical debt compounds
  10. Use the checklist for every user story to ensure nothing is missed

Critical Systems

For business-critical or highly regulated systems, DoD is essential for compliance, audit requirements, and maintaining trust. Never compromise on DoD items related to:

  • Data integrity
  • Audit logging
  • Security scans
  • Data protection
  • Testing