Sprint Review (Sprint Demo)

Sprint Review demonstrates completed work to stakeholders, gathers feedback, and validates that delivered features meet expectations.

Overview

Sprint Review is a collaborative ceremony where the team demonstrates completed stories, collects stakeholder feedback, and validates alignment with business goals. The ceremony serves as a critical feedback loop between engineering teams and business stakeholders, preventing the common failure mode of teams building technically sound solutions that miss business needs.

Sprint reviews embody the agile principle of "working software over comprehensive documentation." Rather than status reports or PowerPoint presentations, reviews show actual functioning features. This creates an empirical basis for feedback - stakeholders see and interact with the software, discovering issues that would never surface through specification documents alone.

The ceremony also creates accountability. Public demonstration of sprint outcomes pressures teams to meet Definition of Done and deliver production-ready features. Teams cannot hide technical debt or quality shortcuts when demonstrating to stakeholders.

Demo Success

A successful demo shows working software solving real business problems. Focus on the user experience and business value, not technical implementation details. If you're explaining architecture diagrams or showing code during a demo, you've lost the narrative. Stakeholders care about capabilities and outcomes, not implementation mechanisms.


Core Principles

  • Working Software: Demo actual working features, not slides or mockups
  • User-Focused: Show features from user perspective
  • Stakeholder Engagement: Collect actionable feedback
  • Transparency: Show what was completed, not what's in progress
  • Evidence-Based: Provide test results, metrics, and proof of quality

Sprint Review Structure

Duration

  • 2-week sprint: 1 hour
  • 1-week sprint: 30 minutes

Participants

  • Development Team (required)
  • Product Owner (required)
  • Stakeholders (business, management, other teams)
  • Scrum Master (facilitator)

Agenda (the parts below total 70 minutes; trim Parts 2 and 4 to fit a one-hour review)


Part 1: Sprint Overview (5 minutes)

Sprint Context

Product Owner Presents:

  • Sprint goal and how it was achieved
  • Stories planned vs completed
  • Any scope changes during sprint
  • Sprint metrics (velocity, burndown)

Example:

Sprint Goal: "Enable customers to schedule future payments"

Planned: 25 story points across 8 stories
Completed: 23 story points across 7 stories
Incomplete: 1 story (2 points) moved to next sprint due to dependency

Velocity: 23 points (target: 25)
Burndown: Consistent progress throughout sprint
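The overview numbers above can be computed rather than assembled by hand. A minimal sketch (the `Story` dataclass and story keys are hypothetical, not part of any tracker's API) that derives planned points, completed points (the sprint's velocity), and carry-over from a story list:

```python
from dataclasses import dataclass

@dataclass
class Story:
    key: str      # hypothetical story identifier
    points: int
    done: bool

def sprint_summary(stories, target_points):
    """Summarize planned vs. completed points for the review overview."""
    planned = sum(s.points for s in stories)
    completed = sum(s.points for s in stories if s.done)          # velocity
    carried_over = [s.key for s in stories if not s.done]
    return {
        "planned": planned,
        "completed": completed,
        "target": target_points,
        "carried_over": carried_over,
    }

# Mirrors the example above: 8 stories, 25 points planned, one 2-point story incomplete.
stories = [Story(f"PAY-{i}", p, done) for i, (p, done) in enumerate(
    [(3, True), (5, True), (3, True), (2, True),
     (5, True), (2, True), (3, True), (2, False)], 1)]
print(sprint_summary(stories, target_points=25))
```

Running this reproduces the figures in the example: 25 points planned, 23 completed, one story carried over.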

Part 2: Demo Completed Stories (35 minutes)

Demo Preparation Checklist

Before Sprint Review:

  • Demo environment stable and tested
  • Test data loaded (realistic scenarios)
  • Demo script prepared
  • Screenshots/videos as backup
  • Known issues documented
  • Feature toggles enabled for demo features
  • Network/infrastructure verified

Demo Guidelines

What to Demo:

  • Completed stories (Definition of Done met)
  • Working features in demo environment
  • End-to-end user workflows
  • Integration with other systems
  • Mobile and web versions (if applicable)

What NOT to Demo:

  • Work in progress
  • Code or technical implementation
  • Unit tests or technical details
  • Stories not meeting Definition of Done

Demo Format

For Each Story:

  1. Business Context (30 seconds)

    • User story recap
    • Business value
    • User persona
  2. Demonstration (3-5 minutes)

    • Show actual working feature
    • Follow realistic user workflow
    • Highlight key functionality
    • Show error handling
  3. Evidence (1 minute)

    • Test coverage metrics
    • Performance metrics
    • Security scan results
    • Accessibility compliance
  4. Q&A (1-2 minutes)

    • Stakeholder questions
    • Clarifications
    • Feedback capture
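At the upper bound, the four-step format costs 8.5 minutes per story, so it is worth checking how many stories fit the 35-minute demo window before the review. A quick sketch of that arithmetic (the step times are the upper bounds from the format above):

```python
# Upper-bound minutes per step, taken from the demo format above.
STEP_MINUTES = {"context": 0.5, "demo": 5, "evidence": 1, "qna": 2}

def demo_fits(num_stories, window_minutes=35):
    """Return (total worst-case minutes, whether it fits the window)."""
    per_story = sum(STEP_MINUTES.values())   # 8.5 min worst case
    total = num_stories * per_story
    return total, total <= window_minutes

print(demo_fits(4))   # 34.0 minutes: four stories fit at worst case
print(demo_fits(7))   # 59.5 minutes: seven stories need tighter Q&A or batching
```

Four stories fit even at the worst case; seven do not, which is a cue to shorten Q&A or group related stories into one walkthrough.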

Demo Script Example

Story: Schedule Future Payment

1. Business Context (30 sec):
"As a customer, I want to schedule payments for a future date so I can
plan my finances in advance. This feature was requested by 45% of
customers in our last survey."

2. Demonstration (4 min):
[Screen share]
- Log in as customer
- Navigate to "Make Payment"
- Enter payment details
- Select "Schedule for later"
- Choose future date from calendar
- Review scheduled payment summary
- Submit and receive confirmation
- Show scheduled payment in payment history
- Demo email confirmation received

3. Evidence (1 min):
- Unit test coverage: 95%
- Integration tests: 12 passing
- Manual testing: All scenarios passed
- Performance: Payment scheduling <500ms
- Accessibility: WCAG 2.1 AA compliant

4. Q&A (2 min):
- Answer stakeholder questions
- Capture feedback

Part 3: Evidence Collection (10 minutes)

Required Evidence

Evidence distinguishes professional software delivery from amateur demos. Stakeholders trust teams that provide concrete proof of quality, not just working features. Evidence also protects the team - when stakeholders question quality or completeness, documented evidence provides objective answers.

The evidence presented should be proportional to risk. High-stakes features (payment processing, security, data privacy) require comprehensive evidence. Low-risk features (UI tweaks, content updates) need less. Teams must calibrate evidence collection to avoid bureaucracy while maintaining necessary rigor.

Testing Evidence:

✓ Unit Tests: 95% coverage (target: >85%)
✓ Integration Tests: All passing (18/18)
✓ Contract Tests: Provider contract validated
✓ E2E Tests: Critical paths passing (5/5)
✓ Mutation Tests: 82% mutation coverage (target: >80%)
✓ Manual Testing: All acceptance criteria verified

Quality Evidence:

✓ Code Review: All PRs approved by 2+ reviewers
✓ Static Analysis: No critical issues (SonarQube)
✓ Security Scan: No high/critical vulnerabilities
✓ Performance: All endpoints <500ms (p95)
✓ Accessibility: WCAG 2.1 AA compliance

Compliance Evidence:

✓ Audit Logging: All actions logged with user context and timestamp
✓ Data Encryption: Sensitive data encrypted at rest and in transit
✓ Data Masking: Sensitive information properly masked in logs and UI
✓ Error Handling: No sensitive data exposed in error messages
✓ Rate Limiting: Implemented and tested to prevent abuse

These compliance checks ensure the software meets regulatory and security requirements beyond functional correctness. In regulated domains, features aren't "Done" until compliance evidence exists - functional correctness is necessary but insufficient.
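One way to operationalize risk-proportional evidence is a gate that takes the checks actually collected and a required set chosen for the feature's risk level. A minimal sketch (the check names and the `evidence_gate` helper are illustrative, not an existing tool):

```python
def evidence_gate(checks, required):
    """Return (ok, missing): ok only if every required check was collected and passed."""
    missing = [name for name in required if not checks.get(name, False)]
    return (not missing, missing)

# Hypothetical evidence for a high-stakes payment feature.
checks = {
    "unit_tests": True,
    "integration_tests": True,
    "security_scan": True,
    "accessibility": False,   # audit not yet run
}
ok, missing = evidence_gate(
    checks, required=["unit_tests", "security_scan", "accessibility"])
print(ok, missing)   # accessibility audit is still outstanding
```

A low-risk UI tweak would pass the same function with a much smaller `required` list, which is exactly the calibration the paragraph above describes.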

Evidence Presentation

Dashboard/Report:

  • GitLab pipeline results (all green)
  • Test coverage trends
  • Performance metrics
  • Security scan results
  • Accessibility audit

Screenshots:

  • Passing test suites
  • Code coverage reports
  • Performance metrics
  • Security scan results

Part 4: Stakeholder Feedback (15 minutes)

Collecting Feedback

Open Discussion:

  • What works well?
  • What could be improved?
  • Missing functionality?
  • Usability concerns?
  • Performance feedback?

Capture Systematically:

| Feedback | Type | Priority | Action |
| --- | --- | --- | --- |
| "Can we add bulk scheduling?" | Enhancement | High | Create story for backlog |
| "Loading time feels slow" | Issue | Medium | Investigate performance |
| "Confirmation email unclear" | Improvement | Low | Refine email template |

Feedback Categories:

  • Bugs: Issues with current functionality
  • Enhancements: New features or improvements
  • Questions: Clarifications needed
  • Concerns: Risks or issues to address
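Capturing feedback systematically is easier with a small structured record than with free-form notes. A sketch under the categories above (the `FeedbackItem` dataclass and field names are hypothetical, not any tracker's schema):

```python
from dataclasses import dataclass

VALID_TYPES = {"bug", "enhancement", "question", "concern"}

@dataclass
class FeedbackItem:
    text: str
    type: str          # one of VALID_TYPES
    priority: str      # "high" | "medium" | "low"
    action: str = ""   # follow-up, e.g. "create backlog story"

    def __post_init__(self):
        if self.type not in VALID_TYPES:
            raise ValueError(f"unknown feedback type: {self.type}")

def backlog_candidates(items):
    """Bugs and enhancements become backlog items; questions/concerns are tracked separately."""
    return [i for i in items if i.type in {"bug", "enhancement"}]

items = [
    FeedbackItem("Can we add bulk scheduling?", "enhancement", "high",
                 "create backlog story"),
    FeedbackItem("Loading time feels slow", "bug", "medium",
                 "investigate performance"),
    FeedbackItem("Why is the cutoff 30 days?", "question", "low"),
]
print([i.text for i in backlog_candidates(items)])
```

The validation in `__post_init__` forces every item into one of the four categories, so nothing lands in an untracked "misc" pile.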

Acceptance Decisions

For Each Story:

  • Product Owner accepts or rejects
  • If rejected: document reasons and next steps
  • If accepted: move to Done column

Acceptance Criteria:

  • All Definition of Done items met
  • Stakeholder feedback addressed or backlogged
  • No critical bugs or issues
  • Quality evidence provided
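The acceptance criteria above reduce to a simple decision rule, which also documents the rejection reasons the Product Owner must record. A sketch (the function and its parameters are illustrative, not part of any tool):

```python
def acceptance_decision(dod_met, critical_bugs, evidence_provided, feedback_resolved):
    """Accept/reject a story against the criteria above; return status and reasons."""
    reasons = []
    if not dod_met:
        reasons.append("Definition of Done not met")
    if critical_bugs > 0:
        reasons.append(f"{critical_bugs} critical bug(s) open")
    if not evidence_provided:
        reasons.append("quality evidence missing")
    if not feedback_resolved:
        reasons.append("stakeholder feedback not addressed or backlogged")
    return ("accepted", []) if not reasons else ("rejected", reasons)

print(acceptance_decision(True, 0, True, True))    # accepted, no reasons
print(acceptance_decision(True, 2, True, True))    # rejected: open critical bugs
```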

Part 5: Next Sprint Preview (5 minutes)

Product Owner Shares:

  • Upcoming sprint goal
  • Top priority stories for next sprint
  • Major initiatives or changes
  • Stakeholder involvement needed

Example:

Next Sprint Goal: "Improve payment error handling and user feedback"

Top Priorities:
1. Better error messages for payment failures
2. Real-time payment status updates
3. Retry mechanism for failed payments

Stakeholder Input Needed:
- User research on error message clarity
- Business rules for retry attempts

Remote Demo Best Practices

Technical Setup

Video Conferencing:

  • Test screen sharing beforehand
  • Use high-quality screen resolution
  • Close unnecessary applications
  • Mute notifications

Demo Environment:

  • Stable internet connection
  • Dedicated demo environment (not local)
  • Backup recordings/screenshots
  • Test demo flow beforehand

Engagement Techniques

Interactive Elements:

  • Polls for feedback ("Does this solve your problem?")
  • Q&A in chat (monitor continuously)
  • Reaction emojis for quick feedback
  • Breakout rooms for detailed discussions

Recording:

  • Record demo for absent stakeholders
  • Share recording after meeting
  • Create highlight clips for key features

Common Anti-Patterns

Demoing Incomplete Work

Problem: Showing work in progress that doesn't meet the Definition of Done
Solution: Only demo truly complete stories; be transparent about incomplete work

Technical Deep Dive

Problem: Explaining code, architecture, or database schema during the demo
Solution: Focus on user experience and business value; defer technical discussions

No Actual Demo

Problem: A PowerPoint presentation instead of working software
Solution: Always show actual working features in the demo environment

Ignoring Feedback

Problem: Collecting feedback but never acting on it
Solution: Document feedback, prioritize with the Product Owner, create backlog items

Cherry-Picking Success

Problem: Only showing perfect scenarios and hiding problems
Solution: Be transparent about issues; demonstrate error handling


Sprint Review Checklist

Before Sprint Review (1 day before)

  • Demo environment stable and tested
  • Test data loaded with realistic scenarios
  • Demo script prepared for each story
  • Evidence collected (test reports, metrics)
  • Screenshots/videos as backup
  • Stakeholders invited and confirmed
  • Known issues documented
  • Dry run completed

During Sprint Review

  • Sprint overview presented
  • All completed stories demoed
  • Evidence shown for each story
  • Stakeholder feedback captured
  • Product Owner accepts/rejects stories
  • Next sprint preview shared
  • Action items documented

After Sprint Review

  • Feedback items added to backlog
  • Accepted stories moved to Done
  • Rejected stories updated with feedback
  • Recording shared with absent stakeholders
  • Action items assigned
  • Stakeholders thanked for participation

Evidence Template

Story Evidence Report

## Story: Schedule Future Payments (PAYMENT-125)

### Completion Status
✓ All acceptance criteria met
✓ Definition of Done satisfied
✓ Product Owner reviewed

### Testing Evidence
- Unit Tests: 95% coverage (45/47 tests passing)
- Integration Tests: 12/12 passing
- E2E Tests: 5/5 critical paths passing
- Manual Testing: All scenarios verified

### Quality Metrics
- Code Review: 2 approvals received
- SonarQube: Quality Gate passed (A rating)
- Security Scan: 0 high/critical vulnerabilities
- Performance: Avg response time 245ms (p95: 380ms)

### Compliance Evidence
- Audit logging: All payment schedules logged
- Data encryption: Sensitive data encrypted at rest
- Error handling: No sensitive data in error messages
- Rate limiting: 100 req/min per user

### Demo Notes
- Demoed on 2025-01-28 Sprint Review
- Stakeholder feedback: Positive, requested bulk scheduling
- Product Owner: Accepted
- Known Issues: None
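Teams that produce this report every sprint usually generate it rather than retype it. A minimal renderer sketch (the function name, signature, and section layout are assumptions for illustration, not an existing tool):

```python
def render_evidence_report(story_key, title, sections):
    """Render the evidence template as markdown; sections maps heading -> list of lines."""
    lines = [f"## Story: {title} ({story_key})"]
    for heading, entries in sections.items():
        lines.append(f"\n### {heading}")
        lines.extend(f"- {entry}" for entry in entries)
    return "\n".join(lines)

report = render_evidence_report(
    "PAYMENT-125", "Schedule Future Payments",
    {
        "Testing Evidence": [
            "Unit Tests: 95% coverage",
            "Integration Tests: 12/12 passing",
        ],
        "Demo Notes": ["Product Owner: Accepted"],
    },
)
print(report)
```

Feeding it pipeline metrics (coverage, scan results) keeps the report in sync with the evidence actually collected.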

Summary

Key Takeaways:

  1. Working Software: Demo actual working features in demo environment
  2. User-Focused: Show features from user perspective, not technical details
  3. Evidence Required: Provide test results, metrics, and proof of quality
  4. Stakeholder Feedback: Actively collect and document actionable feedback
  5. Acceptance Decisions: Product Owner explicitly accepts or rejects each story
  6. Preparation Critical: Demo environment tested, script prepared, evidence collected
  7. Transparency: Show what's complete, be honest about incomplete work
  8. Action Items: Convert feedback into backlog items with priorities