
Definition of Done

A set of criteria that must be met before a product or feature is considered complete and ready for delivery.

Relevant metrics: Quality of deliverables, Time to completion, Cost of development, User satisfaction, and Number of defects


What is a Definition of Done?

A Definition of Done (DoD) is a shared agreement that states the exact conditions a product increment must meet to be considered complete. The Scrum Guide (2020) lists the DoD as a commitment for the Increment, making it the quality bar that every backlog item must reach before it can be released or demonstrated.

The term is often attributed to Alistair Cockburn's 2001 book Agile Software Development. The DoD ensures that all stakeholders share an understanding of what is expected from a user story and that all tasks associated with it have been completed. It also helps guarantee that the work is of high quality and meets the customer's requirements.

Definition of Done vs Acceptance Criteria

  • Acceptance criteria describe feature‐specific behaviour (e.g., “When the user enters an invalid email, show an error message”).
  • DoD describes global quality requirements (e.g., “Code reviewed, unit tests passing, documentation updated”).

Both are needed; the DoD ensures consistency, while acceptance criteria confirm functional intent.

Why your team needs a shared Definition of Done

When teams hold different mental models of what “done” means, velocity metrics become unreliable and release dates slip. A single, explicit Definition of Done brings engineers, testers, designers, and product stakeholders into alignment and prevents invisible work from leaking into future sprints.

Key benefits of a shared DoD

  1. Transparency — Every backlog item passes the same checkpoints, turning sprint charts and progress demos into truthful indicators of readiness.
  2. Predictability — Because hidden clean‑up tasks (tests, docs, security scans) are baked into each story, velocity stabilises and forecasting improves.
  3. Built‑in quality — Defects, vulnerabilities, and documentation gaps are caught upstream, reducing incident costs and customer churn.
  4. Faster onboarding — New hires grasp expectations quickly by reading the DoD rather than deciphering tribal knowledge.

Over time a well‑adopted DoD becomes both a social contract and a quality gate. If burndown stalls or incident counts spike, inspecting which DoD checkpoints were skipped often reveals the root cause.

Creating a Definition of Done

Crafting a DoD is a collaborative exercise: it must be rigorous enough to protect quality yet lean enough to fit inside a sprint cadence. Set aside a focused workshop so the whole team can debate trade‑offs and agree on realistic guardrails.

Creating a DoD in a workshop setting

A fruitful Definition‑of‑Done workshop balances divergence—surfacing pain points and quality ideas—with convergence—agreeing on a short, testable checklist. Use the guide below to facilitate a 90‑minute session for a team of six to ten people. Adjust timings for larger groups.

Preparation (the day before)

  • Collect sprint metrics: escaped defects, incident count, cycle time, and any notable customer complaints.
  • Ask participants to skim the latest production‑incident post‑mortems.
  • Print or share a digital canvas with three columns: Goals, Pains, Potential DoD items.

Agenda (90 minutes total)

  • Welcome & purpose (0–5 min): State the objective: to agree on a concise, binary Definition of Done that removes recurring pains.
  • Align on goals & pains (5–15 min): Silent storming: each person adds sticky notes for desired outcomes (goals) and frustrations (pains). Group and briefly discuss patterns.
  • Incident walk‑through (15–30 min): Review one recent incident where “done” was declared prematurely. Ask: which checkpoint would have caught this earlier?
  • Checklist brainstorming (30–45 min): In pairs, turn goals and pains into specific, observable DoD statements. Encourage action verbs: “Security scan passes” rather than “Secure.”
  • Converge & vote (45–60 min): Cluster similar items and dot‑vote on impact vs. effort. Target 5–10 high‑impact entries.
  • Draft & binary test (60–80 min): Rewrite each selected item until it is yes/no testable. “90 % unit coverage on new code” is binary; “Code well tested” is not.
  • Pilot & next steps (80–90 min): Choose a near‑term backlog item to pilot the draft DoD. Assign owners to refine ambiguous entries before the next sprint planning.
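The “binary test” idea can be made concrete: each DoD entry becomes a predicate that answers yes or no, with no room for “mostly done.” A minimal sketch in Python (the item names and metric values are illustrative; in practice the inputs would come from your CI tooling):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class DoDItem:
    description: str
    check: Callable[[], bool]  # must be yes/no testable, never "mostly"

def evaluate(items: list[DoDItem]) -> list[str]:
    """Return the descriptions of all DoD items that fail."""
    return [item.description for item in items if not item.check()]

# Illustrative metrics; real values would be read from CI reports.
coverage_on_new_code = 0.92
open_critical_issues = 0

checklist = [
    DoDItem("90% unit coverage on new code",
            lambda: coverage_on_new_code >= 0.90),
    DoDItem("No critical static-analysis issues",
            lambda: open_critical_issues == 0),
]

failures = evaluate(checklist)
print("Done." if not failures else f"Not done: {failures}")
```

Anything that cannot be expressed as such a predicate (“code well tested”) is a sign the entry needs another round of rewriting in the workshop.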

Facilitator tips

  • Keep a parking lot for off‑topic discussions; return only if time allows.
  • Use a timer to stay on schedule; unfinished items become action items, not meeting overruns.
  • Capture the draft DoD in the team’s tooling (Jira definition, README section) before energy fades.

Running a structured agenda rooted in real pains ensures the resulting DoD is both aspirational and achievable.

What are examples of what goes into a DoD?

Typical examples of what teams put into a Definition of Done:

  • Code reviewed and merged to the main branch.
  • All unit tests and integration tests passing in CI.
  • At least 80 % unit‑test coverage on new code.
  • No critical SonarQube issues.
  • Documentation updated (README, API docs).
  • Feature flag toggled off by default.
  • Security scan shows no high‑severity vulnerabilities.
  • Product owner signs off in staging.

Although every DoD is context‑specific, these items recur because they protect the fundamentals: code quality, operational safety, and user value. Treat them as a starting point, then add or remove gates to match your domain.

Resist the urge to gold‑plate

A bloated DoD slows delivery and encourages bypasses. Review it quarterly; if a checkpoint no longer adds value or can be automated, simplify.

A living DoD adapts to new risks, compliance rules, or tooling. For example, the EU AI Act introduces audit requirements that many teams now add to their DoD (e.g., “Model cards published, bias tests run”). Version each change and communicate it in sprint planning so no one is surprised by a higher bar.

Definition of Done in a Scrum setting

A well‑crafted Definition of Done becomes the quality thread that weaves through every Scrum ceremony. Without it, progress discussions default to vague status talk; with it, each event has a built‑in yardstick for whether work is truly “potentially shippable.”

  • Sprint Planning – The team selects backlog items only if it believes each can meet the DoD within the sprint. If an item looks too risky, they split the story or plan explicit tasks for the remaining DoD work.
  • Daily Scrum – Developers synchronise not just on feature progress but on DoD checkpoints (tests written, security scan queued, docs drafted). This keeps hidden tasks visible.
  • Sprint Review – Stakeholders see increments that already satisfy the DoD, avoiding the “demo surprise” where a feature looks ready but lacks hardening.
  • Sprint Retrospective – The team inspects where DoD items were skipped or proved too burdensome and adapts either the checklist or their engineering practices.

Because the DoD plays a defined role in each event, it acts like an ever‑present safety net: large enough to catch quality lapses, lightweight enough not to tangle the team’s momentum.

Definition of Done examples

Below is a comparative snapshot of how different teams tailor their DoD to domain constraints and workflow styles. Treat these examples as inspiration rather than prescription; your context will dictate which gates matter most.

  • Web‑application team: code review complete; 80 % unit coverage; lint errors = 0; end‑to‑end tests pass in CI; feature toggled via LaunchDarkly; docs site updated.
  • Mobile‑application team: build passes on iOS and Android pipelines; all UI tests green on device farm; app size increase < 2 %; store metadata prepared; crash‑report rate < 0.3 %.
  • Regulated healthcare project: peer review by two engineers; static analysis shows no high‑severity issues; audit log added for new endpoints; QA signs GxP test script; compliance officer approval.
  • Regulated fintech team: code review and pair‑programming sign‑off; encryption validated (TLS 1.3, data at rest AES‑256); SOC 2 logging enabled; performance tests meet 100 ms p95 SLA; risk officer approves release.
  • Kanban flow team: pull requests < 300 lines; automated tests pass in CI; cycle time per item < 2 days; story meets INVEST criteria; release notes added before deployment.

Across these examples, the language stays binary and observable—critical traits that prevent subjective debates at the end of an iteration. Your DoD should evolve similarly: start lean, then add or remove checkpoints as risks, tooling, and compliance demands change.

Frequently Asked Questions

What is DoR and DoD in Agile?

“DoR” stands for Definition of Ready—the checklist that tells a team a backlog item is clear enough to start—while “DoD” is the Definition of Done, the checklist that confirms the item is complete and potentially shippable. Ready guards the input of a sprint; Done guards the output.

Is a Definition of Done mandatory in Scrum?

Yes. The Scrum Guide (2020) lists the Definition of Done as a commitment for the Increment, making it a required element of the framework. Teams may start with a lightweight DoD, but they cannot ignore it entirely without breaking Scrum rules.

What is the Definition of Done in PMP (Project Management Professional)?

In PMP terminology the DoD maps to the Acceptance Criteria defined in the project’s scope-baseline documents. It specifies conditions for deliverable acceptance and quality, ensuring stakeholder sign-off aligns with the project’s quality-management plan.

Who defines “Done” in Agile?

The entire agile team collaborates to define “Done,” but the Product Owner ensures it reflects customer and business needs, while developers and testers define the technical gates that guarantee quality. The Scrum Master (or agile coach) facilitates the conversation and promotes adherence.

What is technical debt in relation to the DoD?

A weak or ignored DoD leads to technical debt because shortcuts slip into production undetected. A strict DoD helps prevent debt by enforcing code review, testing, and documentation for every change.

How does the DoD differ from a Definition of Ready?

The Definition of Ready lists conditions a backlog item must meet before entering a sprint (e.g., user story written, dependencies identified). The Definition of Done lists conditions an item must meet to exit the sprint as potentially shippable.

Who owns the DoD?

The Scrum Team owns the DoD collectively, but the Scrum Master facilitates updates and the Product Owner confirms that it aligns with market and compliance needs.

How often should we review the DoD?

Inspect the DoD in every Sprint Retrospective. Update it when new risks appear or tooling makes a quality gate cheaper.

What tools help enforce the DoD?

Continuous-integration pipelines (GitHub Actions, GitLab CI), code-review checklists, static-analysis tools (SonarQube), and automated deployment gates all enforce specific DoD steps.
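As an illustration of such an automated gate, a short script can fail the build when coverage drops below the DoD threshold. This sketch assumes a Cobertura‑style coverage.xml (the format coverage.py can emit with `coverage xml`), whose root element carries an overall line-rate attribute; the path and threshold are assumptions to adapt to your pipeline:

```python
import sys
import xml.etree.ElementTree as ET

THRESHOLD = 0.80  # DoD gate: "at least 80% unit-test coverage"

def coverage_rate(report_path: str) -> float:
    """Read the overall line-rate from a Cobertura-style coverage report."""
    root = ET.parse(report_path).getroot()
    return float(root.get("line-rate", 0.0))

# Self-contained demo: write a tiny report instead of running real tests.
with open("coverage.xml", "w") as f:
    f.write('<coverage line-rate="0.85"></coverage>')

rate = coverage_rate("coverage.xml")
if rate < THRESHOLD:
    # A non-zero exit code makes the CI pipeline fail the build.
    sys.exit(f"DoD gate failed: coverage {rate:.0%} is below {THRESHOLD:.0%}")
print(f"DoD gate passed: coverage {rate:.0%}")
```

Wiring a check like this into the pipeline turns a DoD entry from a promise into an enforced gate.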

Relevant questions to ask
  • What is the purpose of the Agile Definition of Done?
    Hint: To provide a clear set of criteria that must be met before a product or feature is considered complete.
  • What criteria should it include?
    Hint: Acceptance criteria, quality criteria, and any other gates the work must pass before release.
  • How will it be used?
    Hint: To ensure all stakeholders know the bar work must clear before it counts as complete.
  • How will it be communicated to stakeholders?
    Hint: Through meetings, emails, and other forms of communication.
  • How will it be monitored and enforced?
    Hint: By the Scrum Master or other designated team members.
  • How will it be updated and maintained?
    Hint: By the development team, typically during retrospectives.
  • What are the potential risks?
    Hint: Missing or incomplete criteria, lack of communication, and lack of enforcement.
  • How will it be integrated into the overall project plan?
    Hint: By including it in the project timeline so all stakeholders see the criteria each deliverable must meet.
Relevant books on the topic of Definition of Done
  • Lean Software Development: An Agile Toolkit by Mary Poppendieck and Tom Poppendieck (2003)
  • Succeeding with Agile: Software Development Using Scrum by Mike Cohn (2009)
  • Kanban: Successful Evolutionary Change for Your Technology Business by David J. Anderson (2010)
  • Scaling Lean & Agile Development: Thinking and Organizational Tools for Large-Scale Scrum by Craig Larman and Bas Vodde (2009)
  • Scrum and XP from the Trenches by Henrik Kniberg (2007)
