Definition of Done

A shared checklist of activities that must be completed before any work item is considered finished. The Definition of Done ensures consistent quality across the team by making quality standards explicit and non-negotiable.

Unlike acceptance criteria, which are specific to individual stories, the Definition of Done applies to every work item the team delivers. Common items include code review completed, unit tests passing, documentation updated, and deployment to staging verified. The DoD evolves over time as the team identifies recurring quality issues and adds preventive checks.

For AI product teams, the Definition of Done should include AI-specific quality gates: model evaluation against a holdout test set, bias and fairness checks, latency benchmarks met, fallback behavior tested, and monitoring dashboards configured. Growth teams should advocate for including analytics verification in the DoD, confirming that events fire correctly and dashboards reflect the new feature before it is marked complete. Without these checks in the DoD, teams frequently ship AI features that work functionally but lack the observability needed to evaluate their impact and iterate. A robust DoD prevents the accumulation of invisible debt that makes future optimization harder.
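Gates like these can be automated so a work item cannot be marked done while any check fails. The sketch below is a minimal, hypothetical illustration: the check names, thresholds (90% holdout accuracy, 300 ms p95 latency), and event names are assumptions for the example, not a standard; a real team would wire each gate to its own evaluation and monitoring tooling.

```python
# Hypothetical sketch of automating AI-specific DoD gates.
# All thresholds and names below are illustrative assumptions.

def check_holdout_accuracy(metrics: dict, minimum: float = 0.90) -> bool:
    """Gate: model evaluation against a holdout test set."""
    return metrics.get("holdout_accuracy", 0.0) >= minimum

def check_latency(metrics: dict, budget_ms: float = 300.0) -> bool:
    """Gate: p95 latency benchmark met."""
    return metrics.get("p95_latency_ms", float("inf")) <= budget_ms

def check_analytics_events(fired: set, required: set) -> bool:
    """Gate: analytics verification -- every required event fires."""
    return required <= fired

def definition_of_done(metrics: dict, fired: set, required: set):
    """Run every gate; the item is done only if all gates pass."""
    gates = {
        "holdout_eval": check_holdout_accuracy(metrics),
        "latency": check_latency(metrics),
        "analytics": check_analytics_events(fired, required),
    }
    failed = [name for name, passed in gates.items() if not passed]
    return len(failed) == 0, failed

# Example: eval and latency pass, but one analytics event never fired,
# so the item is not done -- exactly the "invisible debt" case above.
done, failed = definition_of_done(
    metrics={"holdout_accuracy": 0.93, "p95_latency_ms": 240.0},
    fired={"feature_viewed"},
    required={"feature_viewed", "feature_used"},
)
```

Returning the list of failed gates, rather than a bare boolean, makes it clear to the team which quality standard blocked completion.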