Definition of Done (DoD)
Introduction
The Definition of Done (DoD) establishes the criteria that must be satisfied before any work item is considered complete. It serves as a shared checklist that ensures consistent quality standards across all deliverables and creates a common understanding among team members about what "done" truly means.
Purpose
The purpose of this Definition of Done is to:
- Ensure all work items meet the same high-quality standards before being released
- Provide clear completion criteria for developers, testers, and stakeholders
- Minimize technical debt and prevent quality issues from accumulating
- Create transparency and consistency in the development process
- Support continuous improvement by establishing measurable quality gates
Definition of Done Criteria
For a work item to be considered "Done," it must meet all of the following criteria that apply to its type of work:
1. Code Quality
- All code follows established coding standards and best practices
- Code is clean, maintainable, and follows SOLID, DRY, and KISS principles
- All code is properly commented and documented where necessary
- No commented-out code or debug statements remain in the codebase
- Code uses appropriate design patterns consistent with the architecture
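The separation-of-concerns and DRY bullets above can be sketched in miniature. This is an illustrative example only; the `remaining_uses`/`format_remaining` names are hypothetical, not from any real codebase:

```python
# Illustrative sketch: pure business logic kept apart from presentation,
# so each can be tested and reused independently (separation of concerns),
# and the calculation lives in exactly one place (DRY).
def remaining_uses(limit: int, used: int) -> int:
    """Business logic: how many uses a link has left (never negative)."""
    return max(limit - used, 0)

def format_remaining(limit: int, used: int) -> str:
    """Presentation: renders the same logic for display without duplicating it."""
    left = remaining_uses(limit, used)
    return f"{left} use{'s' if left != 1 else ''} remaining"
```

Because the display layer calls the logic function rather than re-implementing the subtraction, a unit test of `remaining_uses` covers both callers.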
2. Testing
- All acceptance criteria have been verified and fulfilled
- Unit tests are written for all new code with appropriate coverage
- Integration tests validate component interactions where applicable
- End-to-end tests cover critical user flows
- Manual feature testing has been completed successfully
- Exploratory testing has been performed where appropriate
- No high or critical bugs remain unresolved
- For any bugs discovered after issue closure, the root cause must be covered by tests at the lowest appropriate level (unit tests preferred) to prevent regression
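As a sketch of the last bullet, pinning a post-closure bug at the unit level: assume a hypothetical `slugify` helper whose past bug (consecutive separators producing doubled hyphens) was discovered after the issue was closed. The helper and the bug are invented for illustration:

```python
import re

# Hypothetical helper used only for illustration; the real feature code
# is not shown in this document.
def slugify(title: str) -> str:
    """Turn a human-readable title into a URL-safe slug."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.strip().lower())
    return slug.strip("-")

# Regression test at the lowest appropriate level (unit), as the DoD
# requires for bugs found after issue closure.
def test_consecutive_separators_collapse_to_one_hyphen():
    assert slugify("Express  Links -- Creator") == "express-links-creator"

test_consecutive_separators_collapse_to_one_hyphen()
```

Once this test is in the suite, the CI pipeline (section 6) blocks any change that reintroduces the bug.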
3. Code Review
- Code has been reviewed by at least one other developer
- All review comments and suggestions have been addressed
- The pull request includes appropriate documentation and context
- The pull request description explains the purpose, approach, and testing methodology
4. Documentation
- API documentation is updated for any new or modified endpoints
- User documentation is created or updated if user-facing features change
- Architecture decisions are documented if the implementation introduces significant changes
- Data model and database schema documentation is updated when the implementation changes either
5. Technical Requirements
- The implementation is secure and follows security best practices
- Performance has been considered and tested where applicable
- User interfaces follow established accessibility standards
- The feature is compatible with supported browsers and devices
- The implementation handles errors gracefully with appropriate logging
6. DevOps
- Code builds successfully in the CI pipeline
- All automated tests pass in the CI environment
- Feature is deployed and verified in a staging/test environment
- No new linting errors or warnings are introduced
- Database migrations (if any) are tested and verified
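The DevOps gates above amount to an all-or-nothing check: a single failing gate blocks "Done." A minimal sketch, where the gate names mirror the checklist rather than any real CI system's API:

```python
from dataclasses import dataclass

# Gate names are illustrative, taken from the DevOps checklist above,
# not from a real CI system.
@dataclass
class CiResult:
    build_succeeded: bool
    tests_passed: bool
    lint_clean: bool
    staging_verified: bool
    migrations_verified: bool = True  # trivially true when there are no migrations

def devops_gates_met(result: CiResult) -> bool:
    """A work item clears the DevOps gates only if every check holds."""
    return all((
        result.build_succeeded,
        result.tests_passed,
        result.lint_clean,
        result.staging_verified,
        result.migrations_verified,
    ))
```

The `all(...)` expression is the point: the gates are conjunctive, so no amount of passing checks compensates for one failure.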
7. Product Verification
- Product Owner/Manager has reviewed and approved the implementation
- The feature fulfills all business requirements and user stories
- The solution works in all required scenarios and edge cases
- The implementation is consistent with the product design and user experience
Feature-Specific Definition of Done
Based on the QA Test Strategy, the following additional criteria apply to specific feature types:
For UI Features
- Components render correctly across all supported screen sizes
- UI is consistent with design specifications and style guidelines
- User interactions (clicks, forms, navigation) work as expected
- Loading states, error states, and empty states are handled appropriately
- Animations and transitions are smooth and performant
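The loading/error/empty-state bullet can be expressed as a tiny state-selection function. The precedence shown (loading over error over empty over data) is an assumption for illustration, not a mandated design:

```python
from typing import Optional

# Illustrative only: picks which of the four UI states a list view should
# render, with an assumed precedence of loading > error > empty > data.
def view_state(loading: bool, error: Optional[str], items: list) -> str:
    if loading:
        return "loading"
    if error is not None:
        return "error"
    if not items:
        return "empty"
    return "data"
```

Keeping the decision in one pure function makes the "handled appropriately" bullet directly unit-testable, independent of any rendering framework.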
For API Features
- API endpoints return correct status codes and response formats
- Authentication and authorization controls are properly implemented
- API documentation is updated with new endpoints, parameters, and responses
- API performance is within acceptable limits
- All API endpoints are tested with appropriate test cases
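A hedged sketch of the status-code and response-format checks: the `201 Created` expectation and the `id`/`url`/`expires_at` fields are assumptions about a hypothetical link-creation endpoint, not a documented contract:

```python
# Hypothetical response contract for illustration only.
EXPECTED_FIELDS = {"id", "url", "expires_at"}

def validate_create_link_response(status_code: int, body: dict) -> list:
    """Return a list of contract violations; an empty list means the response conforms."""
    problems = []
    if status_code != 201:  # resource creation is expected to return 201 Created
        problems.append(f"expected status 201, got {status_code}")
    missing = EXPECTED_FIELDS - body.keys()  # set difference against the response keys
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    return problems
```

An API test would call the real endpoint and assert that this kind of checker returns no violations, which covers both the status-code and response-format bullets at once.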
For Database Changes
- Data migrations have rollback plans if applicable
- Database schema changes are documented
- Indexes are created for performance where needed
- Data flows that require transactional boundaries are wrapped in database transactions
- Data integrity constraints are maintained
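The transaction and integrity bullets can be illustrated with the standard library's `sqlite3` module, where a `with conn:` block commits on success and rolls back on any exception. The `links` table and slugs are illustrative only:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE links (id INTEGER PRIMARY KEY, slug TEXT UNIQUE NOT NULL)")
conn.commit()

def create_links_atomically(conn: sqlite3.Connection, slugs: list) -> None:
    """Insert all slugs or none: the transaction is the rollback plan."""
    try:
        with conn:  # commits on success, rolls back on any exception
            conn.executemany(
                "INSERT INTO links (slug) VALUES (?)", [(s,) for s in slugs]
            )
    except sqlite3.IntegrityError:
        pass  # a duplicate slug violated UNIQUE; the whole batch was rolled back

create_links_atomically(conn, ["a", "b"])  # commits both rows
create_links_atomically(conn, ["c", "a"])  # "a" collides, so "c" is rolled back too
count = conn.execute("SELECT COUNT(*) FROM links").fetchone()[0]
```

Because the second batch fails partway through, the rollback also removes the already-inserted "c" row, which is exactly the integrity guarantee the checklist asks for.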
Verification Process
During the development and QA processes, the team will verify each work item against this Definition of Done. The verification process includes:
- Developer Self-Check: Developers verify their work against the DoD before submitting for review
- Code Review Check: Reviewers ensure the work meets all DoD criteria
- QA Verification: Quality team verifies all applicable testing criteria are met
- Product Verification: Product team confirms business requirements are satisfied
Any work that does not meet all applicable DoD criteria should not be accepted as complete and will require additional work before it can be merged and deployed.
Example of Complete Work
Here's an example that demonstrates properly completed work that meets the Definition of Done criteria:
Feature: Express Links Creator for Booking Flow
Completion Evidence:
- Code Quality: Following SOLID principles, the feature was implemented with clean separation of concerns between UI components and business logic using the Component/Hook Pattern.
- Testing:
  - Unit tests cover all critical functions with 85% coverage
  - E2E tests verify the complete flow from creation to usage
  - Manual testing performed on Chrome, Firefox, and Safari
  - No critical or high-priority bugs remain
  - After the previous closure of this issue, unit tests were added to prevent future regressions
- Code Review: Pull request #123 was reviewed by two team members with all feedback addressed
- Documentation:
  - API endpoints documented in Swagger
  - Component usage examples added to Storybook
  - User guide updated with new feature instructions
- Technical Requirements:
  - Security review completed with no issues
  - Functional requirements met
  - Performance requirements met
  - Accessibility requirements met
- DevOps:
  - Deployment verified on staging environment
  - CI pipeline passes all checks
  - Feature flag implemented for controlled rollout
- Product Verification:
  - Product Manager approved implementation
  - Feature demo conducted with stakeholders
  - All acceptance criteria met and verified
Continuous Improvement
The Definition of Done is a living document that will evolve as our processes mature and as we identify opportunities for improvement. Team members are encouraged to suggest additions or modifications to the DoD during retrospectives or whenever process gaps are identified.