6 Feb 2008

An Agile Iteration

(incorporates updates based on feedback from Gary, Phlip, and Steven on the XP Yahoo Forum)

Seven years after the Agile Manifesto, it seems there is still a lot of confusion about Agile approaches to software development. I’ve observed that many teams are unused to the rigour and discipline these approaches require. They don’t apply that rigour and then find themselves in difficulty.
Requirements on cards? No documentation? No design?...not quite.
Here is a flow for a basic Agile iteration...


An Iteration begins with agreeing what is to be done
The Whole Team (developers, customers, and other stakeholders) agree on the User Stories to be delivered in this iteration. The team may already have a good idea of what these stories will be, since stories are drawn from the Release Plan.

The stories at this point are just simple, business-focused features written on index cards. Each story may be around 2-4 days' worth of work: small enough to estimate, large enough to be interesting to the customer.

The Customer orders the stories by priority (i.e. what to do next). The Developers roughly estimate the prioritised stories. The team selects only as many stories as can reasonably be completed during the iteration.
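The selection step can be sketched as a simple greedy fill: take stories in the Customer's priority order until the next one would exceed the team's capacity for the iteration. This is only an illustration; the `Story` record, the story names, and the capacity figure below are all invented for the sketch.

```java
import java.util.ArrayList;
import java.util.List;

public class IterationPlanning {
    // A hypothetical story: just a name and a rough estimate in ideal days.
    record Story(String name, int estimateDays) {}

    // Greedy fill: take stories in the Customer's priority order until
    // the next story would exceed the iteration's capacity.
    static List<Story> selectStories(List<Story> prioritised, int capacityDays) {
        List<Story> selected = new ArrayList<>();
        int committed = 0;
        for (Story s : prioritised) {
            if (committed + s.estimateDays() > capacityDays) {
                break; // the remaining stories stay on the Release Plan
            }
            committed += s.estimateDays();
            selected.add(s);
        }
        return selected;
    }

    public static void main(String[] args) {
        List<Story> plan = List.of(
            new Story("Customer login", 3),
            new Story("Order history", 4),
            new Story("Export to CSV", 2),
            new Story("Audit trail", 4));
        // With a capacity of 8 ideal days, only the first two stories fit.
        for (Story s : selectStories(plan, 8)) {
            System.out.println(s.name());
        }
    }
}
```

In practice the capacity figure comes from the volume of work actually completed in previous iterations, not from a guess.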

A User Story is Card/Conversation/Confirmation + Implementation
Each day the Whole Team get together and briefly discuss what was achieved the previous day, today’s goals, and any problems. As the iteration progresses, the Developers and Customers have conversations to elaborate the detail of the story they are working on. A common misunderstanding is that the index card is the requirements document. That's crazy! It’s just a marker. Both the Customer and Developer confirm they have a shared understanding of the user story by writing customer tests. These customer tests document the system and act as an executable specification.
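To make "executable specification" concrete, here is a sketch of a customer test for an invented story, "a customer with an overdue invoice cannot place a new order". The `Customer` class and both scenarios are made up for illustration; teams often express such tests in a customer-facing tool rather than raw Java, but the idea is the same.

```java
// A customer test confirming the (invented) story:
// "A customer with an overdue invoice cannot place a new order."
public class OverdueCustomerTest {
    // Minimal stand-in for the production code under test.
    static class Customer {
        final boolean hasOverdueInvoice;
        Customer(boolean overdue) { hasOverdueInvoice = overdue; }
        boolean canPlaceOrder() { return !hasOverdueInvoice; }
    }

    public static void main(String[] args) {
        // The two scenarios the Customer and Developer agreed in conversation.
        Customer overdue = new Customer(true);
        Customer inGoodStanding = new Customer(false);
        if (overdue.canPlaceOrder())
            throw new AssertionError("overdue customer must be blocked");
        if (!inGoodStanding.canPlaceOrder())
            throw new AssertionError("customer in good standing may order");
        System.out.println("Customer tests pass");
    }
}
```

The test, not the index card, records what the story means, and it can be re-run at any time to confirm the system still honours it.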

Developers may break stories down into smaller development tasks. They practise "Test Driven Development" to deliver Developer Tests (e.g. JUnit tests) and Production Code together. Ongoing design is essential to keep the design simple while incorporating each new feature. This commitment to design is a built-in part of the "Test Driven Development" practice.

The “Developer Tests” produced by TDD provide a comprehensive regression test. This allows the team to move quickly and confidently, safe in the knowledge that their changes haven’t broken anything. Code is integrated into a potentially shippable build every few hours.

The teams themselves decide how they can most productively deliver software. They may pair up to produce optimum designs, or they may not.
The important thing is that the team are professionals and can decide for themselves their most appropriate approach.

Each Iteration produces production ready code
As tasks are completed, other unforeseen tasks may come up; these are simply added to the outstanding tasks and completed as part of the story. Sometimes issues come up that need clarification from the Customer. The ongoing Customer-Developer conversations are central to the shared understanding of the stories.

As stories are completed, the Customer Tests are executed. The story is not "Done" until:
- The Customer Tests for the story all pass.
- The Customer is happy that those tests cover their usage scenarios.
- The Developer is happy that the code is Good Code.

The idea here is that all work is done to a production ready quality. It is better to have fully releasable features than a larger number of “not quite production ready” features.

An Iteration finishes with a look back at what has been achieved
If there is time left in the iteration to complete another story, the team takes one on. Any stories not completed at the end of the iteration are ignored. Where there is not enough time to finish another story before the iteration ends, that time may be better spent addressing any Technical Debt the project is carrying.

At the end of the iteration Developers and Customers get together and demo the implemented User Stories. They discuss what went well, what could have been done better, and what they would like to do differently in the next iteration. The volume of work achieved in this iteration establishes what can reasonably be completed in the next iteration.