The eleventh State of Agile survey has just been published by VersionOne. These reports are invaluable in helping agile practitioners understand where their practices, problems and challenges fit into the context of the wider world.
As with all VersionOne’s previous reports, the eleventh survey paints a picture of the onward march of agile. Its progress is rarely in a straight line, however, and this latest survey has revealed a very interesting contradiction.

Increased Focus on Business Value?
When respondents were asked how the success of agile initiatives was measured, the second-ranked answer after “on-time delivery” was “business value.” In the previous report, “business value” was the fourth-ranked answer. This time, some 46 percent of respondents chose it ahead of “customer/user satisfaction” (44 percent), “product quality” (42 percent) and “product scope” (40 percent). No other answer was given by more than a quarter of respondents.
When, in the same survey, participants were asked how the success of agile projects (as opposed to initiatives) is being measured, the percentage who answered “business value” fell by half. The implication here would seem to be that the day-to-day metrics collected about the work of agile teams were not primarily focused on business value. In fact, there were eleven other measurements that scored higher than business value. Those measurements were (in ranking order):

Velocity (67 percent)
Sprint burndown (51 percent)
Release burndown (38 percent)
Planned vs. actual stories per iteration (37 percent)
Burn-up chart (34 percent)
Work in progress (32 percent)
Defects into production (30 percent)
Customer/user satisfaction (28 percent)
Planned vs. actual release dates (26 percent)
Cycle time (23 percent)
Defects over time (23 percent)

Old Muscle Memory
The list of day-to-day metrics being collected shows us the grip that the old muscle memory of waterfall still has on the agile community. The mantra of traditional project management is “Plan the Work, Work the Plan.” It assumes predictability and, consequently, the metrics that are believed to be important are those which show whether there is deviation from the masterplan. Any discrepancies are considered likely to be due to a lack of efficiency.
Velocity, for example, which is right at the top of the list above, tells us nothing about progress or success. Rather, it is a metric which is useful to the development team because it allows them to judge how much work they can pull into a sprint.
Nobody else -- with the possible exception of the product owner, who can use target velocity to estimate release dates -- should be interested in velocity. When managers try to drive up a team’s velocity, it almost always causes the defect rate to spike and the delivery of value to the customer to slow down.
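The product owner’s legitimate use of velocity described above amounts to a simple projection: divide the remaining backlog by the average velocity to get a sprint count, then convert sprints into calendar time. The sketch below illustrates the arithmetic only; the helper name and all figures are invented for this example and are not drawn from the survey:

```python
from datetime import date, timedelta

def estimate_release_date(remaining_points, velocity_per_sprint,
                          sprint_length_days, start):
    """Project a release date from remaining backlog and average
    velocity (hypothetical helper; illustration only)."""
    if velocity_per_sprint <= 0:
        raise ValueError("velocity must be positive")
    # Round up: a partially filled final sprint still takes a whole sprint.
    sprints_needed = -(-remaining_points // velocity_per_sprint)
    return start + timedelta(days=sprints_needed * sprint_length_days)

# Invented figures: 120 points remaining, velocity of 20, two-week sprints.
print(estimate_release_date(120, 20, 14, start=date(2017, 5, 1)))
# → 2017-07-24 (6 sprints of 14 days after the start date)
```

The point of the sketch is how rough the estimate is: it is only as good as the assumption that velocity stays stable, which is exactly why it belongs to the product owner’s planning and nobody else’s reporting.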
The next four measurements in the list, and that of “planned vs. actual release dates” which comes further down, are all about whether the team is working to plan. Again, these can be very useful to the team itself so that it can decide whether its own plan needs adjustment to achieve a sprint goal or a release goal. Used by anyone else, they just offer opportunities for micromanagement.
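A sprint burndown, one of the plan-tracking measurements just discussed, is simply a day-by-day comparison of the work actually remaining against an ideal straight-line descent to zero. This minimal sketch shows the calculation the team uses to spot when its own plan needs adjustment; the helper name and the numbers are invented for illustration:

```python
def sprint_burndown(total_points, remaining_by_day):
    """Compare actual remaining points with the ideal straight-line
    burndown for each day of the sprint (hypothetical helper)."""
    days = len(remaining_by_day)
    report = []
    for day, actual in enumerate(remaining_by_day, start=1):
        ideal = total_points * (1 - day / days)
        # A positive gap means the team is behind its own plan.
        report.append((day, actual, round(ideal, 1), actual - ideal))
    return report

# Invented example: a 10-day sprint starting with 40 points of work.
for day, actual, ideal, gap in sprint_burndown(
        40, [36, 34, 30, 27, 22, 18, 13, 9, 4, 0]):
    print(f"day {day}: actual {actual}, ideal {ideal}, gap {gap:+.1f}")
```

Nothing in this calculation says anything about value delivered; it only tells the team whether it is ahead of or behind the plan it set for itself, which is why it is useful inside the team and an invitation to micromanagement outside it.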
“Work in progress” and “cycle time” are useful for measuring the smoothness (or lack thereof) of the development and delivery pipeline, while “defects into production” and “defects over time” can tell us something about the quality of the product.

Progress is in the Product
In short, these can all be useful measurements, but apart from measuring customer/user satisfaction, they are at best secondary when it comes to measuring progress. The value -- and therefore the success of the project -- is in the product and nowhere else. The most crucial factors to measure are the product’s delivery and its impact on the world.
If management, stakeholders or anyone else outside the agile team wants to know the progress being made, then all they need do is show up to the sprint review, where (in Scrum, at least) they will get to see the latest increment and can suggest what might be done next. Everything else should be left to the team itself.
Please feel free to use the comments section below to tell me if you are surprised (or unsurprised) by the survey’s findings or if there are any additional key factors you use to measure your team’s progress.