
Agile Velocity: The Numbers All Go To 11

Jim Highsmith recently posited that "velocity is killing agility!", which is kind of a fun hypothesis.  Jim observes that the company leaders he talks with around the world these days are a little too quick to measure the effectiveness of their agile software development teams by keeping track of the teams' velocity (the average amount of estimated software effort the team delivers per delivery iteration).
[Image source: http://startofanadultlife.tumblr.com/post/6847194092/nigel-tufnel-the-numbers-all-go-to-eleven-look]
This is quite ironic, of course, since one of the most rudimentary things you learn when you first study agile software development is that "velocity" is measured in abstract terms like "points" or "ideal hours."  The numbers are relative to each other, mapping points to time is a fluid process, and the mapping is only valid for one team at a time.  The idea of treating velocity as a precise, comparable number is so absurd, in fact, that there is an entire web site devoted to the concept of estimation using fruit (cherry for very small amounts of effort; watermelon for large amounts).  If each team chooses different themes (fruit, mammals, buildings in Chicago, planets), one can see even more clearly that trying to compare one team to another is a recipe for disaster.
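To make that concrete, here is a minimal sketch in Python (hypothetical teams, made-up iteration totals) of what velocity actually is--an average of a single team's own relative units--and why the resulting numbers cannot be compared across teams:

```python
# A minimal sketch with invented numbers: velocity is just the average of the
# estimated points a single team completed per iteration, in that team's own
# relative units.
from statistics import mean

def velocity(points_per_iteration: list[float]) -> float:
    """Average estimated effort delivered per iteration, in the team's own units."""
    return mean(points_per_iteration)

# Team A estimates in story points; Team B happens to use a "fruit" scale that
# it has mapped to numbers for its own internal use only.
team_a_iterations = [5, 7, 6, 5]       # story points completed per iteration
team_b_iterations = [21, 18, 24, 20]   # "fruit" units completed per iteration

print(velocity(team_a_iterations))     # ~5.75: meaningful only against Team A's history
print(velocity(team_b_iterations))     # ~20.75: meaningful only against Team B's history

# Comparing 5.75 to 20.75 tells you nothing about which team delivers more value;
# the scales are different by design.
```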

But of course these executives aren't always being quite so blunder-headed with their metrics as to compare one team to another.  Instead, as Jim describes it, they:
  • Try to get teams to deliver faster and faster--if you delivered 5 points in this iteration, try for 6 next time.  Or if your average team velocity was 5 in this project, keep the team together and try to get up to an average of 8 in the next.
  • Evaluate effort solely in terms of the software's value at first release to the market--if you measure your effectiveness by "how quickly you get to feature-complete," you quickly lose track of important things like "how quickly can you change" with the market, "how much does it cost" to maintain the thing, and even "how delighted are customers to use the application."
  • Lose sight altogether of the actual business bottom line.  In real life, software developers are being paid to deliver value to their home businesses, whether measured in increased revenue, decreased cost, increased customer satisfaction, decreased risk of government regulation non-compliance, increased greenness, or anything else in the host organization's balanced scorecard.
These leaders are falling into the classic "it goes to 11" trap made famous by Christopher Guest's character, Nigel Tufnel, in the immortal movie, Spın̈al Tap.  Tap aficionados will remember that Nigel, lead guitarist for the band, is very proud of his amplifier, specially produced with control knobs that go to "11," not just "10."  Nigel doesn't even understand the question "but why wouldn't you just get a louder amp?" so pleased is he that his goes to 11.

But what is a leader to do, if she wants to measure the productivity of her IT staff?  You need to figure out who to promote and who to put in the basement without a stapler, after all.  I would recommend the following, taking Jim's points in reverse order:
  • Measure the value generated by software investments.  This is not new--it's Jim's point in his velocity blog post, and in it, he also cites Martin Fowler's 2003 bliki post, "CannotMeasureProductivity," on this exact point.  At the end of the day, a business is in business to create value, not to ensure its developers are working at an abstract maximum capacity.  If Team A's software generated $10 million in value over two years, and Team B's software generated $1 million, and both efforts cost the same, then it would be worthwhile asking some questions about why one effort was so much more valuable to the business than the other (a rough worked example of this arithmetic follows this list).  You may get a range of interesting answers, some having to do with how well the team is working, and some having to do with the overall value of the concept they were delivering.
  • Evaluate your return on investment over the life of the product, not just one quarter.  In IT just as in other investments, leaders often think too much in the immediate term.  Certainly, it's a good idea to rush to market with software when you can get a first-mover advantage.  Even in this case, however, your IT investment should be made with an eye to the long term.  What will you do after you have jumped into the market with the first offering of this kind?  Have you positioned yourself to maintain your lead, because you have what Jim calls a "supple delivery engine"?  Or have you worn yourself out, and put an idea on the market which can easily be copied by others?  Will a competitor with a robust software delivery methodology quickly leapfrog you and pull ahead?  Unless you're on the brink of insolvency, you need to look at the expected return on investment for the life of the product in the marketplace, not just your profit and loss over this budget year.
  • Balance each software-development-specific metric with a complementary one.  You may have good reasons to measure the amount of software a team is developing.  Perhaps you have concerns about specific individuals whom you feel aren't carrying their weight.  Never say never, I say.  But if you are going to measure velocity, then make sure you measure other things as well, including quality (fit of software to use, and actual failures of software to work as intended) and complexity (is the software impossible to change because it's so poorly written?).  These three metrics balance each other out to some degree, and teams should hit minimum standards in all three dimensions, not just one (see the sketch after this list).  If you ask for only speed, that's all you're going to get, and you won't like it.
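Here is the rough worked example promised above, with invented figures: the interesting number is the value generated over the product's life per dollar spent, not the points burned per iteration.

```python
# A rough sketch with invented figures: compare two software investments by the
# value they generate over the product's life, per dollar spent, rather than by
# how many points either team delivered.

def lifetime_value_per_dollar(yearly_value: list[float], total_cost: float) -> float:
    """Total value generated over the product's life, divided by what it cost."""
    return sum(yearly_value) / total_cost

# Both efforts cost the same; only the value generated differs.
team_a = lifetime_value_per_dollar(yearly_value=[4_000_000, 6_000_000], total_cost=2_000_000)
team_b = lifetime_value_per_dollar(yearly_value=[600_000, 400_000], total_cost=2_000_000)

print(f"Team A returned ${team_a:.2f} per dollar invested")  # $5.00
print(f"Team B returned ${team_b:.2f} per dollar invested")  # $0.50
# The interesting question is *why*: how well was the team working, and how
# valuable was the concept it was asked to deliver?
```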
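And for the last point, a sketch of what "minimum standards in all three dimensions" might look like; the metric names and thresholds below are invented for illustration, not a prescription.

```python
# A sketch of balancing velocity against quality and complexity, with invented
# metric names and thresholds. A team "passes" an iteration only if it meets
# the minimum standard in every dimension, not just speed.
from dataclasses import dataclass

@dataclass
class IterationReport:
    velocity: float          # points delivered, in the team's own relative units
    escaped_defects: int     # failures of the software to work as intended
    avg_complexity: float    # e.g., mean cyclomatic complexity of changed code

def meets_minimum_standards(report: IterationReport) -> bool:
    checks = {
        "velocity": report.velocity >= 5,           # enough throughput
        "quality": report.escaped_defects <= 2,     # few enough escaped defects
        "complexity": report.avg_complexity <= 10,  # the code can still be changed
    }
    failed = [name for name, ok in checks.items() if not ok]
    if failed:
        print("Below minimum standard on: " + ", ".join(failed))
    return not failed

# Fast but sloppy: velocity is up, while quality and maintainability have slipped.
print(meets_minimum_standards(IterationReport(velocity=9, escaped_defects=7, avg_complexity=18)))
```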
Jim proposes that agilists have been too quick to give all of the decision-making power to the Product Owner from the business side, and he suggests remedying this problem by instead creating a decision-making team of one person from the business and one from IT.  I'm not sure I agree with this solution, since it's extremely powerful to have one person (the person who will live with the results of the decision) calling the shots.  However, I do think that if business people begin to embrace the notion that software quality has actual P&L ramifications for the business, they will naturally want to consult the tech lead about what will create the best business results.
Please read Jim's post--he suggests other really good things, like using value points to measure software value as delivered, rather than focusing on the effort it took to get there.  As always, there is a lot there.
