Metrics drive behavior
Any metric you can buy your way out of…
Is probably not a useful metric to measure yourself by.
If it’s important and you can spend money to fix it, by all means, go do that.
But the helpful metrics are the ones where cash isn’t the solution.
—Seth Godin
When starting out with metrics and KPIs, consider…
- If this went up, what would we do differently?
- If this went down, what would we do differently?
- Given your answers, why do you need the metric?
Metrics for team coaching. Always be thinking:
- If we improve this, what might we degrade?
- Improvement takes time; how will we know when the impact begins?
- How much improvement on this metric is enough, or too much? What signs do we look for?
- —Troy Magennis
The first things I'd check to diagnose bad quality:
- how much work is in progress;
- what percentage of the tasks have deadlines;
- how big the tasks are; and
- whether there are any hand-offs.
From my experience, nothing hurts quality more than those four. —Dimitri Kraivanov
5 Basic Metric Rules:
- Respect individual safety.
- Show trends.
- Compare data in context.
- Highlight the unusual.
- Balance the focus—the four balancing pillars are usually:
- Quality
- Productivity
- Predictability, and
- Responsiveness.
(@t_magennis @elabor8)
Metrics are not a yardstick to control and manage people; they serve to bring transparency to the teams.
Ryanair cancelled 40 to 50 flights per day for six weeks to meet its internal “punctuality” metric, rather than focusing on the customers’ fitness-for-purpose criterion: the on-time arrival metric. Ryanair noticed that punctuality, a vanity metric, had fallen below 80% and that cancelling less than 2% of its flights, affecting up to 285,000 passengers, would help it hit its annual punctuality target of 90%.
To do: Add quotes on behavior and control from Don’t Just Do Something, Stand There!
- The purpose of measurements is to motivate the parts to do what is good for the organization as a whole.
- Good metrics are enlightening and help us make better decisions.
- The most powerful statement managers can make about what is important in the organisation lies in what they choose to measure.
- It is difficult to detect improvements or reductions in productivity because we feel busy regardless of our output. That’s why measurement is important.
- Measurements aren't about achieving certainty. They are about reducing uncertainty.
- Every metric is a vanity metric if you are not using it to drive behavior.
- Metrics are people, too.
- It’s better to be roughly right than precisely wrong.
- Everything that can be counted does not necessarily count; everything that counts cannot necessarily be counted.
- What gets measured is what gets done.
- You can game any metric.
- Don't measure anything unless the data helps you make a better decision or change your actions.
- If you're not prepared to change your diet or your workouts, don't get on the scale.
- —Seth Godin, via
- Even noisy data is better than no data.
- Data always trumps opinion.
- Without data, assumptions fill the void.
- When benefits are not quantified at all, assume there aren’t any.
Seth Godin » Analytics without action
Meten leidt tot weten (?!), Dutch for “measuring leads to knowing.”
From The Lean Startup (Eric Ries), good metrics follow the three A’s; they are:
- Actionable
- Accessible
- Auditable
Use behaviour-driven metrics—metrics that deal with people and their actions with the system, e.g.
- downloading the product;
- logging into the product;
- posting and sharing a picture; and
- commenting on a picture.
Sounds like user stories.
Use an actionable behavioural-metrics one-pager to settle product arguments throughout the organization; a minimal counting sketch follows below.
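As a rough illustration of behaviour-driven metrics, the sketch below counts behaviour events from a product event log. The event names and log format are invented for the example, not taken from any particular analytics tool; the point is measuring what users do, not what teams output.

```python
from collections import Counter

# Hypothetical event log exported from product analytics: (user_id, behaviour) pairs.
events = [
    ("u1", "download_product"),
    ("u1", "login"),
    ("u2", "login"),
    ("u2", "post_picture"),
    ("u1", "comment_picture"),
    ("u3", "login"),
]

# How often each behaviour occurred.
behaviour_counts = Counter(behaviour for _, behaviour in events)

# How many distinct users showed each behaviour.
users_per_behaviour = {
    behaviour: len({user for user, b in events if b == behaviour})
    for behaviour in behaviour_counts
}

print(behaviour_counts)     # e.g. Counter({'login': 3, 'download_product': 1, ...})
print(users_per_behaviour)  # e.g. {'login': 3, 'download_product': 1, ...}
```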
METRICS:
- Measure Everything That Results In Customer Satisfaction
Have a look at pirate metrics (AARRR: acquisition, activation, retention, referral, revenue).
In other words, metrics drive behavior, possibly related to the Observer Effect.
Therefore:
- Ensure that all metrics lead to customer delight and value creation, potentially boosting the net promoter score, either directly or indirectly.
- Pick and design your metrics with great care as they drive human behavior, and therefore that of the organization.
- Use metrics to guide improvements that accelerate the organization or ecosystem as a whole, not to measure the activities of individual people. That is, pick metrics that optimize the whole on multiple levels of scale or granularity, e.g. epic, feature, and story level.
- Limit metrics to numbers that quantify an outcome, or the quality of an input that is key to the quality of an outcome.
- Select two to three universal metrics that apply to the whole unit, division or all teams.
- Use Occam's razor to measure only the necessary things.
- Aim the metric to maximize value creation.
- Capture the metric in Planguage to quantify quality; a sketch of such a definition follows after this list.
- Keep your metrics elegant, terse, and to the point.
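As a sketch of what “capture the metric in Planguage” could look like, the structure below borrows the Planguage keywords Tag, Ambition, Scale, Meter, Past, Goal, and Fail from Gilb and expresses them as plain data. The concrete metric, scale wording, and numbers are invented for illustration, not a full Planguage specification.

```python
from dataclasses import dataclass

@dataclass
class MetricSpec:
    """A metric captured with Planguage-style fields (Tag, Ambition, Scale, Meter, Past, Goal, Fail)."""
    tag: str        # short, unique name for the metric
    ambition: str   # the intent, in one sentence
    scale: str      # the unit and direction of measurement
    meter: str      # how, and how often, the measurement is taken
    past: float     # benchmark: where we are today
    goal: float     # the level we commit to reach
    fail: float     # the level at which we consider the outcome a failure

lead_time = MetricSpec(
    tag="Lead-Time",
    ambition="Shorten the time from pulling an item to its ka-ching moment.",
    scale="Calendar days from 'pulled into the team' to 'running in production', per item.",
    meter="85th percentile over the last 30 delivered items, recalculated weekly.",
    past=21.0,
    goal=10.0,
    fail=30.0,
)
```

Writing the Scale and Meter down explicitly is what keeps the metric auditable and comparable over time.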
The worst thing that can happen with metrics is that people start feeling beaten up, which means that they will start hiding data. If management does not understand the constraints, because the teams are concerned about getting beaten up for not being on time, then you never know where to help or how to move resources around. If teams see management as helping to move resources around and provide creative and helpful changes, the issues will be made visible early, and you can respond and adjust in real time. This leads to a positive reinforcing cycle that is key to the agile management approach. (Source: A Practical Approach to Large-Scale Agile Development: How HP Transformed LaserJet FutureSmart Firmware (Agile Software Development Series))
- Metrics without goals are naked—metrics should always give you a clue about where you are relative to your goals. So, find out where you are, how you measure that, and where you want to be. Set up a metric that tracks your progress.
- Balanced Metrics—Many metrics only focus on operational excellence. This creates a biased view of reality and distorts the organization as a whole. Therefore, balance metrics across the following dimensions:
  - Operational Excellence
    - Velocity
    - Burn rate
    - Predictability
    - Sustainability
    - Meeting deadlines
    - Staying within budget
    - Meeting quality requirements
    - A cohesive set of user stories in a sprint
  - User Orientation
    - Availability
    - Performance
    - User satisfaction (net promoter score, NPS); see the NPS sketch below
  - Business Value
    - Business value per € of development
    - Business value realization
  - Future Orientation
    - Enthusiasm and motivation
    - Happiness index
    - Educational opportunities
    - Vision for future development
    - Innovative governance
- Have participants brainstorm on metrics and put each of them in the most appropriate category. Next, pick one or two from each category to create balance.
- Consider using Planguage to capture metrics in a solid, comprehensive, and consistent way.
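The net promoter score listed above under user orientation has a simple, well-known formula: the percentage of promoters (ratings 9-10) minus the percentage of detractors (ratings 0-6). A minimal sketch, with invented survey responses:

```python
def net_promoter_score(ratings):
    """NPS on 0-10 survey ratings: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Hypothetical responses to "How likely are you to recommend us?"
print(net_promoter_score([10, 9, 8, 7, 6, 10, 3, 9, 8, 10]))  # -> 30
```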
Vanity Metrics
The book The Lean Startup by Eric Ries defines so-called vanity metrics: numbers that may look impressive but do not inform decisions or actions.
Useful Metrics
Donald Reinertsen, author of Managing the Design Factory, states that a good, useful metric is:
- Simple—The ideal metrics are self-generating in the sense that they are created without extra effort in the normal course of business.
- Relevant—One test of relevance is whether the metric focuses on things that are actually controllable by the people being measured. Psychologists have found that when people think they can control something, they are more motivated to control it. Measuring people on things they cannot control simply causes stress, dissatisfaction, and alienation. Also, metrics must be relevant to the end goal.
- Leading—Managers like leading indicators and prefer an imperfect forecast of the future to a perfect report on the past. It is better to measure the size of a test queue than the processing times of individual tests, because test queue size is a leading indicator of future delays in test processing. Accountants like lagging indicators that can be measured very accurately, but these point to things that are already in the past.
- Self-generating—Metrics are created without extra effort in the normal course of business, as a spin-off of daily activities.
All metrics can be leading or lagging relative to some other metric. The important question is whether you can describe a coherent relationship between them.
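One way to make the test-queue example concrete is Little's Law (average items in the system = arrival rate × average time in the system), which the text above does not name explicitly but which is the usual justification: in a stable system, today's queue size already tells you the delay that newly arriving work will see. A rough sketch with invented numbers:

```python
def expected_queue_delay(queue_size, completions_per_day):
    """Little's Law (L = lambda * W) rearranged to W = L / lambda: in a stable
    system, the current queue size divided by the completion rate estimates the
    waiting time a newly queued item will experience."""
    return queue_size / completions_per_day

# Hypothetical test lab: 30 tests waiting, 5 tests completed per day.
print(expected_queue_delay(30, 5))  # -> 6.0 days before a newly queued test starts
```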
KPIs
David J. Anderson on KPIs: Measure what matters to customers! All KPIs should be recognizable to your customer. Fit-for-purpose service delivery KPIs:
- lead time;
- quality;
- predictability;
- conformance;
- safety.
KPIs:
- help to give direction
- are meaningful for everyone
- focus on trends rather than absolute numbers
- focus on quality (speed will follow)
- drive improvements in the way of working
- challenge everyone to improve performance
- are tied to a strategic objective
- contribute to vision and mission
- provide neutral and objective information, not judgement
- flow top down and bottom up
- are concrete but not a target
- have at least one bound
- are leading indicators
- give insight into and answer the question “Are we heading in the right direction?” at the strategic (company), tactical (unit), and operational (team) levels;
- are reflected in the various flow state admission criteria (e.g. Definition of Ready, Definition of Done).
Good KPIs are:
- accessible
- transparent—visible to everyone
- simple
- understandable
- actionable at the lowest organizational levels (team, individual)
KPI don’ts:
- Do not use KPIs to compare teams or units;
- Do not use or design KPIs to judge;
- Do not create KPIs that threaten or scare people;
Potential KPI trends you may want to track:
- ka-ching moments—every time an item is put into production, a point is scored; higher is better; the trend should be upwards as the team speeds up and gels;
- lead time distribution:
- average time between the moment an item is pulled into the team and its ka-ching moment; shorter is better; trend should be downwards;
- number of outliers (should decrease)
- throughput—running average of completed items per time period (week, month, quarter, year); as the team speeds up, the trend should follow upwards;
- happiness index—drives speed improvements; higher is better; trend should be upwards;
- due date performance:
- For the most recent month and for the year to date;
- Optional year-on-year (or 12 months ago);
- flow efficiency:
- sum of work (touch) time divided by the sum of total lead time (work plus wait) for all items;
- a good indicator of the waste in the system; a computation sketch follows below;
- defect rate (a.k.a. bugs):
- Defects represent opportunity cost and affect the lead time and throughput of the system.
- Report the number of escaped defects as a percentage against the total WIP and throughput.
- Keeping the number of bugs between 0 and 20 is a good policy for most projects.
- Key questions:
- Why is the number of new defects increasing? Did you relax some QA policies?
- How did the high level of bugs in week 20 affect cycle time?
- What was the impact on the cumulative flow diagram when the number of bugs increased?
- Over time, work to make the defect rate fall to close to zero.
- less is better;
- trend should be downwards;
- blocked items:
- Blocked items have serious long-term effects on the system.
- A team’s ability to quickly solve issues says a lot about the team’s performance and effectiveness.
- Blocked items should always be visible on the board.
- Tracking the status over time is usually a good way of knowing whether the team is moving in the right direction.
- fewer blockers is better; downward trend;
- failure load:
- Failure Load (amount of rework) is a good indicator that you are improving as a whole organization and thinking at a system level.
- Failure load tracks how many work items you process because of earlier poor quality—how many work items are production defects or new features that have been requested through your customer-service organization because of poor usability or a failure to anticipate user needs properly.
- Ideally, Failure Load should fall over time.
- less rework is better; downward trend preferred;
The operations review is a monthly feedback loop for a number of these metrics.
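A rough sketch of how several of the trends above (average lead time, throughput, flow efficiency, failure load) could be computed from a team's completed work items. The item records and field names are hypothetical; real data would come from the board or ticketing system.

```python
from datetime import date

# Hypothetical completed items for one month: when each was pulled into the team,
# when it reached its ka-ching moment, how many days it was actively worked on,
# and whether it was rework (failure load).
items = [
    {"pulled": date(2024, 5, 1), "done": date(2024, 5, 9),  "work_days": 3, "rework": False},
    {"pulled": date(2024, 5, 2), "done": date(2024, 5, 20), "work_days": 4, "rework": True},
    {"pulled": date(2024, 5, 6), "done": date(2024, 5, 15), "work_days": 2, "rework": False},
]

lead_times = [(item["done"] - item["pulled"]).days for item in items]

avg_lead_time = sum(lead_times) / len(lead_times)       # shorter is better, trend down
throughput = len(items)                                 # completed items in the period, trend up
flow_efficiency = sum(i["work_days"] for i in items) / sum(lead_times)  # work time / total lead time
failure_load = sum(i["rework"] for i in items) / len(items)             # share of rework items, trend down

print(f"avg lead time: {avg_lead_time:.1f} days")
print(f"throughput: {throughput} items")
print(f"flow efficiency: {flow_efficiency:.0%}")
print(f"failure load: {failure_load:.0%}")
```

Each of these comes straight from item start and finish dates already on the board, so they are self-generating in Reinertsen's sense.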
Sources
- Use Notion » 13 Essential Software Development Metrics to Ensure Quality
- Product Craft » René Rosendahl » Not All Metrics Have to Be Actionable (Gasp!)
- Twitter » Omer van Kloeten » Quantifying Quality
- Focused Objective » Troy Magennis » The Economic Impact of Software Development Process Choice—Cycle-time Analysis and Monte Carlo Simulation Results
- SlideShare » Troy Magennis » Agile 2014 Software Moneyball
- Scrum Log Jeff Sutherland » Jeff Sutherland » Scrum Metrics for Hyperproductive Teams
- Scrum Log Jeff Sutherland » Jeff Sutherland » Happiness Metric - The Wave of the Future
- Dropbox » David J. Anderson » Lean Risk Management—Options, Liquidity & Hedging Risk using Kanban Systems, about Liquidity, WIP, Lead Time, Cycle Time, Process Efficiency.
- InfoQ » Sean McHugh » How To Not Destroy your Agile Team with Metrics
- Pyzdek Institute » Thomas Pyzdek » Gaming the Metrics
- Leading Answers » Mike Griffiths » Smart Metrics Slides
- Martin Fowler » Martin Fowler » An Appropriate Use of Metrics
- The Risk Manager » Chris Matts » Outcome based Process Metrics—A focus on Time to Value
- Agile Alliance » Daniel Vacanti, Bennet Vallet » Actionable Metrics at Siemens Health Services
- Seth’s Blog » Seth Godin » Numbers (and the magic of measuring the right thing)