Appendix: Example Key Value Measures
To encourage adaptability, EBM defines no specific Key Value Measures (KVMs). The KVMs listed below are presented to show the kinds of measures that might help an organization understand its current state, its desired future state, and the factors that influence its ability to improve.
Current Value (CV)
KVM | Measuring |
---|---|
Revenue per Employee | The ratio (gross revenue / # of employees) is a key competitive indicator within an industry. This varies significantly by industry. |
Product Cost Ratio | Total expenses and costs for the product(s)/system(s) being measured, including operational costs compared to revenue. |
Employee Satisfaction | Some form of sentiment analysis to help gauge employee engagement, energy, and enthusiasm. |
Customer Satisfaction | Some form of sentiment analysis to help gauge customer engagement and happiness with the product. |
Customer Usage Index | Measurement of usage, by feature, to help infer the degree to which customers find the product useful and whether actual usage matches expectations for how long users should spend with a feature. |
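The Customer Usage Index above can be computed from raw usage logs. A minimal sketch, assuming a hypothetical event log of `(user_id, feature)` pairs (EBM does not prescribe any particular log format):

```python
from collections import defaultdict

def customer_usage_index(usage_events, features):
    """Fraction of active users who used each feature in the period.

    usage_events: iterable of (user_id, feature) pairs -- an assumed
    log format, not prescribed by EBM.
    features: feature names to report on.
    """
    users_per_feature = defaultdict(set)
    all_users = set()
    for user, feature in usage_events:
        all_users.add(user)
        users_per_feature[feature].add(user)
    total = len(all_users)
    if total == 0:
        return {}
    return {f: len(users_per_feature[f]) / total for f in features}

events = [("u1", "search"), ("u2", "search"), ("u2", "export"), ("u3", "search")]
index = customer_usage_index(events, ["search", "export"])
# "search" was used by all three active users, "export" by one of three
```

Comparing the resulting per-feature ratios against expected usage is what turns the raw counts into a usage *index*.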
Unrealized Value (UV)
KVM | Measuring |
---|---|
Market Share | The relative percentage of the market not controlled by the product; the potential market share that the product might achieve if it better met customer needs. |
Customer or User Satisfaction Gap | The difference between a customer or user’s desired experience and their current experience. |
Desired Customer Experience or Satisfaction | A measure that indicates the experience that the customer would like to have. |
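The satisfaction-gap measure reduces to simple arithmetic once desired and current experience are scored on a common scale. A sketch, assuming both scores are averages from the same 1–10 survey (an assumption, not an EBM requirement):

```python
def satisfaction_gap(desired_score, current_score):
    """Unrealized Value proxy: how far the current experience falls short
    of the desired experience. Both scores are assumed to share a scale
    (e.g. 1-10 survey averages). A floor of zero keeps the gap from going
    negative when customers are happier than expected."""
    return max(desired_score - current_score, 0.0)

gap = satisfaction_gap(desired_score=9.0, current_score=6.5)  # 2.5
```

A large gap suggests significant Unrealized Value; a gap near zero suggests the product is already delivering close to the desired experience.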
Time-to-Market (T2M)
KVM | Measuring |
---|---|
Build and Integration Frequency | The number of integrated and tested builds per time period. For a team that is releasing frequently or continuously, this measure is superseded by actual release measures. |
Release Frequency | The number of releases per time period, e.g. continuously, daily, weekly, monthly, quarterly, etc. This helps reflect the time needed to satisfy the customer with new and competitive products. |
Release Stabilization Period | The time spent correcting product problems between the point the developers say the product is ready to release and the point where it is actually released to customers. This helps represent the impact of poor development practices and underlying design and code base problems. |
Mean Time to Repair | The average amount of time from when an error is detected to when it is fixed. This helps reveal the efficiency with which an organization fixes errors. |
Customer Cycle Time | The amount of time from when work starts on a release until the point where it is actually released. This measure helps reflect an organization’s ability to reach its customer. |
Lead Time | The amount of time from when an idea is proposed, or a hypothesis is formed, until a customer can benefit from that idea. This measure may vary based on customer and product. It is a contributing factor for customer satisfaction. |
Lead Time for Changes | The amount of time to go from code-committed to code successfully running in production. For more information, see the DORA 2019 report. |
Deployment Frequency | The number of times that the organization deployed (released) a new version of the product to customers/users. For more information, see the DORA 2019 report. |
Time to Restore Service | The amount of time between the start of a service outage and the restoration of full availability of the service. For more information, see the DORA 2019 report. |
Time-to-Learn | The total time needed to sketch an idea or improvement, build it, deliver it to users, and learn from their usage. |
Time to Remove Impediment | The average amount of time from when an impediment is raised until when it is resolved. It is a contributing factor to lead time and employee satisfaction. |
Time to Pivot | A measure of true business agility: the elapsed time between when an organization receives feedback or new information and when it responds to that feedback. For example, the time between learning that a competitor has delivered a new market-winning feature and responding with new capabilities that match or exceed it and measurably improve customer experience. |
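Several of the T2M measures above are arithmetic over timestamps. A sketch, assuming incidents and releases are recorded as `datetime` values (a hypothetical record layout, not something EBM specifies):

```python
from datetime import datetime, timedelta

def mean_time_to_restore(incidents):
    """Average outage duration. incidents is a list of
    (outage_start, service_restored) datetime pairs -- an assumed schema."""
    durations = [restored - start for start, restored in incidents]
    return sum(durations, timedelta()) / len(durations)

def release_frequency(release_dates, period):
    """Releases per period: release count divided by the number of periods
    spanned, where period is a timedelta (e.g. 7 days for weekly)."""
    span = max(release_dates) - min(release_dates)
    periods = max(span / period, 1.0)  # timedelta / timedelta -> float
    return len(release_dates) / periods

incidents = [
    (datetime(2024, 1, 1, 9), datetime(2024, 1, 1, 11)),   # 2 h outage
    (datetime(2024, 1, 5, 14), datetime(2024, 1, 5, 18)),  # 4 h outage
]
mttr = mean_time_to_restore(incidents)  # timedelta of 3 hours
```

The same pattern (difference of two timestamps, averaged or counted per period) applies to Customer Cycle Time, Lead Time for Changes, and Deployment Frequency; only the start and end events differ.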
Ability to Innovate (A2I)
KVM | Measuring |
---|---|
Innovation Rate | The percentage of effort or cost spent on new product capabilities, divided by total product effort or cost. This provides insight into the capacity of the organization to deliver new product capabilities. |
Defect Trends | Measurement of change in defects since last measurement. A defect is anything that reduces the value of the product to a customer, user, or to the organization itself. Defects are generally things that don’t work as intended. |
On-Product Index | The percentage of time teams spend working on product and value. |
Installed Version Index | The number of versions of a product that are currently being supported. This reflects the effort the organization spends supporting and maintaining older versions of software. |
Technical Debt | A concept in programming that reflects the extra development and testing work that arises when “quick and dirty” solutions result in later remediation. It creates an undesirable impact on the delivery of value and an avoidable increase in waste and risk. |
Production Incident Count | The number of times in a given period that the Development Team was interrupted to fix a problem in an installed product. The number and frequency of Production Incidents can help indicate the stability of the product. |
Active Product (Code) Branches | The number of different versions (or variants) of a product or service. Provides insight into the potential impact of change and the resulting complexity of work. |
Time Spent Merging Code Between Branches | The amount of time spent applying changes across different versions of a product or service. Provides insight into the potential impact of change and the resulting complexity of work. |
Time Spent Context-Switching | Examples include time lost to interruptions caused by meetings or calls, time spent switching between tasks, and time lost when team members are interrupted to help people outside the team. These measures can give simple insight into the magnitude of the problem. |
Change Failure Rate | The percentage of released product changes that result in degraded service and require remediation (e.g. hotfix, rollback, patch). For more information, see the DORA 2019 report. |
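Change Failure Rate is a simple ratio once each release carries a remediation flag. A sketch, assuming a hypothetical release record with a boolean `needed_remediation` field (not a schema defined by EBM or DORA):

```python
def change_failure_rate(releases):
    """Fraction of releases that degraded service and needed remediation
    (hotfix, rollback, patch). releases is a list of dicts with a boolean
    'needed_remediation' key -- an assumed schema."""
    if not releases:
        return 0.0
    failures = sum(1 for r in releases if r["needed_remediation"])
    return failures / len(releases)

releases = [
    {"version": "1.0", "needed_remediation": False},
    {"version": "1.1", "needed_remediation": True},   # required a hotfix
    {"version": "1.2", "needed_remediation": False},
    {"version": "1.3", "needed_remediation": False},
]
rate = change_failure_rate(releases)  # 0.25, i.e. 25%
```

Tracked over time alongside Deployment Frequency, this ratio shows whether releasing more often is coming at the cost of stability.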