Three measures for capturing the value of data-driven decisions

Here's advice on how to track not just how much the analytics team's data and insights get used, but the value the analytics bring to the entire organization.

Data-driven decision-making is all about action. It's about looking at information, identifying drivers and trends, and recommending actions. The goal is to get the information into the hands of the people who can act on it. This strategy requires fostering a culture that enables and encourages data-driven decisions.

Yet organizations continue to struggle to establish performance measures and demonstrate the ROI of their analytics programs.

Last year, my organization, APQC, conducted a research study to understand the steps required to establish and maintain a data-driven culture, including the measures used to track and monitor success.

To demonstrate value and monitor success, analytics programs have to show the statistically significant, incremental value of analytics and data-driven decisions through measures that matter to the organization. But many organizations struggle to identify the right measures to tie the outcomes of analytics projects to other measures of success.

It's crucial to use measures that decision-makers find important -- the ones that indicate the impact and value of analytics on the organization as a whole.

Some organizations compare month-over-month or season-over-season results from before decisions were supported by analytics with results afterward -- ultimately demonstrating that data-driven decisions lead to better outcomes over time.

Best-practice organizations typically use a combination of measures in three primary categories: behavioral change, analytics performance and business performance.

Behavioral change measures

Because adopting analytics and shifting to a data-driven culture is about changing the norms and behaviors of people, organizations should include behavioral change measures in the mix. Such measures help monitor the adoption rates of new norms and practices. More explicitly, they also help the organization monitor the use of analytics to support data-driven decisions. Relevant measures include:

  • Action items, including the number and types of actions taken based on the analytics. This information helps outline the value of analytics and the ability of decision-makers to act upon them.
  • Utilization or consumption, to track the use or download of analytics outputs, either through self-service dashboards or repositories.
  • Number of service requests, both repeat and new requests for projects or analytics. Repeat business suggests there's value in the insights provided for decision-making, while new requests indicate a growing awareness of analytics' value to the organization.
  • Number of employees requesting training, which tracks the requests and outcomes of formal and informal analytics training. The information is a gauge of the adoption of analytics skills throughout the organization.

Some organizations measure the consumption and use of their analytics output, which is crucial to understanding which measures decision-makers find useful and which are rarely, if ever, used. Such information not only helps the analytics team track success, but it also helps them refine their measures, analysis and output, and manage how those things are shared with business leaders.
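As an illustration, a consumption tracker can be as simple as counting events per analytics asset and per user. Here's a minimal sketch in Python; the event log, asset names and user names are all hypothetical, and a real program would pull this data from its BI platform's usage logs.

```python
from collections import Counter

# Hypothetical event log: one record per consumption event for an
# analytics asset (a dashboard view, a report download and so on).
events = [
    {"asset": "churn_dashboard", "user": "alice"},
    {"asset": "churn_dashboard", "user": "bob"},
    {"asset": "pricing_report",  "user": "alice"},
    {"asset": "churn_dashboard", "user": "alice"},
]

# Utilization: total consumption events per asset.
views = Counter(e["asset"] for e in events)

# Reach: distinct users per asset -- a rough signal of which outputs
# decision-makers actually find useful and which are rarely touched.
reach = {a: len({e["user"] for e in events if e["asset"] == a}) for a in views}

for asset in views:
    print(f"{asset}: {views[asset]} views, {reach[asset]} distinct users")
```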

For example, IBM's social analytics team measures success by the number of requests for analytics to drive business insights, and by whether business leaders are asking for more analytics and taking action on that information. The team has expanded its support to additional functions across IBM.

When the company rolled out a web-based collaborative email system, called IBM Verse, the CIO organization wanted to understand how well the change was going from a sentiment point of view. The social analytics team analyzed which topics IBM employees were talking about in relation to the rollout.

Consumption alone does not indicate ingrained behavior. Organizations must also track the application of their analytics outputs.

For example, Johnson Controls' measures of success focus on whether the team is having an impact (i.e., the degree to which people are able to take action based on its insights). So the company keeps track of the specific actions that people take as a result of the analytics team's analysis. It doesn't keep track of everything the analytics team does; instead, it tracks what internal customers are doing as a result of the team's guidance.

Analytics performance measures

Analytics performance measures, by contrast, focus on the efficacy of the program itself and gauge how well it is accomplishing its goals. Measures typically include:

  • Prediction or model accuracy: how closely the predicted change in activity X matched the observed change after the organization took action Y.
  • A/B comparatives, which compare results from an experiment against a control. The organization uses insights from predictive models for a segment or smaller project (A) and its old approach for the rest of the processes or organization (B), then compares the outcomes. (A minimal sketch of such a comparison follows this list.)
  • Cost/benefit analysis: measuring the revenue or cost savings of the projects compared to the resource investment.
  • Stakeholder satisfaction, the most qualitative measure, which helps people understand how valuable the analytics contributions are to decision-makers. It will also typically identify areas for improvement or strengths on which to focus.
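Here's a minimal sketch of an A/B comparative in Python, using a Welch's t-test from SciPy to check whether the lift is statistically significant. The conversion rates and segment setup are entirely hypothetical.

```python
from scipy import stats

# Hypothetical weekly conversion rates (%) for two comparable segments:
# A followed the predictive model's recommendations; B kept the old approach.
a = [4.1, 4.6, 4.4, 4.9, 4.7, 4.5]  # analytics-guided segment
b = [3.8, 4.0, 3.9, 4.1, 3.7, 4.0]  # control segment

lift = sum(a) / len(a) - sum(b) / len(b)

# Welch's t-test: is the difference between the segments likely to be
# more than noise? A low p-value supports the "statistically significant
# incremental value" that analytics programs must demonstrate.
t_stat, p_value = stats.ttest_ind(a, b, equal_var=False)

print(f"lift: {lift:.2f} percentage points (p = {p_value:.4f})")
```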

For many organizations, a cost/benefit analysis or ROI calculation is vital to proving the efficacy of their analytics programs.

For its initial retention project, the analytics team at SAS, a software vendor, explored how much it would cost the company if employees were to leave. Then, it looked at the amount of time spent both developing the project and implementing its results. Finally, it calculated how much money SAS would save compared to its investment if the predictive analysis could help ensure that a quarter of the at-risk population wouldn't leave.
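With entirely hypothetical figures -- the article doesn't disclose SAS's actual numbers -- the arithmetic behind that kind of retention calculation might look like this:

```python
# Hypothetical inputs for a retention cost/benefit calculation.
at_risk_employees  = 80       # employees the model flags as likely to leave
cost_per_departure = 75_000   # replacement and lost-productivity cost ($)
project_investment = 500_000  # development plus implementation cost ($)

# Savings if the predictive analysis helps retain a quarter of the
# at-risk population, as in the approach described above.
retained = at_risk_employees * 0.25
savings = retained * cost_per_departure

roi = (savings - project_investment) / project_investment
print(f"savings: ${savings:,.0f}, ROI: {roi:.0%}")  # savings: $1,500,000, ROI: 200%
```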

In addition to cost/benefit analysis, organizations also often measure the accuracy of their models' predictions. At SAS, for example, if the model predicted that new hires would become productive within six months once the organization took certain actions, the company could chart how many people actually became productive within six months.
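A minimal sketch of that kind of accuracy check, with hypothetical predicted and observed outcomes:

```python
# 1 = productive within six months, 0 = not. `predicted` is what the
# model forecast for each new hire; `actual` is what was later observed.
predicted = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0]
actual    = [1, 0, 0, 1, 1, 0, 1, 1, 0, 0]

correct = sum(p == a for p, a in zip(predicted, actual))
accuracy = correct / len(actual)
print(f"prediction accuracy: {accuracy:.0%}")  # 80%
```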

Business performance measures

As noted earlier, it is important to use measures that decision-makers find important -- typically, business performance measures such as revenue, cost, customer retention or cycle time. This way, you measure the difference in a manner that demonstrates the value of analytics.

Often, business performance results are linked to specific analytics projects, and organizations can then roll up the performance improvements, increased revenue or retention across projects to show the overall gain.
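The roll-up itself is simple arithmetic; a sketch with hypothetical per-project figures:

```python
# Hypothetical gains attributed to individual analytics projects ($/year).
projects = {
    "churn model": 400_000,
    "pricing analysis": 150_000,
    "demand forecast": 275_000,
}

overall_gain = sum(projects.values())
print(f"program-level gain: ${overall_gain:,.0f}")  # $825,000
```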

For example, laptop manufacturer Lenovo was able to use HR's sales compensation insights to establish correlations between certain metrics of the sales organization and individual salespeople, including base pay and engagement levels. Not only did this information provide specific business performance results, such as revenue, but it also helped shift the sales function toward data-driven decision-making.

Key lessons for data-driven decision-making

As organizations continue on their analytics journeys and shift to data-driven cultures, they should consider the following measurement practices:

  • Track macro measures to assess the change, and use bottom-up measurements (of individuals or projects) to pinpoint the root causes of any issues.
  • Use a mix of behavioral and performance measures to track success. This helps monitor and improve the efficacy of the program and ensure adoption of new behaviors.
  • Actively communicate measures of success with leadership to reinforce the value of analytics and data-driven decision-making.
  • Stay cognizant of the organization's goals, and be prepared to adjust measures of success as the organization matures.

About the author:
Holly Lyke-Ho-Gland is the program manager for process and performance management research at the American Productivity and Quality Center (APQC), a Houston-based nonprofit that provides expertise on business benchmarking and best practices.

