It’s never been easy to quantify the value of an education, but if it were going to be done anywhere, you’d expect it to be done in the world’s companies, where the vast majority of decisions to spend money must be backed by the logic that the spending will produce a much larger increase in revenue or in cost savings.
Unfortunately, however, corporate learning and development (L&D) teams aren’t very good at it. As an analyst lamented in Training Magazine recently, “few organizations are collecting metrics that help link learning to organizational and individual performance. In fact, only 6% of companies are truly measuring all different types of learning with an eye on business results. … More than 25% of survey respondents said they are doing only basic measurement or essentially no measurement at all. Approximately half are at what we call the Standardized level, where an array of metrics is collected and basic reports are run, but without much analysis or ability to link learning results to performance.”
As the article goes on to say, the data, from a consultancy called the Brandon Hall Group, also show that companies are especially bad at measuring the effectiveness of informal learning, and “rely far too heavily on basic metrics such as completion rates and smile sheets.” The conclusion is that if L&D teams want to get better at showing the value of their work, they should stop trying to demonstrate a specific return on investment and focus more on how learning helps achieve business goals.
This final point is the most powerful one. Teams across the HR function will certainly benefit by moving from trying to prove that an x% increase in spending on anything HR-related will result in a y% boost in profits (the holy grail of any justification for spending on corporate functions) to showing how, for example, L&D activity has a direct effect on a specific business aim (such as “set up successful operations in Southeast Asia by the end of 2018”).
Spend Time on Interpreting Data, Not Money on Fancy Tech
Two-thirds of HR teams overall, not just those in L&D, perform only ad hoc reporting or descriptive analysis, according to CEB data, and many of them believe their next step should be investing in advanced data tools and technologies. However, these investments will yield minimal gains if teams are unable to translate all that data into suggestions and insights that line managers can act on.
The more forward-thinking HR teams ensure they have the right analytics capabilities and processes among their staff before investing heavily in analytics technology. They recognize that they can only reap the full value of technology when they have identified the right business problems and improved the team’s ability to use talent data to solve those problems.
Today, fewer than 20% of leaders believe these efforts to collect and analyze HR data (known as “talent analytics”) focus on the right business issues. To address this, one big retailer in CEB’s networks aligns its HR analytics efforts to key business questions first, rather than just to what talent data is available.
Line managers throughout the firm then rank the importance of 100 key human capital questions developed by the analytics team, which the team then uses to define the scope of its work. The analytics team identifies investments for its three-year roadmap by asking, “Do we have the data we need?” and, “Are we able to use the data once we have it?” By ensuring every talent analytics investment targets critical business needs and provides substantial ROI, the company increased line managers’ use of its workforce analytics site by 78%.
What this Means for L&D
Measuring the effectiveness of their services is a perennial and pressing challenge for L&D teams. One of the big challenges is that L&D staff lack the skills required to select and communicate the right metrics. For example, only 18% of line managers think L&D staff are good at communicating the impact of the work the function does. One of the key reasons for this is that, just like their colleagues elsewhere in HR, L&D staff commonly focus on measuring program activity as opposed to understanding the learning or business objectives of the programs.
Digging deeper into this subject at CLO, Sarah Fister Gale discusses what some teams are doing with the data and analysis now available to get the most out of their L&D technology:
“To avoid making costly technology mistakes, many customers are looking for better analytics and metrics to help them assess learning impact. … Customers don’t just want whiz-bang dashboards telling them who took what training, though that is a good first step, [Steve Paul, learning product manager for SilkRoad,] said. ‘What they really want is data that will give them true prescriptive analytics to understand where they need to make changes.’
“Axonify is attempting to address many of these challenges by offering clients’ employees daily three-minute learning nuggets, in which they answer questions while engaging in a gaming environment. Based on their responses, the system customizes content for the next day. The technology is based on learning theory showing that people are better able to retain information when they are introduced to it at random points over time, said Axonify CEO Carol Leaman. These kinds of tools address several key challenges for learning leaders — they provide learning in small chunks, make it easily accessible, and provide key metrics to demonstrate business results.”