The digital transformation of learning and development offers HR leaders new opportunities to embed learning within their talent strategies and make the business case for L&D investments crystal clear. Part of the promise of digital learning comes with the application of data and analytics, enabling organizations to measure and communicate the impact of these programs more precisely than ever before. Unfortunately, as with all new technologies, the rapid emergence of new options can be overwhelming, not every solution is right for every business, and adopting a technology without a clear understanding of how it will generate value can be a very expensive mistake.
To survey this new landscape of learning analytics, Justin Taylor, Director, Talent Solutions at Gartner, moderated a panel discussion at our ReimagineHR conference in Orlando on Monday, bringing together Patti Phillips, Ph.D., President and CEO of the ROI Institute; Dave Vance, Ph.D., Executive Director of the Center for Talent Reporting; and Kimo Kippen, a former Chief Learning Officer at Hilton. The conversation covered the range of new technologies emerging in this space, the opportunities they provide, and the challenge of figuring out how to take advantage of those opportunities.
When considering an investment in learning analytics, the L&D function should keep a few strategic considerations in mind. Based on Monday’s discussion, here are a few of the key questions leaders should ask themselves:
What is your objective?
There are a number of technologies currently on the market that apply analytics to L&D in different ways and to different ends. There’s adaptive testing, in which training modules and skill assessments automatically adapt to each individual’s level of ability. Learning record stores and xAPI record and track learning experience data, allowing organizations to follow employees’ learning progress more closely and draw more insights from that data. Learning experience platforms offer new ways of delivering learning to employees on an individualized, self-directed basis. Natural language processing, machine learning, and augmented and virtual reality are also finding applications in learning.
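To make the “learning experience data” idea concrete, here is a minimal sketch of the kind of record the xAPI standard defines, which a learning record store collects. The actor/verb/object structure comes from the xAPI specification; the learner, course name, and URLs below are hypothetical examples, not part of any real system.

```python
import json

# An illustrative xAPI "statement": who did what, to which learning
# object. A learning record store (LRS) accumulates these records,
# which is what enables the closer tracking the panel described.
statement = {
    "actor": {
        "name": "Example Learner",               # hypothetical learner
        "mbox": "mailto:learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/courses/data-literacy-101",  # hypothetical course
        "definition": {"name": {"en-US": "Data Literacy 101"}},
    },
}

# Applications send statements like this to an LRS as JSON over HTTP.
print(json.dumps(statement, indent=2))
```

Because every statement shares this shape, an analytics tool can aggregate records across courses and platforms rather than being locked to one vendor’s reporting format.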
With all these options out there, the panelists agreed, it’s important for an organization to identify just what it hopes to get out of learning analytics before buying a new piece of enterprise technology. Don’t chase a shiny toy, Kippen advised, but ask what the business objective is and whether the investment is worth it. You might find that the extra dollar is better spent on fundamentals, Vance added, as new technology won’t fix underlying problems in your L&D program. “Without algebra,” he analogized, “you’re not ready for the calculus.”
With any learning solution, it’s essential to have a plan for demonstrating its impact, Phillips emphasized. These high-tech tools can do many things, but what they don’t do is show cause and effect to prove the importance of learning in general. Thoughtful, research-based design and implementation make a difference in winning support for learning analytics initiatives and ensuring their success. This can be challenging, Vance noted, because personalized learning technologies are still new and businesses have not yet developed the measures to realize their full potential. Individual data can help illustrate the impact, he added, by showing how employees are using learning and how it is helping them improve in their day-to-day jobs.
How will you communicate the business case?
Rightly or wrongly, CEOs and CFOs are primed to be skeptical of the value of learning. Top-level executives who measure everything against the bottom line can be tough customers for investments that lack a clear, direct return. Therefore, the panelists stressed, it’s important to make the argument for learning analytics to these executives in a language they understand. That advice can be applied as literally and tactically as using words business leaders like, Kippen noted, and avoiding training-speak, which often fails to resonate. Knowing the business and what it needs is essential, so that the learning analytics initiative can be designed and framed around those needs.
The challenge of the business case goes back to our first question about objectives. Organizations don’t need more training or new technology, Phillips said; what they need are improvements in output, quality, cost, and time. Just because you can implement a new learning technology solution doesn’t mean you should. Think about what measures matter to your organization, she recommended, then determine which of these measures can be improved by changes in behavior and what people need to learn to make those changes. The business goal should be part of the process from the very beginning, Vance stressed; designing objectives around these measures will help to establish a solid foundation of credibility. So will partnering with other parts of the business to design the initiative around their real needs and capture their support.
This commitment to credibility and business results should remain firm as your learning analytics program gets under way. Learning leaders should have good answers for what they are doing, how they are doing it, and why they are doing it that way, Phillips added. This means having clear standards of success, judging outcomes rigorously against these metrics, and making sure you really understand what your data is telling you, she emphasized. “We never want to put data out without knowing what we’re talking about.”
How will it help learners?
Just as L&D needs to ask how an investment serves the business, it must also consider how it will serve employees. A learning program or platform has no value unless employees actually use it and benefit from it. User experience is a key success factor, Kippen remarked: Employees still spend too much of their workday searching for information they need to do their jobs, so a good, broad objective for any learning investment is to make sure employees can get the answers they need in the right place and time. To make adoption successful, solutions should be easy to use, both for the L&D function and for learners themselves, Phillips added.
Thinking about the learner experience can help in deciding which solutions to pursue first. Of the learning analytics technologies listed above, adaptive testing has the most immediate payoff, Phillips noted, as it can help ensure you have the right people in the right learning programs, providing training for people who really need it, while clearing the way for others to move forward in their career paths. As our research at Gartner shows, learning can be a useful tool for employee engagement and retention when employees see opportunities to grow and advance in your organization.