SHRM’s Roy Maurer recently highlighted a survey from KPMG showing that corporate leaders around the world remain distrustful of their organizations’ data and analytics when it comes to using these tools to make business decisions:
In the survey of 2,190 senior executives from Australia, Brazil, China, France, Germany, India, South Africa, the U.K. and the U.S., just 35 percent said they have a high level of trust in their organization’s use of data and analytics. Another 40 percent said they had reservations about relying on the data and analytics they produce, and 25 percent admitted they have either limited trust or active distrust in their data and analytics. Nearly all respondents (92 percent) worry about the impact flawed data could have on their company’s business and reputation.
“Executives and managers are being asked to make major decisions based on the output of an algorithm that they didn’t create and don’t always fully understand,” said Thomas Erwin, global head of KPMG International’s Lighthouse, the firm’s center of excellence for data, analytics and intelligent automation. “As a decision-maker, you really need to have confidence that the insights you are getting are reliable and accurate, but many of these executives can’t even be sure if their models are of sufficient quality to be trusted. It’s an uncomfortable situation for any decision-maker to be in.”
One barrier to the credibility of analytics for business leaders is the prevalence of incomplete data; another is that the metrics organizations measure against are often ill-defined. HR metrics like source of hire and quality of hire are particularly hard to measure accurately, Kevin Wheeler, founder and president of the Future of Talent Institute, tells Maurer, and there is significant disagreement on how best to define them.
Data scientists are among the most in-demand professionals in the US right now, as more and more industries look to harness the power of data to drive productivity and innovation to new heights. Demand is high and supply is short, so these experts command remarkably high salaries. However, recent research has suggested that many companies’ data isn’t sufficiently high-quality to produce the kinds of insights managers are expecting. This is particularly true in the emerging field of talent analytics, where companies are making major investments but most aren’t seeing them pay off.
In addition to the data quality challenge, Data Quality Solutions President Thomas C. Redman suggested in a recent Harvard Business Review article that senior managers at many organizations are mismanaging their data scientists: placing them in the wrong part of the organization, not focusing the data science program on business outcomes, and not facilitating a transition to a more data-driven culture. He offers some suggestions for how companies can get more out of their data scientists:
First, think through how you want data scientists to contribute, and put them in spots where they can do so. The worst mistake a company can make is to hire a cadre of smart data scientists, provide them with access to the data, and turn them loose, expecting them to come up with something brilliant. Lacking focus and support, most fail. Instead, clearly define the opportunities you want to address using data science, and put your data scientists in places in the organization where they can best pursue those opportunities. …
Gartner is projecting worldwide IT spending to reach $3.7 trillion this year, a 4.5 percent increase from 2017, with enterprise software expected to be the fastest-growing component of IT spend, growing by 9.5 percent from $355 billion last year to $389 billion in 2018. HR technologies are among the leading drivers of innovation in this space, with significant spending forecast on software-as-a-service solutions in financial management systems (FMS), human capital management (HCM), and analytic applications. Big data, algorithms, machine learning, and AI are among the technologies expected to drive growth in IT investments in the coming years.
(For readers who want to hear more about our IT spending forecast, Gartner analysts discuss these findings in detail in a complimentary webinar, available on demand here.)
For talent management leaders, this information carries significant implications. In the coming years, technology will inevitably be more embedded into the HR function: The only choice for leaders is whether they want to be on the front or back end of the adoption curve. Technology in the HR realm is advancing at a rapid rate, but the function seems consistently hesitant to take advantage of the opportunities and efficiencies it offers. A wide range of tools are newly available or in development that can help solve perennial HR challenges such as candidate vetting, employee wellness, space management, analytics strategy, recruiting and retaining diverse employees, understanding drivers of high performance, making learning more accessible, or offering digital assistants for all employees.
At the Harvard Business Review, Tadhg Nagle, Thomas C. Redman, and David Sammon present the findings of a study they conducted to assess the quality of data available to managers at 75 companies in Ireland. Using Redman’s Friday Afternoon Measurement method, they asked managers to collect critical data on the last 100 units of work conducted by their departments and mark them up, highlighting obvious errors and counting the number of error-free records to produce a data quality score. “Our analyses confirm,” they write, “that data is in far worse shape than most managers realize”:
- On average, 47% of newly-created data records have at least one critical (e.g., work-impacting) error. A full quarter of the scores in our sample are below 30% and half are below 57%. In today’s business world, work and data are inextricably tied to one another. No manager can claim that his area is functioning properly in the face of data quality issues. It is hard to see how businesses can survive, never mind thrive, under such conditions.
- Only 3% of the DQ scores in our study can be rated “acceptable” using the loosest-possible standard. We often ask managers (both in these classes and in consulting engagements) how good their data needs to be. While a fine-grained answer depends on their uses of the data, how much an error costs them, and other company- and department-specific considerations, none has ever thought a score less than the “high nineties” acceptable. Less than 3% in our sample meet this standard. For the vast majority, the problem is severe.
- The variation in DQ scores is enormous. Individual tallies range from 0% to 99%. Our deeper analyses (to see if, for instance, specific industries are better or worse) have yielded no meaningful insights. Thus, no sector, government agency, or department is immune to the ravages of extremely poor data quality.
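The arithmetic behind the Friday Afternoon Measurement scores quoted above is straightforward; a minimal sketch (the function name and record representation are illustrative, not part of Redman’s published method):

```python
# Sketch of the Friday Afternoon Measurement arithmetic: managers mark up
# the last 100 work records, flagging obvious errors in each, and the
# data quality (DQ) score is the share of records with no flagged errors.

def dq_score(records):
    """records: list of sets, each holding the fields flagged as erroneous
    in one work record (an empty set means the record is error-free).
    Returns the DQ score as a percentage of error-free records."""
    if not records:
        raise ValueError("no records to score")
    error_free = sum(1 for flagged in records if not flagged)
    return 100.0 * error_free / len(records)

# Example matching the study's average: 100 records, 47 of which have
# at least one critical error (here an illustrative "ship_date" field)
sample = [set() for _ in range(53)] + [{"ship_date"} for _ in range(47)]
print(dq_score(sample))  # → 53.0
```

A score in the “high nineties” — the threshold none of the surveyed managers was willing to go below — would require 97 or more of the 100 records to come back with no flags at all.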
The data quality challenge should sound familiar to HR leaders attempting to implement talent analytics strategies.
In a breakout session at the ReimagineHR conference hosted by CEB (now Gartner) in London today, a group of several dozen HR leaders came together for a peer benchmarking session to compare notes and discuss common challenges in the field of talent analytics. The attendees at Wednesday’s session held a variety of roles, including CHROs, heads of employee experience, HR business partners, and other leadership positions within the HR function: Just as in our peer benchmarking session last year, very few identified themselves by title as heads of talent analytics. The diversity of titles and roles in the room illustrates both the breadth of the impact talent analytics is having on the HR function and the fact that many organizations do not have a dedicated talent analytics team.
The discussion centered on several key themes in the sphere of talent analytics and the challenges attendees were facing at their organizations in bringing data analysis to bear on their talent strategies. Enabling the use of talent analytics, making the function more strategic, building analytic capability, and improving data quality were all areas of concern. These are some of the key challenges that came up in Wednesday’s discussion:
Aligning Talent Analytics to Critical Business Questions
Asked where they were primarily focusing their efforts to drive action, a plurality of attendees identified aligning talent analytics to critical business questions as their main focus. Some attendees noted that they are gathering robust data but were still struggling to translate that data into actionable insights to solve business problems. Attendees at last year’s session shared the same frustration. To some extent, the degree to which data can be leveraged is a matter of the analytics function’s maturity. One component of solving this problem is ensuring that the data is “clean,” accurate, and helpful in making decisions: As one HR leader remarked, she is often presented with the data that is easiest to gather rather than the data that is most useful.
Benchmarking surveys can be a useful tool to understand how your organization compares to its peers across a variety of metrics, including talent metrics. However, Scott Mondore writes at Talent Economy, some organizations become over-reliant on benchmarks in defining their talent strategies, which “takes away from the value of the metric as a strategic tool.” Rather than chasing potentially arbitrary benchmarks, Mondore, co-founder and managing partner of the human capital analytics advisory firm Strategic Management Decisions, argues that talent leaders should use their data and analytics capabilities to figure out what talent metrics really matter to the organization’s performance:
Consider that a benchmark is just an average. Thus, the pursuit of outperforming a benchmark is simply a chase to be better than average against a number that may not reflect a true reality — it just reflects your particular vendor’s database. Benchmarks are also subjective. They’re a number that can change when, for instance, a vendor surveys more clients or you switch vendors. If the target is arbitrary and highly fluctuating, why spend time and money aiming for it? Shouldn’t leaders spend time and money focusing on improving metrics that have proven connections to building their business, and not just trying to outscore the average organization?
In June, analytics startup hiQ Labs filed a lawsuit against LinkedIn after the social networking giant blocked hiQ from scraping publicly available data from LinkedIn’s site and accused hiQ of a terms-of-service violation. On Monday, US District Judge Edward Chen granted a preliminary injunction ordering LinkedIn to restore hiQ’s access to its data and stated that LinkedIn’s motivations appear to be anti-competitive rather than focused on protecting the privacy of its users’ data.
“HiQ has raised serious questions as to whether LinkedIn, in blocking HiQ’s access to public data, possibly as a means of limiting competition, violates state law,” Chen wrote, per Thomas Lee of the San Francisco Chronicle. “LinkedIn’s professed privacy concerns are somewhat undermined by the fact that LinkedIn allows other third parties to access user data without its members’ knowledge or consent.”
HiQ’s software helps companies identify employees who are attrition risks by determining how recently and how often they have been updating their LinkedIn profiles. In the litigation, LinkedIn argued that hiQ’s use of its public user data violated both the Computer Fraud and Abuse Act (CFAA) and the Digital Millennium Copyright Act, but the judge was not convinced.
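hiQ’s actual model is proprietary; purely as a hypothetical illustration of the approach the article describes (combining recency and frequency of public profile updates into a risk signal), a sketch might look like this, with all names, weights, and thresholds invented for the example:

```python
# Hypothetical sketch only: hiQ's real scoring model is not public.
# Combines how recently and how often a public profile was updated
# into a rough 0-1 attrition-risk signal.
from datetime import date

def attrition_risk(update_dates, today, window_days=180):
    """update_dates: dates of observed profile updates.
    Returns a 0-1 score: a more recent last update, and more updates
    inside the recent window, both push the score higher."""
    if not update_dates:
        return 0.0
    days_since_last = min((today - d).days for d in update_dates)
    recency = max(0.0, 1.0 - days_since_last / window_days)
    recent = [d for d in update_dates if (today - d).days <= window_days]
    frequency = min(1.0, len(recent) / 5.0)  # illustrative cap: 5 updates/window
    return 0.5 * recency + 0.5 * frequency   # illustrative equal weighting

# Three updates in the past six weeks yields a fairly high score
updates = [date(2017, 7, 1), date(2017, 7, 20), date(2017, 8, 5)]
print(attrition_risk(updates, today=date(2017, 8, 14)))
```

Whatever the real model looks like, the key point in the case is that its only input is data the users themselves have made public.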
“LinkedIn has presented little evidence of users’ actual privacy expectation; out of its hundreds of millions of users, including 50 million using Do Not Broadcast, LinkedIn has only identified three individual complaints specifically raising concerns about data privacy related to third-party data collection,” Chen’s order reads.
Judge Chen gave LinkedIn 24 hours to remove any technology it was using to prevent hiQ from scraping its data. LinkedIn, now owned by Microsoft after a $26.2 billion acquisition last year, says it plans to challenge the decision.