SHRM’s Roy Maurer recently highlighted a survey from KPMG showing that corporate leaders around the world remain distrustful of their organizations’ data and analytics when it comes to using these tools to make business decisions:
In the survey of 2,190 senior executives from Australia, Brazil, China, France, Germany, India, South Africa, the U.K. and the U.S., just 35 percent said they have a high level of trust in their organization’s use of data and analytics. Another 40 percent said they have reservations about relying on the data and analytics their organizations produce, and 25 percent admitted they have either limited trust in, or active distrust of, their data and analytics. Nearly all respondents (92 percent) worry about the impact flawed data could have on their company’s business and reputation.
“Executives and managers are being asked to make major decisions based on the output of an algorithm that they didn’t create and don’t always fully understand,” said Thomas Erwin, global head of KPMG International’s Lighthouse, the firm’s center of excellence for data, analytics and intelligent automation. “As a decision-maker, you really need to have confidence that the insights you are getting are reliable and accurate, but many of these executives can’t even be sure if their models are of sufficient quality to be trusted. It’s an uncomfortable situation for any decision-maker to be in.”
One barrier to the credibility of analytics for business leaders is the prevalence of incomplete data; another is that the metrics against which organizations are measuring are often ill-defined. HR metrics like source of hire and quality of hire are particularly hard to measure accurately, Kevin Wheeler, founder and president of the Future of Talent Institute, tells Maurer, and there is significant disagreement on how best to define them.
At the Harvard Business Review, Tadhg Nagle, Thomas C. Redman, and David Sammon present the findings of a study they conducted to assess the quality of data available to managers at 75 companies in Ireland. Using Redman’s Friday Afternoon Measurement method, they asked managers to collect critical data on the last 100 units of work conducted by their departments and mark them up, highlighting obvious errors and counting the number of error-free records to produce a data quality score. “Our analyses confirm,” they write, “that data is in far worse shape than most managers realize”:
- On average, 47% of newly-created data records have at least one critical (e.g., work-impacting) error. A full quarter of the scores in our sample are below 30% and half are below 57%. In today’s business world, work and data are inextricably tied to one another. No manager can claim that his area is functioning properly in the face of data quality issues. It is hard to see how businesses can survive, never mind thrive, under such conditions.
- Only 3% of the DQ scores in our study can be rated “acceptable” using the loosest-possible standard. We often ask managers (both in these classes and in consulting engagements) how good their data needs to be. While a fine-grained answer depends on their uses of the data, how much an error costs them, and other company- and department-specific considerations, none has ever thought a score less than the “high nineties” acceptable. Less than 3% in our sample meet this standard. For the vast majority, the problem is severe.
- The variation in DQ scores is enormous. Individual tallies range from 0% to 99%. Our deeper analyses (to see if, for instance, specific industries are better or worse) have yielded no meaningful insights. Thus, no sector, government agency, or department is immune to the ravages of extremely poor data quality.
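The Friday Afternoon Measurement scoring described above reduces to a simple calculation: count the error-free records among the last 100 units of work and express that count as a percentage. The sketch below illustrates this, assuming records are plain dictionaries and that an empty or missing required field counts as a critical error; the field names and the error rule are hypothetical, not part of Redman’s method, which leaves the definition of “critical error” to the managers marking up their own data.

```python
def dq_score(records, has_critical_error):
    """Return the data quality score: the fraction of records
    with no critical errors, as a percentage (0-100)."""
    error_free = sum(1 for r in records if not has_critical_error(r))
    return 100 * error_free / len(records)

# Hypothetical error rule: a record is flawed if any required
# field is missing or empty. Real teams would substitute their
# own definition of a work-impacting error.
REQUIRED_FIELDS = ("candidate_id", "source_of_hire", "start_date")

def missing_required(record):
    return any(not record.get(field) for field in REQUIRED_FIELDS)

sample = [
    {"candidate_id": "C-001", "source_of_hire": "referral",
     "start_date": "2017-03-01"},
    {"candidate_id": "C-002", "source_of_hire": "",
     "start_date": "2017-03-02"},
]
print(dq_score(sample, missing_required))  # 50.0 for this toy sample
```

In practice the marking-up step is done by hand on a spreadsheet of the last 100 records, which is what makes the method fast enough to run on a Friday afternoon; the code simply formalizes the arithmetic.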
The data quality challenge should sound familiar to HR leaders attempting to implement talent analytics strategies.