Study: Most Companies’ Data Doesn’t Meet Basic Quality Standards

At the Harvard Business Review, Tadhg Nagle, Thomas C. Redman, and David Sammon present the findings of a study they conducted to assess the quality of data available to managers at 75 companies in Ireland. Using Redman’s Friday Afternoon Measurement method, they asked managers to collect critical data on the last 100 units of work conducted by their departments and mark them up, highlighting obvious errors and counting the number of error-free records to produce a data quality score. “Our analyses confirm,” they write, “that data is in far worse shape than most managers realize”:

  • On average, 47% of newly created data records have at least one critical (e.g., work-impacting) error. A full quarter of the scores in our sample are below 30% and half are below 57%. In today’s business world, work and data are inextricably tied to one another. No manager can claim that his area is functioning properly in the face of data quality issues. It is hard to see how businesses can survive, never mind thrive, under such conditions.
  • Only 3% of the DQ scores in our study can be rated “acceptable” using the loosest-possible standard. We often ask managers (both in these classes and in consulting engagements) how good their data needs to be. While a fine-grained answer depends on their uses of the data, how much an error costs them, and other company- and department-specific considerations, none has ever thought a score less than the “high nineties” acceptable. Less than 3% in our sample meet this standard. For the vast majority, the problem is severe.
  • The variation in DQ scores is enormous. Individual tallies range from 0% to 99%. Our deeper analyses (to see if, for instance, specific industries are better or worse) have yielded no meaningful insights. Thus, no sector, government agency, or department is immune to the ravages of extremely poor data quality.
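The scoring arithmetic behind the method described above is simple enough to sketch in a few lines. This is an illustrative sketch only: the record fields (`errors_marked`) and the error-flagging rule are hypothetical assumptions, not part of Redman's Friday Afternoon Measurement itself, which leaves it to managers to mark up their own critical data.

```python
def fam_score(records, has_critical_error):
    """Return the data quality score: the number of error-free records
    among the last 100 units of work. Because the sample size is 100,
    the count doubles as a percentage (e.g., 57 error-free -> DQ score 57%)."""
    sample = records[-100:]  # the last 100 units of work
    return sum(1 for r in sample if not has_critical_error(r))

# Hypothetical example: a department whose managers marked 47 of the
# last 100 records as containing at least one critical error.
records = [{"errors_marked": 0}] * 53 + [{"errors_marked": 2}] * 47
score = fam_score(records, lambda r: r["errors_marked"] > 0)
print(score)  # 53, i.e. a DQ score of 53%
```

A score of 53 here would sit near the study's median of 57%, far below the "high nineties" that managers themselves consider acceptable.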

The data quality challenge should sound familiar to HR leaders attempting to implement talent analytics strategies.

Our research at CEB, now Gartner, has pointed to data quality as one of the main barriers to developing an effective analytics program. It also suggests that the most important thing HR can do to improve the quality of the data it collects is to build better relationships within the function and throughout the organization, an approach more than twice as effective as buying new technology. In a peer benchmarking session at our ReimagineHR conference in London last week, HR leaders identified data quality as a key pain point for their analytics programs, one that was holding them back from aligning analytics to critical business questions.

CEB Corporate Leadership Council members can review our recent webinar to learn about the most effective strategies for improving talent analytics. Also, in our latest issue of Talent Analytics Quarterly, research analyst Fiona Lam looks at how TE Connectivity addressed the data quality challenge by standardizing all data definitions and processes across the business and prioritizing data based on business impact, among other tactics. Members can read Lam’s article here.