CEB Blogs

How to Use Machine Learning to Eliminate Human Biases

Machine learning can not only speed up recruiting, it can also find excellent candidates who would often be missed by human recruiters; as with all technology, however, the trick is providing the right human oversight

Throughout all the uncertain economic news of the past few years, one trend has stayed constant: it’s increasingly hard to find and attract good candidates for many of the roles that global companies require.

This competition for the right people has pushed HR and business-line leaders alike to be more discerning in their candidate selection, and to find new approaches to assessing an applicant’s likelihood of success. And the recent use of algorithmic assessments and machine learning has helped recruiters screen large volumes of applicants.

But, as with any technology, it still requires human oversight. When using these assessment methods, managers should still ask themselves the following questions:

  1. How can I link the appealing traits in a candidate’s application to specific business outcomes (such as, “help improve our customer service processes”)?
  2. Which of these outcomes should I focus on?
  3. Can I make these predictions in an unbiased way that does not harm the recruiting process?

What Makes Algorithms Useful

Some business commentators have questioned how useful this approach to recruiting is. A recent Fortune article argues that because humans create these models, the algorithms themselves will be inherently biased.

However, the article fails to recognize that the real villain of poor outcomes is the methodology behind the algorithm creation, not the use of algorithms themselves. In truth, proper use of algorithms can be the hero of the story, removing existing, human-introduced biases.

The article claims that if the role of the “ideal algorithm” is to find the candidate that human recruiters wanted, then it would stand to reason that, if the human process were biased, the algorithm would perpetuate that bias. But, in reality, the ideal algorithm will predict an objective “post-hire outcome,” such as reducing the time taken on customer service calls without reducing customer satisfaction.

This removes the potential for mimicking human bias in hiring and — just as critically — allows the technology to improve the quality of the labor force. What gets you hired is not always what makes you good at your job, but algorithms (unlike people) are good at finding the difference between the two. For example, a recent model created for a call center representative role revealed that call center experience was actually predictive of poor performance. This is counter-intuitive, but the platform identified the pattern, whereas a human recruiter would likely never think to test it.
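To make the idea concrete, here is a minimal sketch of testing whether an application attribute actually predicts an objective post-hire outcome, rather than mimicking past hiring decisions. All names and numbers below are invented for illustration; a real platform would use far richer data and models.

```python
# Hypothetical sketch: check whether an application attribute predicts
# an objective post-hire outcome. All data below is invented.

def mean(xs):
    return sum(xs) / len(xs)

def correlation(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Each record: (years of prior call-center experience,
# post-hire performance score). Invented numbers.
hires = [(5, 52), (4, 55), (6, 50), (0, 70),
         (1, 68), (2, 63), (0, 72), (3, 58)]

experience = [h[0] for h in hires]
performance = [h[1] for h in hires]

r = correlation(experience, performance)
print(f"experience vs. performance: r = {r:.2f}")
```

In this made-up sample the correlation comes out strongly negative, mirroring the counter-intuitive finding above: prior call-center experience predicting poorer performance — exactly the kind of relationship a human screener would never think to test.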

And Why You Need ‘Back Tests’

Using technology to predict which candidates are most likely to help a team achieve a business goal is easier said than done: it means predicting a post-hire outcome – such as a sustainable reduction in customer call-handling times – from nothing more than candidates’ resumés and other application information. Machine learning techniques can help with this problem, and this will be explored in future posts.

However, even if recruiting teams use algorithms that focus on a post-hire outcome rather than a desired candidate trait, it’s still possible to make mistakes. For example, if a company favored computer science degrees as highly predictive of success, it might unintentionally turn away a disproportionate number of female applicants, given their relative scarcity in the field. That is where being able to “back-test” an algorithm is critical.

An algorithm back test uses application data to score historical applicants and track their demographic distribution (along with their subsequent success).
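A back test of this kind can be sketched in a few lines. The scoring function, threshold, groups, and the four-fifths comparison ratio below are all assumptions for illustration (the four-fifths rule is a common adverse-impact heuristic, not necessarily what any particular platform uses):

```python
# Hypothetical sketch of an algorithm back test: score historical
# applicants, then compare selection rates across demographic groups.
# Scores, groups, and the 0.8 ratio are illustrative assumptions.

def back_test(applicants, score, threshold):
    """Return the selection rate per demographic group.

    applicants: list of (group, features) pairs from historical data
    score: callable mapping an applicant's features to a numeric score
    threshold: minimum score for an applicant to be "selected"
    """
    selected, total = {}, {}
    for group, features in applicants:
        total[group] = total.get(group, 0) + 1
        if score(features) >= threshold:
            selected[group] = selected.get(group, 0) + 1
    return {g: selected.get(g, 0) / total[g] for g in total}

def adverse_impact(rates, ratio=0.8):
    """Flag groups whose selection rate falls below `ratio` times the
    highest group's rate (the "four-fifths" heuristic)."""
    top = max(rates.values())
    return [g for g, r in rates.items() if r < ratio * top]

# Invented historical applicants: (group, feature dict).
history = [
    ("A", {"x": 0.9}), ("A", {"x": 0.8}), ("A", {"x": 0.4}), ("A", {"x": 0.7}),
    ("B", {"x": 0.6}), ("B", {"x": 0.3}), ("B", {"x": 0.2}), ("B", {"x": 0.5}),
]

rates = back_test(history, score=lambda f: f["x"], threshold=0.5)
print(rates, adverse_impact(rates))
```

Run against real application data, a flagged group would signal that the algorithm — like the computer-science-degree example above — is disproportionately screening out a demographic and needs to be revisited before deployment.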

In all, just as Peter Cappelli warned in a Harvard Business Review article: “If HR is to set the agenda on people management, it must either staff up to handle [data] analyses itself or partner with people who can do the work.” To avoid legal risk, organizations must either fully understand the methodology used to create their algorithms or hire experts who do.

More On…

  • CEB Sunstone Analytics

    Learn more about CEB's resumé-based predictive analytics platform that uses machine learning and advanced statistical modeling to back-test all algorithms.

