Culture

This AI could prevent hiring biases instead of perpetuating them

By studying the search behavior of actual recruiters, the system detected patterns of discrimination.


There's a growing and legitimate worry that algorithms aren't suited to screening job applications from humans, at least not fairly or accurately. But what if the question was flipped, and a bot could, through copious amounts of machine learning, show that human recruiters are just as biased, if not worse?

That's what researchers at the London School of Economics sought to find out in a recent study. Using artificial intelligence to evaluate the Swiss public employment program, their tool found that ethnic minorities and women fell behind their peers. Applicants' names, background information, and other details were met with subconscious bias from recruiters. But there's good news: researchers believe that a system like this, which is "widely applicable, non-intrusive, and cost-efficient," could help us fix the problem by alerting us when we slip.

Background — According to the report, the system sifted through a huge pile of data: 452,729 search entries from 43,352 recruiters, the 17.4 million job profiles those searches returned, and 3.4 million profile views. All of this let the machine draw discernible, meaningful conclusions about how human recruiters select or drop potential job candidates. Most importantly, it tracked whether or not a recruiter followed up after viewing an applicant's profile.
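To make the mechanics concrete, here is a minimal sketch of the kind of analysis that paragraph describes, not the researchers' actual pipeline. It assumes a hypothetical view log where each row is one recruiter's look at one candidate profile, with a flag for whether the recruiter followed up; the column names and toy figures are invented for illustration.

```python
# A minimal sketch of the analysis described above, NOT the researchers'
# actual pipeline. Assumes a hypothetical view log: one row per recruiter
# view of a candidate profile, plus a flag for a follow-up contact.
import pandas as pd

# Invented toy data; column names are assumptions for illustration.
views = pd.DataFrame({
    "recruiter_id":    [1, 1, 2, 2, 3, 3, 3, 4],
    "candidate_group": ["majority", "minority", "majority", "minority",
                        "majority", "minority", "majority", "minority"],
    "contacted":       [True, False, True, True, True, False, False, False],
})

# Follow-up (contact) rate per group: the key outcome the study tracks.
rates = views.groupby("candidate_group")["contacted"].mean()
print(rates)

# The raw gap between groups is only a hint of bias; the real study also
# controls for qualifications, occupation, and other factors so the gap
# can't be explained away by differences between candidates.
gap = rates["majority"] - rates["minority"]
print(f"Contact-rate gap (majority minus minority): {gap:.2%}")
```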

Findings — The results confirm the common apprehension that ethnic minorities and women have to work considerably harder than their peers to find employment. The system reported that human recruiters were up to 19 percent less likely to contact an applicant from an ethnic minority background, even when the applicant had the right qualifications. It also found that women were more likely than men to be snubbed, even when they had the required skills for a job.

Where we go from here — Researchers believe such bias can be reduced. One of their recommendations is for applicants to move background details like their name to the bottom of their resume and keep skills and accomplishments in the top half. They also think machines like this one could prove useful if recruiters introduced them into their own hiring processes: what the human eye misses, the bot can pick up and point to.