Is Automated Bias Keeping You From Finding Top Candidates?

Anyone who has applied for a job on a company website in the last few years has probably had their resume passed through a resume-screening program. Whether the applicant knows it or not, these programs use algorithms to sift through applications, meaning that many resumes, however stellar the candidates may be, may never even reach a pair of human eyes.

These programs act as gatekeepers, filtering the deluge of resumes that often floods the inboxes and job websites of companies looking to hire new talent. They are supposed to be objective, impartially allowing only the most qualified and relevant jobseekers’ resumes to get in front of human hiring managers and HR professionals.

However, National Public Radio’s Science Friday program recently featured a discussion of hidden bias in algorithms designed to screen resumes and job applications. During the segment, data researcher Kate Crawford argued that big data sets and the algorithms that sift through them can be just as biased as human beings.

Big data sets may seem to provide a platform for unbiased recruiting, because intuition says that any individual bias should be diluted by the sheer number of data points involved. However, those data sets are the product of our society and culture, which means they can carry the very same biases inherent in our everyday lives.

One example Crawford gives is algorithms that take an applicant’s address into consideration when screening for new hires. If an applicant lives far away from the workplace, data suggests they are more likely to leave or be fired within a year than applicants located closer to work. However, where a person lives is often highly correlated with other factors, such as race and socioeconomic status.
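To make that proxy effect concrete, here is a toy sketch in Python with entirely made-up numbers (it is not based on Crawford’s data): if two groups of applicants tend to live in different neighborhoods, a seemingly neutral distance cutoff rejects one group far more often than the other.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
group = rng.integers(0, 2, n)  # hypothetical binary group label
# Assume, purely for illustration, that group 1 tends to live
# farther from the office than group 0.
miles_to_work = np.where(group == 1,
                         rng.normal(25, 8, n),
                         rng.normal(10, 8, n))

# A "neutral" screening rule: reject anyone more than 20 miles away.
rejected = miles_to_work > 20
for g in (0, 1):
    rate = rejected[group == g].mean()
    print(f"group {g}: {rate:.0%} rejected by the distance rule")
```

Even though the rule never mentions group membership, the rejection rates diverge sharply, and that divergence is the hidden bias.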

Fortunately, there are ways to detect and counter the potential bias of such programs. NBC News recently ran a story by Julianne Pepitone called “Can Resume-Reviewing Software Be as Biased as Human Hiring Managers?” In it, Pepitone discusses research by University of Utah associate professor Suresh Venkatasubramanian, who designed a test that uses a machine-learning algorithm to determine whether a resume-scanning program has the potential for bias. It works like this: if Venkatasubramanian’s program can use the data provided to the screening algorithm to accurately predict an applicant’s race, gender, or other factors that have been explicitly hidden, there is a potential for bias. If such a potential is discovered, the data can be redistributed to mitigate it.
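As a rough illustration of that idea, here is a minimal sketch on synthetic data (not Venkatasubramanian’s actual code): train a classifier to predict the hidden attribute from the same inputs the screener sees. Accuracy well above the majority-class baseline signals that those inputs leak the attribute.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 2_000
# Synthetic applicants: group membership drives where people live, so
# the (integer-coded) ZIP column acts as a proxy even though "group"
# itself is never shown to the screening model.
group = rng.integers(0, 2, n)
zip_code = np.where(group == 1,
                    rng.integers(0, 20, n),    # group 1 clustered in some ZIPs
                    rng.integers(15, 40, n))   # group 0 clustered in others
features = pd.DataFrame({
    "zip_code": zip_code,
    "years_experience": rng.normal(8, 3, n),   # unrelated to group
})

# The test: can a model recover the hidden attribute from the
# screener's inputs? Accuracy far above the majority-class baseline
# means those inputs encode it indirectly.
model = RandomForestClassifier(n_estimators=100, random_state=0)
accuracy = cross_val_score(model, features, group, cv=5).mean()
baseline = np.bincount(group).max() / n
print(f"hidden-attribute accuracy: {accuracy:.2f} "
      f"(majority-class baseline: {baseline:.2f})")
```

In this toy setup the classifier recovers the hidden group well above chance purely from ZIP code, which is exactly the warning sign the test is designed to surface.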

The lesson here is that eliminating bias is not an easy task. Even when we use advanced computer programs, bias can find ways to creep into our decision-making. That makes it all the more important to stay vigilant about hidden sources of bias.

Is your organization at risk of missing out on top talent due to unconscious automated bias? Maybe it’s time to put the human factor back into resume review. Be inclusive!