Can AI reduce gender bias during recruitment?

Posted on Friday, June 21, 2019 by Abby Draper

By Abigail Draper, OSCPA communication & engagement manager

AI hiring programs have the potential to reduce gender bias in recruiting, but only if data shaped by human bias is kept out of the mix.

In a Harvard Business Review article, Dr. Tomas Chamorro-Premuzic said, “Most companies focus on the wrong traits, hiring on confidence rather than competence, charisma rather than humility, and narcissistic tendencies rather than integrity, which explains the surplus of incompetent and male leaders.”

Chamorro-Premuzic said this creates a disconnect between “the qualities that seduce us in a leader, and those that are needed to be an effective leader.”

Not only do women more often display the characteristics of effective leaders, according to Chamorro-Premuzic, but companies with diverse management teams also perform better.

A 2018 study by the Boston Consulting Group found, “Companies that reported above-average diversity on their management teams also reported innovation revenue that was 19 percentage points higher than that of companies with below-average leadership diversity.”

A McKinsey & Company study spanning 2007-2017 showed that, worldwide, “Only 43% of men in senior management positions strongly agree that women are as good leaders as men versus 76% of women.”

The research shows that women can be strong leaders and diverse teams generate more revenue, so why aren’t companies hiring more people from minority groups? The answer is bias, and many think AI can help reduce that in the hiring process.

AI can only help, though, if we do not feed it data created by human bias. For example, according to Forbes, Amazon was using a recruitment tool in 2018 that began showing bias against women. The system scanned resumes submitted over the previous 10 years to learn what Amazon looked for in potential candidates. But because the tech industry employs more men than women, most of those applicants had been men, and the system learned to prefer male candidates over female ones.
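The mechanism is worth making concrete. Below is a toy sketch (hypothetical resume snippets, not Amazon's actual system or data) of how a simple word-scoring model trained on historical hiring outcomes can pick up a gendered proxy, even though gender is never an explicit feature:

```python
from collections import Counter

# Hypothetical training data: resumes from past "hired" and "rejected"
# applicants. Past hires skewed male, so gender-correlated words like
# "women's" appear mostly in the rejected pile.
hired = [
    "software engineer chess club",
    "software developer robotics team",
    "engineer hackathon winner",
]
rejected = [
    "software engineer women's chess club",
    "developer women's coding group",
]

def word_weights(pos, neg):
    """Score each word by how much more often it appears in hired
    resumes than in rejected ones (fraction of resumes containing it)."""
    pos_counts = Counter(w for resume in pos for w in resume.split())
    neg_counts = Counter(w for resume in neg for w in resume.split())
    vocab = set(pos_counts) | set(neg_counts)
    return {w: pos_counts[w] / len(pos) - neg_counts[w] / len(neg)
            for w in vocab}

weights = word_weights(hired, rejected)

# "women's" never appears among the hires, so it gets a negative
# weight: the model has learned a proxy for gender from biased history.
print(weights["women's"])   # negative
print(weights["engineer"])  # positive
```

This is why auditing the training data matters: the model is not told about gender at all, yet it reconstructs it from correlated signals in the historical outcomes it was asked to imitate.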

Bias can easily sneak in through situations like these, so it is important for humans to intervene and make sure certain data points and diversity metrics are excluded from the resumes an AI system reviews, Forbes said.

What are some other ways AI can be used to eliminate gender bias? Chamorro-Premuzic said that while in-person interviews are hard to standardize, video interviews allow AI to observe a candidate, compare the interview with past ones, and determine how the candidate’s behavior and answers would fit the desired role. The system does not know, or pay attention to, the interviewee’s gender.

The Society for Human Resource Management (SHRM) said some AI systems also have candidates complete games that “test for traits such as short-term memory and planning and responding to job-related tasks.” The AI then gives recommendations to the employer about which candidates would do well in the position based on their game results. These results have nothing to do with gender.

After suggesting potential candidates, some AI systems can also generate non-biased questions for a human interviewer to ask the narrowed-down group.

DK Bartley, senior vice president and head of diversity and inclusion at Dentsu Aegis Network, told SHRM, “If, at the end of the day, you utilize AI and you did not end up hiring that diverse candidate, that's OK because you have diligently followed the process. You've interviewed the right pool of candidates, and for you, the recruiter, the perception is that this was the best candidate. The system will support your decision by showing that all the other candidates did not have all the skill sets you require. At the end of the day, AI holds everybody accountable.”

If AI hiring systems are given correct and unbiased data, they may be used to vet initial applicants without implicit gender (or racial, ethnic, sexual orientation, disability, etc.) bias. Because this type of hiring is still a work in progress, it is important that we all pay attention to our own biases and to the processes at our companies that allow bias to occur. AI can only work as well as the humans who make it.

Kieran Snyder, CEO of Textio, an AI hiring program, told SHRM, “Textio is not going to help people who are consciously biased. If you are actually a bigoted person, there is no software in the world that is going to make you not bigoted.”
