AI mimics recruiter gender bias against women, study warns

Artificial intelligence algorithms used for hiring show a bias for men over women, according to a new study from the University of Melbourne.

The researchers provided resumes to 40 recruiters for data analyst, finance officer and recruitment officer jobs at UniBank. Twenty of the recruiters received resumes showing the candidates' actual genders; the other 20 received the same resumes with the names swapped so that male candidates appeared to be female and vice versa.

The recruiters ranked the resumes and picked the top three for each open role.

Six findings:

1. Recruiters ranked male candidates higher despite female candidates having the same experience and qualifications.

2. Researchers then created a hiring algorithm to rank candidates based on the recruiters' preferences, and the algorithm reproduced the same bias toward male candidates.

3. The algorithm examined each resume for the candidate's previous job durations, education and other factors, and placed the most weight on time spent in a similar role.

4. The researchers noted results for the finance officer position were "the most troubling" because the recruiters preferred male candidates over female candidates regardless of education, experience and match to keywords specified as important.

"For UniBank, for whom financial positions are core business, this may lead to gendered job sorting based on human preference rather than individual ability," the study authors concluded, adding that the trend drains productivity and undermines the full use of human capital.

5. For the data analyst role, women's resumes were ranked higher only when male characteristics were present. The recruiters also valued experience for this role over educational qualifications, which put women who had career disruptions at a disadvantage.

6. For the recruitment officer job, keyword match and education alone were enough to predict candidate rankings. The role is traditionally female-dominated, and the recruiters' rankings were less consistent when the male and female names on the resumes were switched.

"Our results clearly indicate that the human panel holds unconscious bias that introduces gender bias into the models," the researchers concluded. "The machine also has potential to compound this disadvantage by ranking keywords and experiences against gendered language. We also show that men had slightly more experience and women better match to keyword requirements."
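The dynamic the researchers describe, a model trained on human rankings absorbing the panel's preferences, can be sketched in a few lines. Everything below is hypothetical: the features, the simulated recruiter scores and the built-in +1.5 "male bonus" are invented for illustration, not taken from the study.

```python
# Sketch: a linear ranking model fitted to biased human scores learns the bias.
# Features per candidate: (experience, keyword_match, is_male) -- all invented.

def fit_linear(X, y, lr=0.1, steps=5000):
    """Fit y ~ w . x by plain gradient descent on mean squared error."""
    w = [0.0] * len(X[0])
    n = len(X)
    for _ in range(steps):
        preds = [sum(wi * xi for wi, xi in zip(w, x)) for x in X]
        grads = [2 / n * sum((p - yi) * x[j] for x, yi, p in zip(X, y, preds))
                 for j in range(len(w))]
        w = [wi - lr * g for wi, g in zip(w, grads)]
    return w

# Paired resumes that are identical except for the gender flag (last feature).
X = [[0.9, 0.8, 1], [0.9, 0.8, 0],
     [0.5, 0.9, 1], [0.5, 0.9, 0],
     [0.7, 0.4, 1], [0.7, 0.4, 0]]
# Simulated recruiter scores with a built-in +1.5 bonus for male candidates.
y = [2 * e + k + 1.5 * m for e, k, m in X]

w = fit_linear(X, y)
print([round(v, 2) for v in w])  # ~[2.0, 1.0, 1.5]: the gender bonus is learned
```

Nothing in the training step looks at gender explicitly; the model simply fits whatever signal is in the human scores, which is how an unconscious panel preference ends up encoded as a model weight.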

The researchers recommended that companies train human resources professionals on gender bias in hiring algorithms, conduct gender audits in hiring, establish quotas to ensure gender balance, and develop proprietary hiring algorithms trained to reduce gender bias.
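One of those checks, the gender audit, can be as simple as re-scoring every resume with the gender indicator flipped and measuring the average score gap. The scoring weights and resume fields below are invented for illustration; the study does not publish its model.

```python
# Hypothetical counterfactual "gender audit": flip each candidate's gender
# flag, re-score, and report the average score change. A nonzero gap
# signals gendered scoring. Weights and fields are illustrative only.

def score(resume, weights):
    """Weighted sum of resume features (a stand-in for a hiring model)."""
    return sum(weights[k] * resume[k] for k in weights)

def gender_audit(resumes, weights):
    """Average male-minus-female score gap under a counterfactual gender flip."""
    gaps = []
    for r in resumes:
        as_female = dict(r, is_male=0)
        as_male = dict(r, is_male=1)
        gaps.append(score(as_male, weights) - score(as_female, weights))
    return sum(gaps) / len(gaps)

# Invented weights; the positive is_male weight mimics the bias the study found.
weights = {"experience": 2.0, "keyword_match": 1.0, "is_male": 1.5}
resumes = [
    {"experience": 0.9, "keyword_match": 0.8, "is_male": 1},
    {"experience": 0.5, "keyword_match": 0.9, "is_male": 0},
]
print(round(gender_audit(resumes, weights), 2))  # prints 1.5, the audit gap
```

An audit like this only requires query access to the model, which is why it pairs naturally with the training programs the researchers recommend: HR staff can run it without knowing how the model works internally.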

© Copyright ASC COMMUNICATIONS 2021.