Is hiring really biased?

Jordan Ingersoll
August 10, 2020

The recording of this live session can be found here.

pymetrics partnered with PowerToFly to support their first-ever virtual diversity summit, Diversity Reboot 2020. Over 200 speakers and 10,000 attendees came together virtually over the course of the event to discuss rebuilding an inclusive workforce post-COVID-19. It is imperative, now more than ever, to focus on diverse teams and perspectives. pymetrics’ CEO, Dr. Frida Polli, hosted a session about technology that can reduce bias in hiring, sharing her extensive knowledge of the legislation and decision-making behaviors that have amplified bias in the HR space and beyond. She breaks down which technologies actually work to match people to their best jobs while removing gender and ethnic bias from the hiring process. Read more from her session below:


Bias is a part of everyday life. No matter how hard we try, we are unlikely to stop unconscious biases from influencing even the most mundane of decisions. They are deeply ingrained and can even be helpful when making certain quick decisions -- what to eat for breakfast, which toothpaste to buy, which road to take to work, etc. But the unfortunate result of bias is that it can also severely limit human potential. Bias puts the whole world into one very small box of what is “ideal” or “desired”. If we were to broaden this box, or even get rid of it altogether, we as a society could achieve much greater things.


Perhaps not surprisingly, bias plays a significant role in the hiring process, a largely human-led process, because it affects who actually gets the job and who doesn’t. pymetrics’ primary mission is to eliminate this bias in internal and external hiring. With a set of gamified exercises that give recruiters a window into applicants’ cognitive, emotional, and social attributes, pymetrics can strip away any gender or ethnic variables that may skew who in your pipeline is recommended for a role -- recommendations are determined purely by potential and fit.


But why is such a technology necessary in the first place? The Pipeline Effect is still very present in hiring today. It refers to the fact that who applies for a given job isn’t totally random. Factors that affect who applies include what language the job description is written in and how it is worded, both of which can turn away certain groups of people. You may not even realize that any bias or exclusivity is present in a job description, yet it may exude exclusivity or prestige in a way that deters certain minority groups. We should strive to make the language in all job descriptions more inclusive to attract the best and most diverse talent from the start. One way to combat this is through data-driven text recommendations for job descriptions, brought to recruiting teams by platforms like Textio. Such tools can lead to a 12%-30% increase in the number of women applying for jobs, and can raise the percentage of applications from underrepresented groups from 8% to 10%. The actual changes made to job descriptions can be quite subtle, but the outcomes are still significant.
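As a rough illustration of how such text-analysis tools might work under the hood, here is a minimal sketch in Python. The word lists are a small illustrative sample loosely based on published research on gender-coded job-ad language; they are not Textio’s actual vocabulary or model.

```python
import re

# Small illustrative samples of gender-coded words; real tools use far larger
# vocabularies and statistical models trained on actual hiring outcomes.
MASCULINE_CODED = {"aggressive", "ambitious", "competitive", "dominant", "ninja", "rockstar"}
FEMININE_CODED = {"collaborative", "supportive", "nurturing", "interpersonal", "loyal"}

def audit_job_description(text: str) -> dict:
    """Flag masculine- and feminine-coded words found in a job description."""
    words = re.findall(r"[a-z]+", text.lower())
    return {
        "masculine": sorted(w for w in words if w in MASCULINE_CODED),
        "feminine": sorted(w for w in words if w in FEMININE_CODED),
    }

print(audit_job_description(
    "We want an ambitious, competitive rockstar to dominate the market."
))
# {'masculine': ['ambitious', 'competitive', 'rockstar'], 'feminine': []}
```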


Another reason technologies like pymetrics are essential pertains to the disproportionate selection of white applicants over minority applicants. There is a guideline aimed at enforcing fairness in hiring, known in US employment law as the 80% rule (or four-fifths rule): the selection rate for any demographic group should be at least 80% of the selection rate for the most-selected group -- in most cases white candidates. Roughly speaking, for every 10 white candidates screened in, no fewer than 8 candidates from any other group (per the same number of applicants) should be screened in as well. Unfortunately, the two most common hiring screens don’t follow this rule. Firstly, cognitive testing screens in only 3 black candidates for every 10 white candidates. This should hardly come as a surprise, since this method has been known to be biased since it was first created. Secondly, human resume review does better, screening in on average 7 black candidates for every 10 white candidates -- but it still falls short of the 80% rule.
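In code, the 80% rule reduces to a simple ratio of selection rates. Here is a minimal sketch, using the screening figures above expressed as selection rates relative to white candidates:

```python
def four_fifths_check(minority_rate: float, majority_rate: float) -> bool:
    """Pass if the minority selection rate is at least 80% of the majority rate."""
    return minority_rate / majority_rate >= 0.8

# Rates implied above: black candidates screened in per 10 white candidates.
screens = {"cognitive testing": 3 / 10, "human resume review": 7 / 10}
white_rate = 10 / 10  # reference group

for name, rate in screens.items():
    ratio = rate / white_rate
    verdict = "pass" if four_fifths_check(rate, white_rate) else "fail"
    print(f"{name}: impact ratio {ratio:.2f} -> {verdict}")
# cognitive testing: impact ratio 0.30 -> fail
# human resume review: impact ratio 0.70 -> fail
```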


Then you have pymetrics, whose Audited AI screens in 9 black applicants for every 10 white applicants. This is 20% better at screening in qualified black candidates than human resume review, and 180% better than cognitive testing. It goes to show that it is entirely possible to meet these standards -- and even surpass them. Most just don’t, whether because they lack the resources, lack the training, or simply don’t want to. Whatever the case may be, it is not acceptable that the 80% rule is not always followed, seeing as the consequences for certain communities are dire.
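Plugging the Audited AI figure into the same ratio check shows it clearing the threshold:

```python
# Audited AI: 9 black applicants screened in per 10 white applicants.
impact_ratio = (9 / 10) / (10 / 10)
print(f"impact ratio {impact_ratio:.2f} -> {'pass' if impact_ratio >= 0.8 else 'fail'}")
# impact ratio 0.90 -> pass
```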


Lastly, in 2017, a meta-analysis of 24 call-back studies from the previous 30 years examined how employers respond to the same resume under different names (i.e., names chosen to signal gender or ethnicity). It is widely known that if you have a more “ethnic” sounding name, you are less likely to get the job, or even a call-back, compared to someone with a “white” sounding name. The effect is strong enough that many people of color change their names to sound more “white” in order to increase their likelihood of getting the job. The studies revealed that even with identical qualifications, candidates with “ethnic” sounding names are significantly more likely to be overlooked and denied an equal opportunity to proceed in the interview process.


An example cited is that of “John” versus “Jamal”: John got significantly more call-backs than Jamal did -- and their names were the only thing different on their resumes. The result of this analysis, spanning decades of studies, is that we have made essentially no progress on reducing racial discrimination in hiring.
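Call-back studies quantify this gap as a ratio of call-back rates between otherwise identical resumes. Here is a minimal sketch of that calculation; the counts below are hypothetical, chosen only to illustrate the arithmetic, not figures from the meta-analysis:

```python
def callback_rate(callbacks: int, resumes_sent: int) -> float:
    """Fraction of sent resumes that received a call-back."""
    return callbacks / resumes_sent

# Hypothetical audit-study counts for two otherwise identical resumes.
john = callback_rate(callbacks=100, resumes_sent=1000)  # "white" sounding name
jamal = callback_rate(callbacks=74, resumes_sent=1000)  # "ethnic" sounding name

# Ratio > 1 means the "white" sounding name was favored.
print(f"John: {john:.1%}, Jamal: {jamal:.1%}, ratio: {john / jamal:.2f}x")
# John: 10.0%, Jamal: 7.4%, ratio: 1.35x
```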


As Frida mentions, AI has shown significant promise for eliminating such biases because, unlike humans, it can be actively audited and de-biased to ensure that it is not disproportionately recommending certain demographic groups over others for jobs. Furthermore, AI can assess the entire pipeline of candidates, unlike humans, who have limited time and tend to focus only on a select group of “top” candidates -- whether “top” is determined by their name, where they went to school, etc. If we take the necessary steps to address the concerns raised above, we can harness the power of technology to diversify the workforce for good.
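To make “actively audited and de-biased” concrete, below is a generic sketch of one common approach: audit a scoring model’s selection rates and drop the input most correlated with group membership until the four-fifths check passes. The data is synthetic and the technique is illustrative; it is not pymetrics’ actual methodology.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical candidate data: three model inputs and a protected attribute.
n = 1000
group = rng.integers(0, 2, n)            # 0 = majority, 1 = minority
skill = rng.normal(0, 1, n)              # job-relevant signal
proxy = -group + rng.normal(0, 0.5, n)   # leaks group membership, penalizing the minority
noise = rng.normal(0, 1, n)
features = {"skill": skill, "proxy": proxy, "noise": noise}

def impact_ratio(scores: np.ndarray, group: np.ndarray, top_frac: float = 0.3) -> float:
    """Selection-rate ratio (minority / majority) when the top `top_frac` are screened in."""
    cutoff = np.quantile(scores, 1 - top_frac)
    selected = scores >= cutoff
    rate = lambda g: selected[group == g].mean()
    return rate(1) / rate(0)

def score(feats: dict) -> np.ndarray:
    """Stand-in for a trained model's output: a simple sum of inputs."""
    return sum(feats.values())

# Audit loop: while the check fails, drop the feature most correlated with group.
while impact_ratio(score(features), group) < 0.8:
    worst = max(features, key=lambda f: abs(np.corrcoef(features[f], group)[0, 1]))
    print(f"ratio below 0.8; dropping feature most correlated with group: {worst}")
    del features[worst]

print(f"final impact ratio: {impact_ratio(score(features), group):.2f}")
```

Real audited systems are more sophisticated than this, but the core idea is the same: measure demographic selection rates directly, then change the model until the disparity disappears -- something that is effectively impossible to do with an unexamined human review process.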


To learn more about our de-biasing methodology, please visit our Science page here.