All human beings harbor conscious and unconscious biases. Human brains recognize and learn from patterns in the environment, and this can help us be more efficient as we learn, think, and ultimately make decisions. Consider this: if your favorite singer endorses a particular brand of toothpaste, you may form more positive associations with that toothpaste and consequently swap out the brand you’re already using. You may have seen an advertisement of the singer promoting the toothpaste, recognize it in the store, and subconsciously associate it with her upbeat music. Or you might consciously go out of your way to buy it because you trust her judgment and want a bright smile like hers. Either way, you are biased toward buying the toothpaste she’s promoted because of the association you now have with it. That association may lead you to your new favorite toothpaste, or it may impair your ability to choose the toothpaste that’s actually best for your dental hygiene and your budget.
Bias can lay the groundwork for stereotyping and prejudice, which sometimes we’re aware of (conscious) and sometimes we’re not (unconscious). Unfortunately, the patterns of bias that already exist in the workplace are reinforced by the way we think and the way we hire. The only way to guard against unfair decision-making caused by unwanted conscious and unconscious biases is to take careful, deliberate steps to remove bias from the process. Insights from behavioral science and innovative technology provide some solutions, but first let’s explore where and how such biases emerge.
In his critically acclaimed book “Thinking, Fast and Slow,” economist and psychologist Daniel Kahneman brings us closer to understanding why humans make errors in judgment, and how to spot the signs that you yourself are about to make such an error. In his account, we have a two-system way of thinking: System 1 (thinking fast) and System 2 (thinking slow). System 1 is the impulsive, “gut reaction” way of thinking and making decisions, while System 2 is the more analytical, “critical thinking” mode. System 1 leads us to first impressions and often to jumping to conclusions, while System 2 enables us to reflect and problem-solve.
Most of us will identify with System 2’s way of processing; however, our brains actually spend the majority of their time in System 1, unless we encounter something unexpected or make a concerted effort to “think slow.” Fortunately, System 1 is generally very accurate, but there are situations where it can make detrimental errors of bias. In seeking to quickly construct a coherent, plausible explanation for what is happening around you, this system sometimes answers an easier question than the one it was asked, relying on associations and memories, pattern-matching, and assumptions. And our brains will believe this plausible, convenient story, even if it is based on incorrect information. In this sense, perception becomes reality. This can be advantageous: if a large barking dog once bit your finger, and months later you see a dog of a similar size approaching you while barking, you will likely run away. This is your mind’s way of protecting you from danger. In countless other cases, however, leveraging these mental shortcuts on the basis of incorrect or unfounded information can lead to forming prejudices against harmless others. This ultimately leads to discrimination: the unjust treatment of different categories of people, especially on the grounds of race, age, or sex.
How can such bias be detrimental in the context of hiring? The nature of the information in a typical resume makes it significantly more likely that members of certain demographic groups receive a favorable assessment. Prestigious degrees, for example, are not equally distributed across racial or socioeconomic groups. SAT scores and GPAs also inevitably favor certain applicants: black students are nearly three times as likely as their white peers to have a GPA below 2.5 (on a 4.0 scale). Relying on these resume cues to predict a new hire’s success will therefore shrink your pool of prospective candidates and skew it away from people of color and those from less affluent upbringings.
But even when a job candidate from a minority background succeeds in creating an application that matches the standards of this system, the reality is that bias still persists in talent selection. Researchers have isolated the effect of a candidate’s demographic identity on their job search prospects through a setup called the call-back study. In these experiments, identical resumes are submitted to a job posting, but the racial and/or gender profile of the candidate is manipulated by changing the name. For example, in 2012, researchers at Yale found that professors looking for undergraduate STEM research assistants preferred “John” over “Jennifer” when the applicants were otherwise identical. In 2017, a group of economists aggregated 30 years of such studies in a meta-analysis to determine whether employment bias against ethnic minorities has declined in recent history. It has not: candidates perceived as white continue to receive a staggering 36% more callbacks for interviews than those perceived as black, and 24% more than those perceived as Hispanic, so we cannot say that progress toward racial equity has yet occurred.
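To make the arithmetic behind figures like “36% more callbacks” concrete, here is a minimal sketch of how call-back studies quantify disparity. The counts below are invented for illustration and are not drawn from the meta-analysis itself; only the method, comparing per-group callback rates from otherwise identical resumes, mirrors the studies.

```python
# Hypothetical call-back study data: identical resumes, differing only
# in the name that signals the candidate's perceived race.

def callback_rate(callbacks, applications):
    """Fraction of submitted resumes that received a callback."""
    return callbacks / applications

# Made-up counts for illustration (not from any real study).
white_rate = callback_rate(170, 1000)   # 17.0% of resumes called back
black_rate = callback_rate(125, 1000)   # 12.5% of resumes called back

# Relative disparity: how many more callbacks the white-named resumes
# received, expressed as a percentage of the black-named rate.
relative_gap = (white_rate - black_rate) / black_rate
print(f"{relative_gap:.0%} more callbacks")  # prints "36% more callbacks"
```

Note that the headline number is a relative gap, not a difference in percentage points: a 4.5-point spread on a 12.5% base rate translates into 36% more callbacks.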
Such research showcases an inevitable consequence of human cognition: no matter the data available, biases impair a recruiter’s ability to read a resume and objectively evaluate employment potential. Biases are the shortcuts the human mind uses to process information, based on expectations shaped by personal experiences and social stereotypes. And because the average recruiter spends no more than seven seconds scanning a resume, only enough time for System 1 to activate and drive the decision, it is virtually impossible for a manual evaluation process to result in fair hiring assessments.
Fortunately, AI has shown significant promise for eliminating such bias because, unlike a human, it can be actively audited and de-biased before being allowed into production, to ensure that it is not disproportionately recommending certain demographic groups over others. AI can also assess the entire pipeline of candidates, whereas time-constrained humans tend to focus on only a select group of “top” candidates. If we take the critical steps necessary to address the concerns raised above, we can harness the power of technology to diversify the workforce for good.
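As a concrete illustration of what such an audit might look like, the sketch below checks a model’s recommendations against the “four-fifths rule” from US employment-selection guidelines, under which no group’s selection rate should fall below 80% of the highest group’s rate. The function names and sample data here are hypothetical; this is a minimal sketch of one possible check, not a description of pymetrics’ actual auditing pipeline.

```python
from collections import Counter

def selection_rates(recommended, groups):
    """Per-group fraction of candidates the model recommends."""
    totals = Counter(groups)
    selected = Counter(g for g, r in zip(groups, recommended) if r)
    return {g: selected[g] / totals[g] for g in totals}

def passes_four_fifths(rates, threshold=0.8):
    """Adverse-impact check: every group's selection rate must be at
    least `threshold` (four-fifths) of the highest group's rate."""
    top = max(rates.values())
    return all(rate / top >= threshold for rate in rates.values())

# Hypothetical audit data: model recommendations (1 = recommend)
# alongside each candidate's demographic group.
groups      = ["a"] * 10 + ["b"] * 10
recommended = [1] * 6 + [0] * 4 + [1] * 3 + [0] * 7  # a: 60%, b: 30%

rates = selection_rates(recommended, groups)
print(rates)                      # {'a': 0.6, 'b': 0.3}
print(passes_four_fifths(rates))  # False: 0.3 / 0.6 = 0.5 < 0.8
```

A model that fails a check like this can be sent back for re-training or re-weighting before it ever scores a real candidate, which is exactly the kind of pre-deployment audit that is impractical to run on a human reviewer.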
To learn more about how pymetrics has set out to curb bias with our AI-driven approach, please visit our Mission page here.