Combating Bias in Candidate Evaluation

Kristen Flores
August 26, 2020

pymetrics’ Founder and CEO Frida Polli recently spoke with Carin Taylor, Chief Diversity Officer at Workday, and Merve Hickok, Founder of AIEthicist.org, on combating bias in candidate evaluation during Wade & Wendy’s Recruitment Automation Conference on Diversity and Inclusion. Continue reading below for our key takeaways.


Acknowledging unconscious biases

Everyone has them. As humans, we're wired to make quick decisions, and unconscious bias helps us do that. Biases are learned behaviors influenced by our personal experiences, those of people we know, the media, and more. In the context of recruiting and evaluating job seekers, these same biases creep into decision making and can have lasting, detrimental effects on hiring outcomes.


All those involved in the recruiting process have to be intentional about combating this. It's important to look closely at how decisions are made and how candidates are evaluated, from the job description to resume review to interviewing for culture fit. Take ownership to review and challenge established processes at every level. While bias training cannot eliminate unconscious biases, it can bring more awareness to thinking that feels natural but is actually biased, like identifying with someone who went to the same school, worked at the same company, or shares your interests.


Measuring bias

It's not enough to examine the end result of who you hire. Hiring leaders need to examine the whole evaluation process, beginning with how they source and looking at who gets through each step. Are candidates from different groups passing each step at the same rate? Are groups with certain characteristics under- or over-performing? Look at how the company treats current employees, including wages, promotions, and current diversity, as this affects who and how you hire. You have to dig in.
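One concrete way to dig in is to compute pass-through rates by group at every stage of the funnel and compare them. The sketch below is a minimal illustration, not pymetrics' method; the funnel data, column names, and the "four-fifths rule" threshold for flagging gaps are assumptions made for the example.

```python
import pandas as pd

# Hypothetical funnel export: one row per candidate, with a self-reported
# group and a flag for each stage passed. Column names are illustrative.
candidates = pd.DataFrame({
    "group":         ["A", "A", "A", "A", "B", "B", "B", "B", "B", "B"],
    "passed_screen": [1,   1,   1,   0,   1,   1,   0,   0,   1,   0],
    "passed_onsite": [1,   1,   0,   0,   1,   0,   0,   0,   1,   0],
})

def stage_rates(df, stage):
    """Pass-through rate per group for one stage."""
    return df.groupby("group")[stage].mean()

# Stage 1: screen rates over all applicants.
screen = stage_rates(candidates, "passed_screen")

# Stage 2: onsite rates over only the candidates who passed the screen.
onsite = stage_rates(candidates[candidates["passed_screen"] == 1], "passed_onsite")

for name, rates in [("screen", screen), ("onsite", onsite)]:
    # Adverse impact ratio: lowest group rate divided by highest group rate.
    # A common rule of thumb (the "four-fifths rule") flags ratios below 0.8.
    ratio = rates.min() / rates.max()
    print(f"{name}: {rates.to_dict()} -> impact ratio {ratio:.2f}")
```

Running this on real funnel data would show, stage by stage, whether one group is being filtered out disproportionately, which tells you where to start digging.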


Mitigating bias 

One example of mitigating bias in recruiting is holding panel interviews instead of 1:1 interviews: a group format brings more perspectives into the room and lets every interviewer hear the same responses, making the feedback more objective.


Another way to mitigate bias is through technology. Technology allows you to analyze datasets in ways you couldn't before, giving you the opportunity to improve processes and metrics, review company composition, and consistently audit and optimize for fair evaluation. It also lets you apply the same filters to all applicants in seconds, creating a more even playing field while reaching a larger pool of candidates. While technology can be a great tool for fairer evaluation, it's important to do due diligence before implementing it: make sure it solves the specific problems you're having and that it isn't learning from biased inputs. Look for technology built on ethical principles that is explainable, auditable, and transparent, and audit it both before and after launch.
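To make "the same filters for all applicants" plus an audit concrete, here is a minimal sketch. The data, fields, and threshold are hypothetical, and the filter is deliberately simple; the point is that one rule applies uniformly to everyone, and the selection rates it produces are then checked by group, since a uniformly applied rule can still encode bias if its inputs are biased.

```python
import pandas as pd

# Hypothetical applicant data; fields, values, and threshold are illustrative.
applicants = pd.DataFrame({
    "group":        ["A", "A", "B", "B", "B"],
    "skills_score": [70,  88,  92,  65,  80],
})

# The same rule is applied to every applicant, so no candidate faces a
# different bar than another. The rule itself can still be biased if the
# score comes from a biased upstream assessment, hence the audit below.
passes = applicants["skills_score"] >= 75

# Audit: selection rate by group, and the lowest-to-highest rate ratio.
rates = passes.groupby(applicants["group"]).mean()
print("Selection rate by group:", rates.to_dict())
print("Impact ratio:", round(rates.min() / rates.max(), 2))
```

The same audit can be run on historical decisions before launch and on live decisions after launch, which is what makes the process consistently auditable rather than a one-time check.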


Ethical, fair technology

Start with who is involved in the design process. Technology reflects the experiences of its creators, so it's important to have a diverse set of designers and voices in the room. Removing bias from technology is a newer practice, and it requires constant auditing and revision. Just as audited, fair technology can evaluate a large number of candidates quickly and fairly, biased technology magnifies harm at the same scale. As AI progresses, compliance issues will become more prominent, so work to solve these problems in advance with extensive due diligence, and act quickly and proactively if your audits surface adverse impact. At the end of the day, you're deciding someone's worthiness, a very subjective judgment, so make sure you're doing your best to be fair.


But what is “fair”?

With the proliferation of AI making recommendations like who to hire, regulation is key. At the moment, much depends on self-regulation: training and advocacy that drive awareness of ethical standards. Hard regulation can help establish baseline standards that companies can hold themselves against. To establish fairness, consistency in evaluation and transparency are key. Leaders across industries need to be able to see what's working and what's not, and use one another's learnings to build better technology for everyone. Even more impactful than sharing what works is bringing to light what doesn't, so it can be solved.


Getting smart with bias 

Provide teams with education and bias training, and let them practice, fail, and improve. Bring diverse perspectives to how you look at and evaluate bias; come at it from different angles. A focus on behavioral practice and training, in addition to education, is critical to driving meaningful change, as is creating feedback channels that let applicants and candidates raise concerns, improving both your processes and your software.


Want to learn more about how pymetrics can help you create a fairer, less biased recruiting process? Reach out here.