We all have natural biases, and we don’t always know when they’re affecting us. Things get trickier when you’re a recruiter, monitoring your own thought process to hire without discrimination. With the advent of AI Applicant Tracking, are we looking at a future where machines can help us with that?
First, some context. I recently attended a recruiting event focused on diversity in hiring. Despite the usual dreaded group activities, forced corporate roleplay and all, I did learn something about bias in recruiting: it’s more common than we think.
But it’s not malicious. More often than not it’s perpetrated unconsciously by well-meaning folks who enthusiastically attend meetings about non-bias hiring, who believe and champion the idea that diversity within a company is important.
One recruiter faced the group and described an experiment he conducted, himself as the guinea pig. He would cover applicant details on a CV like name, age and gender with post-it notes, not thinking it would make a difference in his hiring process. His conclusion?
“I use a lot of post-it notes in my hiring process now.”
This man had attended diversity seminars and conferences on non-bias hiring before. He thought it was enough to be aware of the issue, but as it turns out, you have to do something proactive.
His sticky-note approach, rudimentary though it may be, is at the heart of most HR technology addressing negative bias. It’s as simple as asking ourselves what data we need in order to make the best decision.
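The sticky-note idea translates directly to software: mask identity-revealing fields from an application before a reviewer sees it. A minimal sketch, assuming a simple dictionary of application data (the field names here are hypothetical, not taken from any real ATS schema):

```python
# "Digital post-it notes": redact identity-revealing fields from an
# application before it reaches a reviewer.
# Field names are hypothetical, not from any real ATS schema.

REDACTED_FIELDS = {"name", "age", "gender", "photo_url"}

def blind_application(application: dict) -> dict:
    """Return a copy of the application with identity fields masked."""
    return {
        field: "[REDACTED]" if field in REDACTED_FIELDS else value
        for field, value in application.items()
    }

applicant = {
    "name": "Jane Doe",
    "age": 34,
    "gender": "F",
    "years_experience": 8,
    "skills": ["Python", "SQL"],
}

blinded = blind_application(applicant)
# Reviewers see experience and skills, but not who the person is.
print(blinded)
```

The point is not the code itself but the pre-commitment: deciding in advance which data is off-limits, rather than trusting yourself to ignore it.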
OK, not quite so simple. Because recruiting, at its core, is itself a process of bias: selecting a few from many based on the candidates’ qualities most useful for a particular job. A positive bias.
The problem is when those lists include qualities that have nothing to do with the probability of success, like age, gender, sexual orientation, race, etc. Recruiters and hiring managers have to untangle their positive and negative biases, which is harder than it sounds. These are viewpoints they’ve been using their whole lives. And rather than address it, which takes time, effort, and introspection, it’s tempting to push blame down the line.
A common excuse from recruiters is that no women, POCs, etc., even apply to certain roles, so it’s not their fault if their choices are seen as less than ideally diverse. It’s true that training and outreach can help promote under-represented people in a particular trade, but it’s important to take steps at every stage of the candidate pipeline, because people need jobs now, and recruiters are the gatekeepers.
So beyond purposely withholding certain details like our friend at the event, recruiters and hiring managers can also reduce bias by creating a system of checks, which could include writing out a list of qualities a candidate should have for the job, and making sure that those are actually the qualities on which they are judged.
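One way to make such a checklist binding is to encode it as an explicit scoring rubric: the criteria and weights are agreed on before anyone reviews a candidate, and scores come only from those criteria. A hedged sketch, with made-up criteria and weights purely for illustration:

```python
# Sketch of a pre-committed scoring rubric: criteria and weights are
# written down *before* reviewing candidates, and only they count.
# The criteria and weights below are illustrative, not a recommendation.

RUBRIC = {  # criterion -> weight
    "python_skill": 3,
    "sql_skill": 2,
    "communication": 2,
}

def score_candidate(ratings: dict) -> int:
    """Weighted score from per-criterion ratings (0-5 each).

    Only criteria in the rubric count; anything else a reviewer
    happens to record is ignored.
    """
    return sum(RUBRIC[c] * ratings.get(c, 0) for c in RUBRIC)

alice = {"python_skill": 5, "sql_skill": 3, "communication": 4}
bob = {"python_skill": 2, "sql_skill": 5, "communication": 3,
       "went_to_my_alma_mater": 5}  # irrelevant, and ignored

print(score_candidate(alice))  # 3*5 + 2*3 + 2*4 = 29
print(score_candidate(bob))    # 3*2 + 2*5 + 2*3 = 22
```

The design choice worth noting is that irrelevant attributes can't leak into the score: the sum iterates over the rubric, not over whatever the reviewer wrote down.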
While this simple method of self-accountability can be quite effective, the list itself could be flawed and that’s where we need technology.
Up until now we’ve used algorithms that often adopt the bias of an institution, or of the people doing the hiring, because the calculations are based on that institution’s single data source. What makes our technology different is the vast amount of data we collect from a variety of industries, which allows us to find appropriate candidates in unexpected places: candidates you may never have known existed, or considered before.
The AI sources and selects top candidates to interview far faster than any human could, and with less bias. The idea of machines doing a job better than people is always off-putting at first, but think of what you could do with all that saved time: actual face time with candidates. Time to nurture connections. Build a network your company can draw from in the future.
SmartRecruiters AI aims to address the problem of human bias by letting the machine take care of that part of the process. And when the hiring manager is shown their shortlist, our product holds humans accountable. Our collaborative approach adds community checks and balances to the recruitment process: platform administrators can see the work of their hiring team, so if a particular manager hires a white male over a black female who scored higher, they will have to justify that choice to their coworkers, their superiors, and themselves.