Human recruiters, despite their best intentions, can harbour unconscious biases that affect their decision-making. Factors such as gender, ethnicity, age and educational background can inadvertently influence perceptions of a candidate’s suitability. Perhaps the most common is affinity bias, which recruiters often justify as hiring for a good ‘cultural fit’ for the organisation.
Artificial intelligence (AI) has emerged as a pivotal tool in mitigating biases inherent in traditional recruitment processes. When meticulously programmed to prioritise candidates’ skills, experience and qualifications, AI can significantly diminish the influence of unconscious biases, leading to more equitable and effective hiring outcomes.
Of course, the reverse is also true. If bias is built into the design or training of an AI system, existing human bias can be baked into the model, reinforcing existing recruitment behaviours and biases. This problem is particularly prevalent when training large language models on ‘successful’ candidate profiles, or when applying machine learning techniques that ‘learn’ from recruiter behaviour.
Position-specific, skill-based assessment
AI systems, when designed with a focus on objectivity, can assess candidates based on skills, experiences and qualifications as they relate to each individual role that is taken to market.
These assessments can evaluate technical abilities and soft skills, ensuring that hiring decisions are grounded in actual capability rather than subjective assessments of inconsistently applied variables.
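The skills-based scoring described above can be illustrated with a minimal sketch. All skill names, weights and the scoring formula here are hypothetical examples for illustration only, not Hiremii’s actual implementation:

```python
# Illustrative sketch of role-specific, skills-based scoring.
# Skill names and weights below are hypothetical examples.

def score_candidate(candidate_skills, role_requirements):
    """Score a candidate against weighted role requirements.

    candidate_skills: set of skills parsed from the CV.
    role_requirements: dict mapping required skill -> weight.
    Returns a score between 0.0 and 1.0.
    """
    total_weight = sum(role_requirements.values())
    if total_weight == 0:
        return 0.0
    matched = sum(weight for skill, weight in role_requirements.items()
                  if skill in candidate_skills)
    return matched / total_weight

# A 'Process Engineer' brief in mining would carry different weighted
# requirements than one in manufacturing, so the same CV scores
# differently against each role taken to market.
mining_brief = {"metallurgical accounting": 3,
                "flotation circuits": 2,
                "plant commissioning": 1}
candidate = {"flotation circuits", "plant commissioning", "excel"}
print(score_candidate(candidate, mining_brief))  # (2 + 1) / 6 = 0.5
```

Because the requirements and weights are defined per role, the same candidate produces a different score for each brief, grounding the comparison in stated capability rather than a recruiter’s impression.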
Mitigating bias in AI systems
Human influence over hiring decisions is unlikely to be eliminated, nor should it be. However, Hiremii’s AI technology utilises a range of strategies to ensure recruiter bias is minimised whilst maximising the control available to the talent acquisition team. These strategies include the following:
- Understanding context: Hiremii understands that the terminology used to describe roles and skills often changes based on context. A Process Engineer in mining does not have the same skills as a Process Engineer in manufacturing, construction or energy. Similarly, the terminology, tools and skills used to describe capabilities vary in each circumstance. For this reason, Hiremii ensures the industry context is well understood before AI is even applied.
- Decoding the Position Description: Before assessing candidates, Hiremii pushes recruiters to detail and prioritise the skills required in the position description. This step is uncommon in AI-enabled applicant tracking solutions, many of which simply identify, match and score keywords from the position description and CV without explicitly confirming the requirements with the recruiter.
- Transparent weighting and scoring: Even after a recruiter confirms the details of a position description and prioritises skills, there can be a need to fine-tune or even completely redefine the brief after a role has been advertised. For this reason, Hiremii’s shortlisting technology allows the selection criteria to be adjusted after applicant CVs have been parsed and scored. If skills are re-weighted, applicants are re-scored dynamically in real time; if the skill requirements are completely changed, candidate CVs are re-scored against the new criteria. Regardless of how the brief is adjusted, recruiters can see the result of their adjustments, ensuring complete transparency.
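The adjust-and-rescore behaviour described in the last strategy can be sketched as follows. This is a simplified illustration under assumed data structures (a parsed skill set per candidate and a weight per required skill); the class, method names and scoring rule are invented for the example and do not describe Hiremii’s actual system:

```python
# Illustrative sketch: re-weighting already-parsed applicants in
# real time. All names here are hypothetical; CV parsing is stubbed
# out as a plain set of skills.

class Shortlist:
    def __init__(self, weights):
        self.weights = dict(weights)   # required skill -> weight
        self.parsed = {}               # candidate name -> set of skills

    def add_candidate(self, name, cv_skills):
        # In a real pipeline these skills would come from CV parsing.
        self.parsed[name] = set(cv_skills)

    def scores(self):
        # Weighted fraction of the brief each candidate covers.
        total = sum(self.weights.values()) or 1
        return {name: sum(w for skill, w in self.weights.items()
                          if skill in skills) / total
                for name, skills in self.parsed.items()}

    def reweight(self, new_weights):
        # Re-weighting keeps the parsed CV data intact; only the
        # weighted sums are recomputed, so re-scoring is immediate
        # and the recruiter sees the effect of the change at once.
        self.weights = dict(new_weights)
        return self.scores()
```

Keeping the parsed CVs separate from the weights is what makes the real-time adjustment cheap: changing weights only recomputes a sum, whereas swapping in entirely new skill criteria would require re-evaluating each CV against them.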
Conclusion
When carefully programmed and focused on evaluating skills, AI has the potential to positively influence recruitment decisions by minimising bias and promoting a more inclusive hiring process. However, it is imperative to approach the deployment of AI in recruitment with caution, ensuring that these systems are designed, trained, and monitored to uphold fairness, transparency, and ethical standards.