A robotic hand on the left picks up a line of cubes. Each cube has a computerized human figure, symbolizing that AI is picking the "right" candidate from the application pool. — Photo by 4CornerResources.com

Using computer programs to screen job candidates’ resumes is not a new HR practice among large corporations. AI, however, is accelerating this automation across companies of all sizes. Beyond resume screening, more companies now use AI to evaluate candidates’ performance in initial interviews, assessing factors such as word choice and cognitive ability.

AI speeds up the hiring process

AI can help companies speed up hiring because a single vacancy often draws hundreds or even thousands of applications. Relying solely on “human eyes” to screen every resume, or conducting as many live interviews as AI can process, is simply unrealistic.

Research and industry reports have widely documented HR managers’ biases in hiring practices. Now that AI is helping HR managers make decisions, do companies still need to be mindful of potential biases during the hiring process?

Understanding AI biases in hiring

Firms providing AI screening services often claim their algorithms eliminate human biases because machines focus on factors other than candidates’ demographics. Still, AI is neither perfect nor free of bias. Common forms include:

  • Data bias: Because AI relies on historical data to train its predictive model, it can generate biased recommendations if the training dataset contains biased information, such as favoring specific schools or one gender.
  • Algorithmic bias: When AI is trained to favor specific job titles or experiences that are more common in certain demographics, candidates from other groups might be screened out even when they are equally qualified.
  • Unfair assessment: Candidates who took time off to care for family members, including maternity leave, can be penalized because AI may not understand the context behind an employment gap.
  • Feedback loops: If the biases above are not corrected promptly, the AI tool’s future selections will only become more biased.
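To make the data-bias point concrete, here is a minimal Python sketch of how skewed historical labels can be surfaced before training. The records and field names (`gender`, `hired`) are hypothetical illustrations, not any vendor’s actual schema: a model trained on labels like these can simply learn to reproduce the historical skew.

```python
from collections import defaultdict

def hire_rate_by_group(records, group_key):
    """Compute the historical hire rate for each value of group_key.

    records: list of dicts like {"gender": "F", "hired": True}
    (hypothetical field names, for illustration only).
    """
    counts = defaultdict(lambda: [0, 0])  # group -> [hired, total]
    for r in records:
        g = r[group_key]
        counts[g][0] += int(r["hired"])
        counts[g][1] += 1
    return {g: hired / total for g, (hired, total) in counts.items()}

# Toy historical data: 1 of 4 "F" applicants hired vs. 2 of 4 "M"
records = [
    {"gender": "F", "hired": True},
    {"gender": "F", "hired": False},
    {"gender": "F", "hired": False},
    {"gender": "F", "hired": False},
    {"gender": "M", "hired": True},
    {"gender": "M", "hired": True},
    {"gender": "M", "hired": False},
    {"gender": "M", "hired": False},
]
rates = hire_rate_by_group(records, "gender")
print(rates)  # {'F': 0.25, 'M': 0.5}
```

A gap like this in the training labels is exactly the kind of signal worth investigating before the model ever scores a live candidate.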

Methods to avoid AI biases

To mitigate AI biases in hiring, companies must first train AI tools on diverse datasets. They should keep the potential biases above in mind and enforce ethical guidelines to oversee the automation process. Designing inclusive AI systems helps as well. Lastly, companies must regularly audit both the AI tools and the hiring process with human reviewers.
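One concrete audit check is the EEOC’s long-standing “four-fifths rule”: if the selection rate for any group falls below 80% of the rate for the most-selected group, that is treated as evidence of possible adverse impact. A short Python sketch, using hypothetical selection rates:

```python
def adverse_impact_ratio(selection_rates):
    """Ratio of the lowest group's selection rate to the highest group's.

    Under the EEOC's "four-fifths rule," a ratio below 0.8 is treated
    as evidence of possible adverse impact.
    """
    rates = list(selection_rates.values())
    return min(rates) / max(rates)

def flag_adverse_impact(selection_rates, threshold=0.8):
    """True if the selection rates fail the four-fifths rule."""
    return adverse_impact_ratio(selection_rates) < threshold

# Hypothetical audit of AI-screened outcomes by group:
rates = {"group_a": 0.30, "group_b": 0.21}
print(round(adverse_impact_ratio(rates), 2))  # 0.7
print(flag_adverse_impact(rates))             # True -> investigate the tool
```

Running a check like this on every hiring cycle, and keeping the results, gives human reviewers a simple, documented trigger for deeper investigation.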

Get ready for an EEOC audit

It is unclear when or how the EEOC would start auditing companies’ AI hiring practices. However, maintaining complete records and being transparent about the use of unbiased algorithms can help. Transparency is key to building trust and ensuring a fair hiring process.

What do those AI biases mean to job candidates?

AI has created a game-changing effect on how job seekers can secure a job offer. It deserves a more thorough discussion. Stay tuned for my next viewpoint on this topic.

Are you more excited or nervous about seeing more companies using AI in screening candidates? What suggestions will you make to companies already using AI in hiring?
