Applicant screening, video interview analysis, resume filtering and ranking, candidate sourcing, game-based cognitive and emotional assessments and predictive analytics are just some of the artificial intelligence functions businesses are increasingly relying on for hiring and employment practices.
Roughly 83% of employers, including 99% of Fortune 500 companies, use some form of automated tools as part of their hiring process, according to an estimate given by the U.S. Equal Employment Opportunity Commission in early 2023. Kelly Gust, president and principal talent consultant for HR Full Circle Consulting, a management consulting firm specializing in human resources and people management, is personally familiar with AI's use in hiring and employment practices. "People have been using AI tools in HR for a long time," said Gust. "Skills-based assessments and other tests help make decision making easier. Resume searches, candidate database searches, Indeed, LinkedIn, ZipRecruiter – they're all using AI. There are algorithms in the background matching job requirements and candidates and companies. Something as simple as if you've posted a job on LinkedIn and had LinkedIn suggest candidates to you – that's artificial intelligence."
One increasingly common tool is AI-driven video interviews that analyze candidates' responses, facial expressions, tone of voice and body language. The technology typically assesses communication skills and provides insights into whether the candidate is a good fit for a company's culture and role.
In 2019, Illinois lawmakers unanimously passed the Artificial Intelligence Video Interview Act, which went into effect in 2020. The first-of-its-kind law requires employers to inform candidates that AI is being used to assess their suitability, explain how the AI works and disclose which characteristics will be used in the evaluation.
However, if you are using these tools to find good culture fits, Gust cautions that a healthy dose of skepticism is warranted.
"I feel like people use culture fit as a blanket umbrella to disqualify candidates for a variety of reasons, and unfortunately, that can lead to implicit bias and adverse impact in a lot of cases," said Gust. "If you're going to use automated scoring for something as ambiguous as 'they're not a culture fit,' you better be able to define what that culture is and everybody in your organization should be able to define that very clearly and succinctly. Otherwise, what are you actually assessing? How do you know?"
Artificial intelligence is also being used to speed up the hiring process, with AI tools that scan resumes for keywords and qualifications, significantly reducing the time recruiters spend on initial screening. This frees HR professionals to focus on the candidates AI deems most promising, while job seekers filtered out by the technology may never have their applications reviewed by a real person.
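For illustration only, the keyword screening described above can be sketched in a few lines of Python. The keywords, threshold and resumes here are hypothetical, not drawn from any real vendor's tool:

```python
# Hypothetical keyword-based resume screener, for illustration only.
# The required keywords and threshold are invented, not from a real system.
REQUIRED_KEYWORDS = {"python", "sql", "project management"}

def keyword_score(resume_text: str) -> float:
    """Fraction of required keywords found in the resume text."""
    text = resume_text.lower()
    hits = sum(1 for kw in REQUIRED_KEYWORDS if kw in text)
    return hits / len(REQUIRED_KEYWORDS)

def screen(resumes: dict[str, str], threshold: float = 0.5) -> list[str]:
    """Return applicants whose keyword score meets the threshold.
    Everyone below the threshold is filtered out before a human
    ever reads their application."""
    return [name for name, text in resumes.items()
            if keyword_score(text) >= threshold]
```

Note that the threshold alone decides which applications a recruiter ever sees: a qualified candidate who describes the same skills in different words scores zero and disappears from the process.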
While some have touted AI's potential to mitigate bias in recruiting, the technology is far from perfect. Last year, the EEOC settled its first lawsuit concerning AI bias in hiring. In that case, an applicant who had been screened out resubmitted the same application with a younger birthdate and landed an interview. At another company, applicants whose resumes mentioned baseball received higher marks, while resumes mentioning softball were penalized – an indirect bias tied to gender.
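To make the baseball-versus-softball example concrete, here is a hypothetical sketch of how a scoring model trained on biased historical data can encode a gender proxy. The terms and weights are invented for demonstration, not taken from any real case:

```python
# Hypothetical weights a resume-scoring model might learn from historical
# hiring data in which "baseball" happened to correlate with past
# (mostly male) hires. All terms and weights are invented for illustration.
LEARNED_WEIGHTS = {"engineering": 2.0, "team captain": 1.0,
                   "baseball": 0.5, "softball": -0.5}

def resume_score(resume_text: str) -> float:
    """Sum the weights of every learned term found in the resume."""
    text = resume_text.lower()
    return sum(weight for term, weight in LEARNED_WEIGHTS.items()
               if term in text)
```

Two otherwise identical resumes that differ only in the sport mentioned receive different scores, even though neither word says anything about the candidate's ability to do the job.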
A recent legislative development currently awaiting the governor's signature, Illinois HB3773, emphasizes the responsible use of AI in hiring. The bill aims to ensure that AI tools used in employment do not discriminate on the basis of protected characteristics. According to a press release from the bill's sponsor, state Sen. Javier Cervantes, D-Chicago, "any employer who uses artificial intelligence in a prohibited manner or fails to notify an employee of the employer's use of artificial intelligence would be in violation of the Illinois Human Rights Act."
Concerns around AI bias are exactly why Gust says it's imperative that employers be skeptical of these tools: "If I'm hiring a custodian, how much does that custodian need to demonstrate oral and written communication skills and use all of the words that are in my mission statement and be a culture fit? Maybe they do, maybe they don't," Gust said. "Sometimes we look for things that are not actually related to the job we're asking the person to do. That can be a problem."
Gust pointed to uniform guidelines for employee selection and to the statistical rigor an assessment tool should be able to demonstrate: evidence that it actually predicts what it claims to predict, such as job performance or culture fit.
"You need to show any AI tool does what it says it's going to do, measures what it purports to measure, and that it is free from biased discrimination and adverse impact," Gust said.
According to Gust, this requires some due diligence: "Do your research. Ask to see the technical report. Ask to bring in someone who knows to ask the right questions: Does this tool actually predict job performance? In my organization, in this job, is the tool reliable, fair and unbiased? How does this tool fit in my process?"
Annie Fulgenzi is a second-year law student at the University of Illinois who is particularly interested in issues related to artificial intelligence. She previously interned at Springfield Business Journal and Illinois Times while completing her undergraduate degree at SIUE.