Making hiring technology accessible means ensuring both that a candidate can use the technology and that the skills it measures do not unfairly exclude candidates with disabilities, says Alexandra Givens, the CEO of the Center for Democracy and Technology, an organization focused on civil rights in the digital age.
AI-based recruiting tools often fail to include people with disabilities in their training data, she says. Because such people have long been excluded from the workforce, algorithms modeled on a company’s previous hires will not reflect their potential.
Even if the models could account for outliers, the way a disability presents varies considerably from person to person. Two people with autism, for example, can have very different strengths and challenges.
“As we automate these systems and employers push for what’s fastest and most efficient, they’re losing the chance for people to actually show their skills and ability to do the job,” says Givens. “And that’s a huge loss.”
A hands-off approach
Government regulators have struggled to monitor AI recruiting tools. In December 2020, 11 senators wrote a letter to the United States Equal Employment Opportunity Commission expressing concerns about the use of hiring technologies during the covid-19 pandemic. The letter asked about the agency’s authority to investigate whether these tools discriminate, particularly against people with disabilities.
The EEOC responded with a letter in January that was leaked to MIT Technology Review. In it, the commission said it cannot investigate AI recruiting tools without a specific allegation of discrimination. The letter also raised concerns about the industry’s reluctance to share data and said that variation among software from different companies would prevent the EEOC from instituting general policies.
“I was surprised and disappointed when I saw the response,” says Roland Behm, a lawyer and advocate for people with behavioral health conditions. “The whole tenor of that letter seemed to make the EEOC seem like more of a passive bystander rather than an enforcement agency.”
The agency usually begins an investigation once an individual files a discrimination complaint. With AI hiring technology, however, most applicants don’t know why they were rejected for a job. “I believe a reason we haven’t seen more enforcement action or private litigation in this area is due to the fact that candidates don’t know that they’re being graded or assessed by a computer,” says Keith Sonderling, an EEOC commissioner.
Sonderling says he believes that artificial intelligence will improve the hiring process, and he hopes the agency will issue guidance for employers on how best to implement it. He says he would welcome oversight from Congress.