EEOC settles first AI-bias case, but many more are likely
Employment attorneys caution that companies will be on the hook if a vendor's hiring software turns out to perpetuate bias.
An AI-bias-in-hiring lawsuit that the Equal Employment Opportunity Commission settled this week was the first case of its kind brought by the agency. But employment lawyers expect many more to come, as well as more suits filed by job candidates who believe they were victims of AI bias.
A consent decree filed in the U.S. District Court for the Eastern District of New York on Wednesday says China-based iTutorGroup will pay $365,000 to more than 200 job candidates who were automatically screened out as too old by iTutor’s software.
iTutor hires tutors in the U.S. to teach English to Chinese-speaking students in China. According to the EEOC, iTutor's hiring software automatically eliminated from contention female candidates over 55 and male candidates over 60.
The EEOC filed the suit in May 2022, the same month it issued guidance to employers highlighting the potential for AI bias in hiring and announcing that it would be on the lookout for violations. Seven months later, it included AI bias for the first time in its Strategic Enforcement Plan, which provides a road map for enforcement priorities through 2027.
While the iTutor case is notable as the first, employment lawyers say it’s not representative of the sorts of cases the agency or private parties are likely to bring in the future.
That's because the alleged practice, using date-of-birth screening to ax candidates from consideration, is a blatant violation of the Age Discrimination in Employment Act of 1967.
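The complaint doesn't disclose how iTutor's software was written, but a rule like the one the EEOC described can be expressed in a few lines. The cutoffs below come from the agency's allegations; everything else is a hypothetical sketch:

```python
from datetime import date

# Hypothetical reconstruction of the screening rule the EEOC described;
# iTutor's actual code has not been made public.
AGE_CUTOFF = {"female": 55, "male": 60}

def age(born: date, today: date) -> int:
    """Whole years elapsed between a birth date and today."""
    return today.year - born.year - ((today.month, today.day) < (born.month, born.day))

def passes_screen(gender: str, born: date, today: date) -> bool:
    # Rejecting applicants past a gender-specific age cutoff is the
    # clear-cut kind of rule the ADEA prohibits.
    return age(born, today) <= AGE_CUTOFF[gender]

print(passes_screen("female", date(1965, 6, 1), date(2023, 8, 11)))  # False: age 58 > cutoff 55
```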
More often, attorneys say, discrimination will be subtler and frequently unintentional. As an example, in a column this year for Law.com, Clark Hill attorneys Myriah Jaworski and Vanessa Kelly point to a hiring algorithm for engineering candidates that Amazon scrapped in 2018 after discovering it downgraded female applicants even when their gender was not identified.
The hitch was that the training data covered 10 years, a span when most of the candidates were white males. As a result, the algorithm learned to devalue resumes of candidates who graduated from all-women’s colleges and to penalize the word “women.”
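Amazon never released the model, but the failure mode it illustrates is easy to reproduce in miniature: train any text classifier on historically skewed outcomes and it will latch onto gendered proxy terms. The mini-corpus and labels below are invented; the mechanism is the point.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Invented mini-corpus: past hires skewed male, so "women's" co-occurs
# with rejection even though gender is never an explicit feature.
resumes = [
    "captain men's rugby club, java developer",
    "java developer, systems design",
    "women's chess club captain, java developer",
    "women's engineering society, systems design",
    "systems design, kernel contributor",
    "women's coding collective, kernel contributor",
]
hired = [1, 1, 0, 0, 1, 0]  # labels encode the biased history, not merit

vec = CountVectorizer()
X = vec.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The classifier has learned a negative weight for the proxy token "women".
weights = dict(zip(vec.get_feature_names_out(), model.coef_[0]))
print(f"learned weight for 'women': {weights['women']:+.3f}")  # negative
```

Dropping an explicit gender column does nothing here; the bias rides in on correlated vocabulary.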
“In short,” Jaworski and Kelly wrote, “a well-intentioned race- or gender-neutral automated tool may not, in fact, be race or gender neutral.”
Employment attorneys say many cases likely will be built on data analyses showing that certain protected classes of job candidates were rejected for openings at disproportionately high rates.
An example from that playbook is a class action filed in February against HR-software giant Workday, zeroing in on disqualification rates for Black, disabled, and over-40 candidates at companies using its software. Workday denies the allegations and has filed a motion to dismiss.
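Such analyses typically start from selection rates. Under the EEOC's long-standing "four-fifths" rule of thumb, a group's selection rate below 80% of the most-selected group's rate flags potential adverse impact. The numbers below are invented, but the arithmetic is the standard one:

```python
# Hypothetical outcomes from an automated screen: (applicants, advanced).
outcomes = {
    "under_40": (1000, 300),  # 30% advance
    "40_plus": (800, 120),    # 15% advance
}

rates = {group: passed / applied
         for group, (applied, passed) in outcomes.items()}
best = max(rates.values())

for group, rate in rates.items():
    ratio = rate / best
    # Below 0.8, the four-fifths rule of thumb flags potential adverse impact.
    status = "POTENTIAL ADVERSE IMPACT" if ratio < 0.8 else "ok"
    print(f"{group}: rate {rate:.0%}, impact ratio {ratio:.2f} -> {status}")
```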
It’s not just vendors that need to worry, employment attorneys say.
They say the unnerving reality for companies and their legal departments is that, while AI tools can be immensely complicated, ignorance is no defense. Both the EEOC and the Federal Trade Commission have repeatedly emphasized that employers are liable for the AI tools they use.
"You can't duck your responsibilities by using a third party to deploy AI methods and then blaming them for any resulting discriminatory results," Fisher Phillips warned in a client note in May.
As a result, attorneys recommend that employers complete a rigorous screening process before selecting a vendor. That process should examine whether the datasets behind the AI are representative and include members of underrepresented groups, and whether the language sets used are multicultural rather than skewed toward a particular gender, race, or national origin, Jaworski and Kelly said in their column.
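What counts as "representative" depends on the role and the labor market, but a first-pass check is straightforward: compare the demographic mix of the vendor's training data against a benchmark population. Both distributions and the tolerance below are hypothetical:

```python
# Hypothetical shares; a real review would use the vendor's actual
# training-data statistics and a benchmark suited to the applicant pool.
training_mix = {"women": 0.22, "men": 0.78}
benchmark = {"women": 0.48, "men": 0.52}

TOLERANCE = 0.10  # a judgment call for illustration, not a legal standard

for group, expected in benchmark.items():
    gap = training_mix.get(group, 0.0) - expected
    if abs(gap) > TOLERANCE:
        print(f"{group}: training share off benchmark by {gap:+.0%}")
```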
Attorneys say employers can further reduce the risk of bias by conducting a self-audit, tapping a consulting firm with expertise in algorithmic auditing to assist.
Fisher Phillips strongly recommends employers bring outside counsel into that process, and not just for their legal expertise.
“Using counsel can help cloak your actions under attorney-client privilege, potentially shielding certain results from discovery,” the law firm said. “This can be especially beneficial if you identify changes that need to be made to improve your process to minimize any unintentional impacts.”