We previously reported in May 2022 that the U.S. Equal Employment Opportunity Commission (“EEOC”), in conjunction with the U.S. Department of Justice, issued guidance to employers, employees, and applicants on the use of artificial intelligence tools. We also reported in February of this year on the EEOC’s January public hearing, which lasted almost four hours, was attended virtually by almost 3,000 members of the public, and featured testimony from 12 witnesses, including experts from the American Civil Liberties Union, the U.S. Chamber of Commerce, and the American Association of Retired Persons, as well as witnesses from law firms and universities.
The EEOC has now reached a tentative settlement of its first discrimination suit based on the use of AI software. In EEOC v. iTutorGroup, Inc., No. 22-cv-02565 (E.D.N.Y.), the EEOC alleged that iTutorGroup, which provides English-language tutoring services to students in China, used AI software that automatically rejected applicants based on their age. The discriminatory practice came to light after one of the rejected applicants resubmitted their application using a different birth date and was accepted for the position.
The EEOC and iTutorGroup have entered into a consent decree, which must be approved by Judge Pamela K. Chen. Under the decree’s terms, iTutorGroup will pay $365,000 to be distributed among more than 200 applicants who were denied jobs in violation of the Age Discrimination in Employment Act. The company must also invite all applicants rejected between March and April 2020 to reapply, adopt anti-discrimination policies, and conduct anti-discrimination training. If approved, the consent decree will remain in effect for five years.
This case highlights the aggressive stance that the EEOC is taking, and will continue to take, against companies that use AI software in their recruiting. Examples of the AI tools that concern anti-AI advocates and the EEOC include: “resume scanners that prioritize applications using certain keywords; employee monitoring software that rates employees on the basis of their keystrokes or other factors; ‘virtual assistants’ or ‘chatbots’ that ask job candidates about their qualifications and reject those who do not meet pre-defined requirements; video interviewing software that evaluates candidates based on their facial expressions and speech patterns; and testing software that provides ‘job fit’ scores for applicants or employees regarding their personalities, aptitudes, cognitive skills, or perceived ‘cultural fit’ based on their performance on a game or on a more traditional test.”
If you have questions about the EEOC’s guidance on and examination of artificial intelligence, or a general labor or employment question, feel free to contact Joel Hlavaty or any member of Frantz Ward’s Labor & Employment Group.