©2025 Seyfarth Shaw LLP EEOC-INITIATED LITIGATION: 2025 EDITION

it is interested in employers’ use of artificial intelligence and automated systems in that regard. The Commission has emphasized its intent to investigate whether protected groups might be harmed – whether intentionally or not – by automated systems used to target job advertisements to particular populations, recruit workers, or aid in hiring decisions. Specifically, in the SEP, the EEOC has committed to focusing on the use of technology, AI, and machine learning in job advertisements, recruiting, and hiring decisions. The SEP emphasizes an employer’s use of all technology (not just “automated systems”) in hiring and recruitment as an area of strategic focus. The EEOC has historically focused on recruiting and hiring in part because private plaintiffs’ counsel have been unwilling to champion large-scale hiring cases due to cost and the challenge of identifying potential “victims.” The proliferation in recent years of electronic tools available to assist employers in finding talent in challenging labor markets may provide fertile ground for the EEOC on this issue. Since the SEP was published, EEOC leaders have participated in executive branch AI initiatives,70 and all EEOC personnel were asked by EEOC Chair Charlotte Burrows to attend an AI training on how front-line staff could “identify AI-related issues in [their] enforcement work.”71

An October 30, 2023 Executive Order set in motion action from multiple departments and independent agencies intended both to harness the benefits of AI and maintain American leadership in innovation, and to address the risks associated with the use of AI.72 The Executive Order made clear that agencies charged with enforcing civil rights laws should make “comprehensive use of their respective authorities” to address potential civil-rights harms arising from the use of AI, including “issues related to AI and algorithmic discrimination.” On June 3, 2024, the EEOC announced the appointment of Sivaram Ghorakavi as Deputy Chief Information Officer and Chief Artificial Intelligence Officer.73 Ghorakavi’s appointment was made in accordance with the President’s Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence, which requires all federal agencies to designate an officer for the use of artificial intelligence. Ghorakavi is charged with coordinating intra-departmental and cross-agency efforts on AI and adjacent issues, in line with the EEOC’s AI and Algorithmic Fairness Initiative and its strategic enforcement plan, both of which make preventing and remedying employment discrimination arising from the use of technology a priority. To date, the EEOC has not initiated litigation of its own to challenge an employer’s use of artificial intelligence in the workplace. However, the EEOC did step into the fray by filing an amicus brief in support of claims by a private plaintiff in Mobley v. Workday.74 In Mobley, the plaintiff alleges that Workday engaged in a pattern or practice of discriminatory job screening that disproportionately disqualifies African-Americans, individuals over 40, and individuals with disabilities from securing employment, in violation of Title VII, Section 1981, the ADEA, the ADA, and California state law.
Specifically, the plaintiff alleges that Workday’s AI and algorithms are more likely to reject job applicants who are African-American, over 40, or have a disability, and asserts that Workday acted as an employment agency or, in the alternative, as an indirect employer or agent of the employer. Mobley seeks class certification.

70 https://www.whitehouse.gov/ostp/ai-bill-of-rights/

71 https://news.bloomberglaw.com/daily-labor-report/eeoc-to-train-staff-on-ai-based-bias-as-enforcement-efforts-grow

72 https://www.seyfarth.com/news-insights/president-biden-signs-executive-order-setting-forth-broad-directives-for-artificial-intelligence-regulationand-enforcement.html

73 See U.S. Equal Employment Opportunity Commission, EEOC Appoints Sivaram Ghorakavi as Deputy Chief Information Officer and Chief Artificial Intelligence Officer (Jun. 3, 2024), https://www.eeoc.gov/newsroom/eeoc-appoints-sivaram-ghorakavi-deputy-chief-information-officer-and-chiefartificial

74 See Rachel See, Annette Tyman, and Samantha Brooks, EEOC Argues Vendors Using Artificial Intelligence Tools Are Subject to Title VII, the ADA and ADEA Under Novel Theories in Workday Litigation, Workplace Class Action Blog (Apr. 29, 2024), https://www.seyfarth.com/news-insights/legalupdate-eeoc-argues-vendors-using-artificial-intelligence-tools-are-subject-to-title-vii-the-ada-and-adea-under-novel-theories-in-workday-litigation.html. The district court granted Workday’s motion to dismiss Mobley’s original complaint but granted the plaintiff leave to amend his complaint. Workday again moved to dismiss. On this second round of briefing, the EEOC submitted an amicus brief taking a novel position in support of Mobley’s class-action theory that an AI vendor could be directly liable under Title VII, the ADA, or the ADEA for employment discrimination caused by the use of the vendor’s AI.
Among other things, the EEOC argued that by actively making automated decisions to reject or advance candidates before referring them to employers, Workday functions as an employment agency under the law. In its amicus submission, the EEOC drew an analogy to IRS rules stating that tax preparation software can be considered a tax preparer if the software does more than provide “mere mechanical assistance.” So too, according to the EEOC, with algorithmic tools that go beyond that threshold in the employment context. The district court subsequently issued a split decision that allowed the plaintiff’s agency theory, as supported by the EEOC in its amicus brief, to proceed.75 The Court’s opinion emphasized the importance of the “agency” theory in addressing potential enforcement gaps in the anti-discrimination laws. In this regard, the Court illustrated the potential gaps with a hypothetical scenario: a software vendor intentionally creates a tool that automatically screens out applicants from historically Black colleges and universities, unbeknownst to the employers using the software. Without the agency theory, the Court opined, no party could be held liable for this intentional discrimination. By construing federal anti-discrimination laws broadly and adapting traditional legal concepts to the evolving relationship between AI service providers and employers, the Court’s decision was based, in part, on a desire to avoid potential loopholes in liability. The ruling opens the door to a significant expansion of liability for AI vendors in the hiring process, with potentially far-reaching implications both for AI service providers and for employers using those tools.
Employers, HR vendors and service providers, and AI developers should take note: even though the allegations in the amended complaint may not reflect the reality of how most employers are using AI in hiring, the EEOC’s position is likely to embolden plaintiffs to pursue similar claims. Against this backdrop, employers should be mindful that any charge of discrimination filed with the EEOC that mentions the use of artificial intelligence – or any other technology in hiring – will not only qualify for priority handling, but is also likely to receive additional scrutiny from EEOC management. Likewise, because the SEP is used to inform the EEOC’s selection of litigation, it would not be surprising to see EEOC litigators mining for AI cases to develop and bring.

2. Other Technology in Hiring and the Path Ahead on AI

Importantly, the EEOC has signaled that it is focusing on all uses of technology in recruitment and hiring, not just artificial intelligence. In August 2023, the EEOC entered into a settlement agreement with iTutorGroup, which many media reports and commenters characterized as the EEOC’s “first ever” case involving artificial intelligence discrimination in hiring. However, according to the EEOC’s complaint, the underlying hiring technology simply asked job applicants for their date of birth and was configured to automatically reject female applicants age 55 or older and male applicants age 60 or older. To be clear, automatically rejecting older job applicants when their birthdates are already known does not require any sort of artificial intelligence or machine learning.76

75 See Rachel See and Annette Tyman, Mobley v. Workday: Court Holds AI Service Providers Could Be Directly Liable for Employment Discrimination Under “Agent” Theory, Workplace Class Action Blog (Jul. 19, 2024), https://www.seyfarth.com/news-insights/mobley-v-workday-court-holds-aiservice-providers-could-be-directly-liable-for-employment-discrimination-under-agent-theory.html.
76 https://www.seyfarth.com/news-insights/eeocs-settlement-challenging-simple-algorithm-provides-warning-for-employers-using-artificialintelligence.html