Landmark Lawsuits
Legal challenges that are reshaping AI hiring accountability
Derek Mobley, a Black man over 40 with anxiety and depression, applied to over 100 jobs through companies using Workday's AI screening tools. He was rejected every time - often within minutes, including one rejection at 1:50 AM, less than an hour after submitting his application. The lawsuit alleges Workday's algorithms discriminate based on race, age, and disability.
Key Developments
- July 2024: Court ruled Workday can be held liable as an "agent" of employers - setting precedent that AI vendors can face direct discrimination liability
- May 2025: Case certified as nationwide collective action under ADEA (Age Discrimination in Employment Act)
- July 2025: Scope expanded to include HiredScore AI features
- Potential class size: Workday disclosed 1.1 billion applications were processed - the collective could include "hundreds of millions" of members
Why this matters: This is the first major court ruling that AI vendors - not just employers - can be held directly liable for algorithmic discrimination. It opens the door for job seekers to sue the software companies, not just the hiring companies.
The EEOC's first-ever AI discrimination case. iTutorGroup programmed its hiring software to automatically reject female applicants aged 55+ and male applicants aged 60+. The discrimination came to light when an applicant submitted two identical applications with different birth dates - only the one with the younger birth date received an interview invitation.
Settlement Terms
- $365,000 payment to over 200 rejected applicants
- Mandatory anti-discrimination policies and training
- All rejected applicants invited to reapply
- Company must stop requesting birth dates from applicants
Why this matters: The EEOC proved that "the algorithm did it" is not a defense. Even relatively simple age-based filters violate federal law. This case signals that the EEOC treats algorithmic discrimination as a top enforcement priority.
D.K., a deaf Indigenous woman, worked for Intuit since 2019 with positive reviews and bonuses. When she applied for a promotion in 2024, Intuit required her to complete an AI video interview through HireVue. She requested human-generated captioning as an accommodation. Intuit denied the request. The automated subtitles failed - portions had no captions at all. The AI gave her low scores and recommended she "practice active listening." She didn't get the promotion.
Allegations
- HireVue's speech recognition performs worse for deaf applicants and for speakers of non-white English dialects
- Failure to provide reasonable accommodation under ADA
- Race discrimination under Title VII (Native American)
- Violations of Colorado Anti-Discrimination Act
A job applicant alleged CVS broke Massachusetts law by requiring applicants to take HireVue video interviews that analyzed facial expressions (smiles, smirks) and assigned "employability scores" measuring "conscientiousness and responsibility" and "innate sense of integrity and honor" - essentially functioning as an illegal lie detector test.
Key Issues
- AI video analysis of facial expressions used to assess "integrity"
- Allegedly violated Massachusetts' polygraph/lie detector law
- Settlement terms not disclosed
The Research
Peer-reviewed studies documenting AI hiring bias
AI resume screening tools preferred white-associated names 85% of the time vs. Black-associated names just 9% of the time.
In head-to-head comparisons, Black male-associated names were never preferred over white male-associated names. Zero percent.
An estimated 99% of Fortune 500 companies now use some form of AI-powered screening in their hiring process.
42% of employers using AI hiring tools admit they're aware of potential bias - but still prioritize efficiency over fairness.
Roughly 7 in 10 companies allow AI tools to reject candidates without any human ever reviewing the decision.
Female-associated names received preference in just 11% of AI screening tests, compared to 52% for male names.
Your Legal Rights
Laws that protect you when AI is used in hiring decisions
If you're applying for jobs in New York City, Local Law 144 requires employers to disclose AI screening, publish bias audits, and offer alternatives.
10-Day Notice
Must be told at least 10 business days before AI is used to evaluate you
Bias Audit
Annual third-party audit required; results must be published on company website
Alternative Process
You can request an alternative selection process or accommodations
Penalties
$500-$1,500 per violation; $10,000+ per week of continued violation
File a complaint: Report violations to NYC Dept. of Consumer and Worker Protection at nyc.gov/dca or call 311.
If you do a video interview in Illinois, the state's Artificial Intelligence Video Interview Act requires employers to disclose AI analysis and get your consent.
Disclosure
Must be told AI is analyzing your interview before it happens
Explanation
Must explain how the AI works and what it's evaluating
Consent
Must get your written consent before using AI
Deletion
Must delete the video within 30 days upon your request
Colorado is implementing comprehensive AI regulations requiring "reasonable care" to prevent algorithmic discrimination.
Reasonable Care
Companies must take steps to prevent algorithmic discrimination
Notification
Must be notified when AI makes adverse decisions about you
Explanation
Right to explanation of how AI made the decision
Appeal
Right to human review and appeal process
The EU classifies hiring AI as "high-risk" with strict requirements. Affects EU companies and those hiring in Europe.
Human Oversight
Humans must have authority to intervene and override AI decisions
Transparency
Must inform candidates when AI is making decisions about them
Bias Testing
Mandatory testing for discrimination before deployment
Penalties
Up to €35 million or 7% of global annual revenue, whichever is higher
Take Action
File an EEOC Complaint
If you believe you were discriminated against by AI hiring tools based on race, age, disability, or other protected characteristics.
File Online at EEOC.gov →
Report NYC LL144 Violations
If a NYC employer didn't notify you about AI screening or doesn't have a published bias audit.
Report to DCWP →
Join the Workday Collective Action
If you're 40+ and were rejected through Workday's platform since September 2020, you may be eligible to join.
Check Eligibility →
Share Your Story
Document your experience. Every story adds to the evidence of systemic discrimination.
Share Your Story →
Email Templates
Request human review of your application