In Mobley v. Workday, Inc. (N.D. Cal., May 2025), Judge Rita Lin granted preliminary certification to a nationwide collective of job applicants who allege that Workday's AI hiring tools systematically discriminated against older workers. During discovery, Workday disclosed a staggering number: applicants screened by its AI tools had been rejected 1.1 billion times across its client base. That single data point makes this the largest employment discrimination case in history by volume.

This isn't a case against an employer — it's a case against the AI vendor that built the hiring algorithm. That distinction makes Mobley a template for a new category of litigation. Every company selling AI decision-making tools to employers is now a potential defendant, and plaintiffs' attorneys have a roadmap for how to get class certification.

Class Certification and the 1.1 Billion Rejected Applications

Judge Lin granted preliminary certification of the collective under the ADEA's opt-in mechanism (29 U.S.C. § 216(b), incorporated by § 626(b)), the statute's analogue of a class action, rather than under Rule 23. The key finding: because Workday's AI tool applies the same algorithmic model to every applicant processed through its system, the proposed members are "similarly situated." Every one of them was screened by the same code.

The 1.1 billion rejection figure came from Workday's own discovery responses. The company's AI hiring tools are used by over 10,000 employers worldwide, processing millions of applications daily. Workday argued that each employer configures the tool differently, so individual issues predominate. Judge Lin rejected that argument — the underlying model architecture and training data are consistent across deployments.

Plaintiffs' expert demonstrated that the model's rejection rates for applicants over 40 were higher than for younger applicants by a statistically significant margin, even after controlling for qualifications. The court found that showing sufficient at the certification stage.

Why Suing the AI Vendor Changes Everything

Before Mobley, AI hiring discrimination claims targeted individual employers. That's a slow, expensive, one-company-at-a-time approach. Mobley went upstream — suing Workday as the vendor that built and sold the discriminatory tool. If the algorithm is biased, every employer using it is deploying a biased system. One lawsuit covers them all.

The legal theory is straightforward. Under the Age Discrimination in Employment Act (ADEA), it's unlawful to "limit, segregate, or classify" applicants in ways that deprive them of employment opportunities because of age. Plaintiffs argue Workday's AI does exactly that — and because Workday controls the model, Workday is liable as an "agent" of the employers who deploy it.

Judge Lin allowed the ADEA claims to proceed, finding that Workday's role goes beyond a passive software provider. Workday trains the model, updates the model, and markets it as a decision-making tool. That level of control supports vendor liability. This theory will be replicated against every AI hiring platform in the market.

The ADEA Claims and Disparate Impact Theory

The plaintiffs are proceeding under a disparate impact theory, which means they need not prove that Workday intended to discriminate. They must show that the AI tool produces outcomes that disproportionately harm older workers; the burden then shifts to Workday to justify the screening criteria as reasonable factors other than age.

The statistical evidence is damning. Plaintiffs' expert analyzed a sample of 2.3 million applications processed through Workday's system and found that applicants over 40 were rejected at rates 23% higher than applicants under 40 with comparable qualifications. The model appears to penalize resume features correlated with age — career gaps, older graduation dates, experience with legacy technologies — even though age itself isn't an explicit input.
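To make the disparate-impact math concrete, here is a minimal sketch using only Python's standard library. The counts are hypothetical, chosen solely so the over-40 rejection rate comes out 23% higher than the under-40 rate, matching the figure above; the actual expert analysis also controlled for qualifications, which this raw two-group comparison does not.

```python
import math

# Hypothetical counts for illustration only (not from the case record).
over40_total, over40_rejected = 1_000_000, 615_000
under40_total, under40_rejected = 1_300_000, 650_000

p1 = over40_rejected / over40_total    # rejection rate, applicants 40+
p2 = under40_rejected / under40_total  # rejection rate, applicants under 40

# Pooled two-proportion z-test: is the gap larger than chance would produce?
p = (over40_rejected + under40_rejected) / (over40_total + under40_total)
se = math.sqrt(p * (1 - p) * (1 / over40_total + 1 / under40_total))
z = (p1 - p2) / se

# Selection-rate ratio used in the EEOC's four-fifths rule
# (it compares selection rates, i.e. 1 - rejection rate).
ratio = (1 - p1) / (1 - p2)

print(f"rejection rate 40+: {p1:.1%}, under 40: {p2:.1%}")
print(f"z = {z:,.1f}")
print(f"selection-rate ratio: {ratio:.2f} (four-fifths threshold: 0.80)")
```

At sample sizes in the millions, even a modest rate gap yields an enormous z-statistic, and the 0.615 versus 0.500 rejection rates here put the selection-rate ratio at 0.77, below the 0.80 threshold the EEOC's four-fifths rule treats as evidence of adverse impact.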

Workday's defense is the ADEA's "reasonable factors other than age" (RFOA) exception: the model selects for job-relevant qualities, and any age-correlated impact is incidental to those reasonable factors. But at the certification stage, that defense is a merits question the court deferred. The class is certified, and Workday has to litigate the bias question with 1.1 billion data points on the table.

Template for AI Vendor Liability Litigation

Mobley gives plaintiffs' attorneys a replicable playbook. Step one: identify an AI vendor whose tool makes consequential decisions (hiring, lending, insurance, housing). Step two: obtain statistical evidence of disparate impact through discovery. Step three: argue vendor liability based on the vendor's control over model training and deployment. Step four: seek class certification based on the algorithm's uniform application.

The AI vendors most exposed are those selling decision-making tools — not just analytics or recommendations, but systems that produce accept/reject outputs that employers follow without independent review. HireVue, Pymetrics (now Harver), and similar platforms face identical legal theories.

For companies buying AI hiring tools, Mobley creates a due diligence obligation. If your vendor's algorithm is later found discriminatory, you're exposed too, both as an employer whose rejections fed the class and potentially as a direct defendant. Demand bias audits, algorithmic impact assessments, and contractual indemnification before deploying any AI screening tool.

What Happens Next and Why It Matters

With class certification granted, Mobley moves to merits discovery and potentially trial. Workday's options are limited: settle (expensive, given the class size), win on the merits (difficult, given the statistical evidence), or get certification reversed on interlocutory appeal (unlikely, given Judge Lin's thorough opinion).

A settlement could reach nine figures or more given the size of the class. The ADEA provides back pay plus an equal amount in liquidated damages for willful violations, and even a modest per-application recovery multiplied across 1.1 billion rejections produces astronomical numbers: at one dollar per rejected application, exposure would exceed a billion dollars. Workday's market cap took a 12% hit the week class certification was announced.

The broader impact extends beyond employment. If AI vendors are liable for the discriminatory outputs of their tools, the same theory applies to AI systems used in lending decisions, insurance underwriting, tenant screening, and criminal risk assessment. Mobley is the first domino. The AI case law landscape is shifting from copyright to civil rights.

The Bottom Line: Mobley v. Workday is the first certified class-scale action to target the AI vendor rather than the employer for algorithmic hiring discrimination, and its 1.1 billion rejected applications make it the template for every AI bias case that follows.

AI-Assisted Research. This piece was researched and written with AI assistance, reviewed and edited by Manu Ayala. For deeper takes and the perspective behind the research, follow me on LinkedIn or email me directly.