Mobley v. Workday changed everything. When a federal court ruled in July 2024 that an AI vendor could itself be liable under Title VII, as an agent of the employers using its screening algorithm, it didn't just extend an existing theory of liability to new defendants. It blew open an entire practice area. Employment lawyers who understand AI aren't just advising clients anymore. They're building the playbook for the next decade of discrimination litigation.

The EEOC has made algorithmic discrimination a stated enforcement priority in its current Strategic Enforcement Plan, and plaintiff firms are already filing cases faster than defense counsel can staff them. Whether you're advising employers on compliant AI hiring tools or suing companies that let biased algorithms make employment decisions, AI literacy isn't optional. It's the core competency.


Mobley v. Workday: The Case That Rewrote AI Employment Law

In *Mobley v. Workday, Inc.* (N.D. Cal. 2024), the court held that Workday could plausibly be liable as an agent of its employer-clients under Title VII, the ADEA, and the ADA, even though Workday is a third-party software vendor rather than the employer. The plaintiff alleged that Workday's AI screening tools systematically rejected his applications based on race, age, and disability.

The ruling's impact is massive. It means AI vendors aren't shielded from discrimination claims just because they're not the employer. It creates joint liability exposure for every company using third-party AI hiring tools. And it puts the burden on employers to audit and validate the AI tools they're using — or face liability for the vendor's algorithmic bias.

For employment lawyers, this is a goldmine. Plaintiff firms can now name both the employer and the AI vendor. Defense counsel need to audit every AI tool in their client's HR stack. And the vendors themselves need employment counsel who understand both the technology and the discrimination framework.

EEOC Guidance: The Regulatory Framework Taking Shape

The EEOC's May 2023 technical assistance on AI and algorithmic discrimination laid out a clear framework: if an AI selection tool produces a disparate impact, the employer is on the hook unless it can show the tool is job-related and consistent with business necessity. The agency doesn't care whether the employer built the algorithm or bought it off the shelf. And the four-fifths rule, a rough screen rather than a safe harbor, applies to AI screening just as it applies to human decision-making.
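The four-fifths rule is simple arithmetic: compare each group's selection rate to the most-selected group's rate, and flag any ratio below 0.8. A minimal sketch, using entirely hypothetical screening numbers (no real audit data):

```python
# Illustrative four-fifths rule check on AI screening outcomes.
# All counts are hypothetical; a real adverse impact analysis needs
# counsel and a validation expert, and the 0.8 threshold is a rough
# screen, not a legal safe harbor.

def selection_rate(passed, applied):
    """Fraction of applicants the screening tool advanced."""
    return passed / applied

# Hypothetical outcomes from an AI resume screener, by group
groups = {
    "group_a": selection_rate(passed=120, applied=200),  # 60%
    "group_b": selection_rate(passed=45, applied=100),   # 45%
}

highest = max(groups.values())
for name, rate in groups.items():
    impact_ratio = rate / highest
    flag = "potential adverse impact" if impact_ratio < 0.8 else "within guideline"
    print(f"{name}: selection rate {rate:.0%}, impact ratio {impact_ratio:.2f} ({flag})")
```

Here group_b's ratio is 0.45 / 0.60 = 0.75, below the four-fifths threshold, so the tool would be flagged for further validation even though both groups see substantial selection rates.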

The EEOC's Strategic Enforcement Plan (2024-2028) lists AI and algorithmic fairness as a top priority. They're actively investigating complaints involving AI hiring tools, automated performance evaluations, and algorithmic termination decisions. Commissioners have publicly stated they expect a wave of enforcement actions targeting companies using unvalidated AI in employment decisions.

For managing partners: if your employment practice isn't advising clients on AI tool audits, disparate impact testing, and vendor contract provisions for algorithmic accountability, you're leaving revenue on the table and leaving clients exposed.

AI Hiring Tools: Auditing for Compliance Before the Lawsuit Hits

New York City's Local Law 144 requires annual bias audits for any automated employment decision tool used in hiring or promotion. Illinois regulates AI analysis of video interviews, and Maryland restricts facial recognition technology in hiring. Colorado's AI Act (effective 2026) creates affirmative obligations for developers and deployers of "high-risk" AI systems in employment, and it has teeth.

The compliance work is substantial and recurring. Employers using AI hiring tools need adverse impact analyses, validation studies, reasonable accommodation protocols for AI-screened candidates, and documentation of human oversight. Every one of these requirements is billable work for employment counsel.

Smart firms are building AI audit practices now. The engagement model: conduct an initial audit of the client's AI tools ($15K-$50K), then provide ongoing monitoring and annual recertification. It's annuity revenue in a practice area where most work is episodic litigation.

Using AI in Your Own Employment Practice

Employment litigation generates massive document volumes: years of emails, performance reviews, HR files, and communications. AI document review tools cut through this volume at a fraction of traditional review costs. Firms handling wage-and-hour class actions report cutting document review time by 60-70% with AI-assisted classification.

For plaintiff firms, AI tools can analyze patterns across multiple EEOC charges to identify systemic discrimination that individual complaints might miss. Feed an AI tool 500 charge files and it can surface patterns in termination timing, performance rating distributions, and demographic correlations that would take a paralegal months to compile.
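The aggregation step itself is straightforward once charge files have been reduced to structured records. A minimal sketch, with invented field names and records (a real pipeline would extract these fields from charge documents using an AI classification tool):

```python
# Hypothetical sketch of cross-charge pattern analysis: grouping
# termination timing by demographic group across many charge files.
# Records and field names are invented for illustration only.
from collections import defaultdict
from statistics import mean

charges = [
    {"group": "40_and_over", "days_from_review_to_termination": 30},
    {"group": "40_and_over", "days_from_review_to_termination": 25},
    {"group": "under_40", "days_from_review_to_termination": 120},
    {"group": "under_40", "days_from_review_to_termination": 95},
]

by_group = defaultdict(list)
for charge in charges:
    by_group[charge["group"]].append(charge["days_from_review_to_termination"])

for group, days in sorted(by_group.items()):
    print(f"{group}: n={len(days)}, mean days review-to-termination = {mean(days):.0f}")
```

A gap like the one in this toy data (terminations following negative reviews far faster for older workers) is exactly the kind of timing pattern that supports a systemic theory but is invisible in any single charge file.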

On the defense side, AI can review an employer's entire decision-making history to build the legitimate-business-reason defense. Automated analysis of promotion patterns, compensation data, and performance metrics can either confirm your client's position or give you early warning that settlement is the smart play.

Algorithmic Discrimination Litigation: Building the Plaintiff's Case

Plaintiff firms are developing a new litigation playbook for algorithmic discrimination cases. The key challenges are proving the AI tool caused the adverse action, establishing disparate impact without access to the algorithm's training data, and overcoming the employer's defense that a human made the final decision.

The discovery battles are fierce. Employers and vendors claim the algorithm is a trade secret. Plaintiffs argue that disparate impact testing requires access to the model's inputs, outputs, and training data. Courts are still working out the framework, but early rulings suggest that statistical output analysis may be sufficient — you don't need to reverse-engineer the algorithm if you can show the results are discriminatory.
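What "statistical output analysis" looks like in practice: test whether the gap in observed pass rates is larger than chance would explain, using only the tool's inputs and outcomes, with no access to the model. A sketch with hypothetical counts (a real case would rely on a testifying statistician, and on more than a single test):

```python
# Two-proportion z-test on screening outcomes: did group A and group B
# pass the AI screen at rates that differ by more than chance?
# Counts are hypothetical; this illustrates the approach, not a
# litigation-ready analysis.
from math import sqrt, erf

def two_proportion_z(pass_a, n_a, pass_b, n_b):
    """Two-sided z-test for a difference in selection rates."""
    p_a, p_b = pass_a / n_a, pass_b / n_b
    pooled = (pass_a + pass_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

z, p = two_proportion_z(pass_a=300, n_a=500, pass_b=180, n_b=400)
print(f"z = {z:.2f}, p = {p:.6f}")  # a small p-value means the gap is unlikely to be chance
```

With these numbers the 60% versus 45% pass-rate gap yields a z-statistic above 4, far past conventional significance, which is precisely the kind of outputs-only showing plaintiffs argue should suffice without reverse-engineering the algorithm.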

Expert witnesses are critical. You need someone who can explain to a jury how a neural network perpetuates historical bias in hiring data. The firms building relationships with AI fairness researchers and data scientists now will have a massive advantage when these cases go to trial.

The Bottom Line: AI employment law isn't a niche — it's the future of employment litigation. Between Mobley v. Workday, EEOC enforcement priorities, and state AI hiring laws, every employment practice needs AI competency. Plaintiff firms have new theories of liability and new defendants to name. Defense firms have new compliance revenue streams. And the firms that build AI audit practices now will own a recurring revenue model that grows with every new regulation.

AI-Assisted Research. This piece was researched and written with AI assistance, reviewed and edited by Manu Ayala. For deeper takes and the perspective behind the research, follow me on LinkedIn or email me directly.