Johnson v. Dunn is the case that proved having an AI policy isn't enough if nobody follows it. In July 2025, a federal judge in Alabama disqualified three Butler Snow attorneys from a case and ordered bar regulators notified after they filed motions with hallucinated citations, despite the firm's written AI policies and its Artificial Intelligence Committee. The court called their conduct "reckless to the extreme."
Background
The underlying case was filed in the Northern District of Alabama. Three attorneys from Butler Snow LLP, a well-regarded national firm with over 350 lawyers, represented a party in the litigation. The firm had invested in AI governance. It had written AI policies. It had an Artificial Intelligence Committee. On paper, Butler Snow was doing everything the legal profession recommended.
Attorney Matthew B. Reeves used ChatGPT to generate citations for motions filed with the court. He inserted the citations without verifying a single one. Attorneys William R. Lunsford and William J. Cranford were also involved in the filings.
The gap between the firm's stated policies and the attorneys' actual conduct became the central issue. Butler Snow had the infrastructure for responsible AI use. These attorneys simply didn't use it.
What Happened
The court identified hallucinated legal citations in the filed motions. Judge Anna Manasco investigated and found that Reeves had used ChatGPT to generate the citations and inserted them directly into the filings without any verification.
Judge Manasco described the conduct as "reckless to the extreme." This wasn't a junior associate making a mistake. These were experienced practitioners at a prominent national firm. The court noted that the attorneys' experience level made their failure to verify even more inexcusable.
The investigation also revealed that Butler Snow's AI policies, which existed on paper, had not prevented the misconduct. The firm had an Artificial Intelligence Committee specifically tasked with governing AI use. None of its guidance stopped these attorneys from filing unverified ChatGPT output with a federal court.
The Ruling
Judge Manasco imposed sanctions that went far beyond monetary penalties. All three attorneys (Lunsford, Reeves, and Cranford) were disqualified from representing the client for the remainder of the case. The court directed that the opinion be published in the Federal Supplement, ensuring it would be widely available in legal databases.
The court also ordered the clerk to notify bar regulators in every state where the three attorneys are licensed. This step exposed the attorneys to potential disciplinary proceedings in multiple jurisdictions simultaneously.
The absence of a monetary fine was deliberate. The court chose sanctions that carried greater professional consequences: removal from the case, published embarrassment, and regulatory notification. For experienced attorneys at a major firm, these consequences are far more damaging than a check to the court registry.
Why This Case Matters
Johnson v. Dunn is arguably the most significant AI sanctions case since Mata v. Avianca. It demolished the assumption that large, well-resourced firms with formal AI governance are safe from AI sanctions. Butler Snow did what the profession recommended: policies, committees, guidelines. It didn't matter because the individual attorneys didn't follow them.
The disqualification sanction was unprecedented in AI cases. Earlier AI-hallucination sanctions had ranged from fines and CLE requirements to grievance referrals and license suspensions. Removing attorneys from a case mid-litigation forces the client to retain new counsel, disrupts the entire proceeding, and signals to the legal market that these attorneys can't be trusted with AI-assisted work.
The bar notification order multiplied the consequences. Instead of facing discipline in one jurisdiction, the three attorneys now face potential investigations in every state where they hold a license. Attorneys at a national firm are typically licensed in several states, so a single order creates simultaneous multi-jurisdictional exposure.
Lessons for Attorneys
AI policies are necessary but not sufficient. Butler Snow had written policies and a dedicated AI committee, and it still got sanctioned. The lesson for firm leadership is that policies only work if they're enforced, reinforced through regular training, and embedded into actual workflows. A document sitting in a policy manual doesn't prevent an attorney from pasting ChatGPT output into a brief at 11 PM.
Experience doesn't protect you. The court specifically noted that these were experienced practitioners, and that their seniority made the failure worse, not better. Senior attorneys sometimes assume their judgment compensates for process shortcuts. This case proves otherwise. AI hallucinations don't care about your years of practice or your firm's reputation.
Firms need to build verification into the technology itself, not just the policy manual. If attorneys can use ChatGPT to generate citations and paste them directly into filings without any checkpoint, the workflow is broken regardless of what the policy says. The verification step needs to be structural, something that mechanically blocks unverified AI content from reaching a filing: a required sign-off, a mandatory database check, a second-reviewer gate.
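To make that concrete, here is a minimal sketch of what such a gate could look like: a pre-filing script that extracts citation-like strings from a draft and refuses to pass unless each one appears on a reviewer-maintained list of verified citations. Everything here is illustrative, not a description of Butler Snow's systems or any real product: the file layout, the regex, and the verified-list workflow are all assumptions, and a production gate would use a real citation parser (the open-source eyecite library, for instance) and query an actual citator.

```python
import re
import sys

# Pattern for common federal reporter citations, e.g. "576 U.S. 644"
# or "135 F. Supp. 3d 1234". Hand-rolled and incomplete by design;
# a real tool would use a dedicated citation parser instead.
CITATION_PATTERN = re.compile(
    r"\b\d{1,4}\s+(?:U\.S\.|S\. Ct\.|F\. Supp\.(?: 2d| 3d)?|F\.(?:2d|3d|4th)?)\s+\d{1,4}\b"
)


def extract_citations(text: str) -> set[str]:
    """Pull citation-like strings out of a draft filing."""
    return set(CITATION_PATTERN.findall(text))


def unverified_citations(draft: str, verified: set[str]) -> list[str]:
    """Return every citation in the draft that no human has confirmed
    against an actual database (Westlaw, Lexis, CourtListener)."""
    return sorted(extract_citations(draft) - verified)


def main() -> None:
    # Usage: python check_citations.py draft.txt verified_citations.txt
    # verified_citations.txt is a reviewer-maintained list, one
    # citation per line, added only after a human checks the source.
    draft_path, verified_path = sys.argv[1], sys.argv[2]
    with open(draft_path) as f:
        draft = f.read()
    with open(verified_path) as f:
        verified = {line.strip() for line in f if line.strip()}

    missing = unverified_citations(draft, verified)
    if missing:
        print("BLOCKED: citations not on the verified list:")
        for cite in missing:
            print(f"  {cite}")
        sys.exit(1)  # nonzero exit stops an automated filing pipeline
    print("All extracted citations verified. Clear to file.")


if __name__ == "__main__":
    main()
```

The point isn't this particular script; it's the exit code. Wire a check like this into the filing workflow and an unverified citation stops the process automatically, instead of relying on an attorney's discipline at 11 PM.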
The Bottom Line
Johnson v. Dunn proved that AI policies without enforcement are meaningless. Three experienced Butler Snow attorneys were disqualified and reported to bar regulators despite the firm having written AI governance. The policy existed. The attorneys ignored it.
AI-Assisted Research. This piece was researched and written with AI assistance, reviewed and edited by Manu Ayala. For deeper takes and the perspective behind the research, follow me on LinkedIn or email me directly.