Concord Music Group is suing Anthropic twice — and the combined exposure is existential. Concord I (filed October 2023 in M.D. Tenn.) alleges direct and contributory copyright infringement for Claude generating copyrighted song lyrics on demand. Concord II (filed January 2026 in the same court) ups the ante: Anthropic allegedly secretly torrented 20,517 copyrighted musical compositions to train its models, and Concord wants $3.1 billion in statutory damages.

Combined with the Bartz v. Anthropic piracy ruling — where Judge Alsup found that training on pirated material gets zero fair use protection — Concord II could be the case that defines Anthropic's financial future. The piracy allegations aren't theoretical. Concord claims to have evidence that Anthropic employees used BitTorrent to download copyrighted music files and fed them into training pipelines. If true, Bartz already told us the legal answer: that's not fair use.

Concord I: The Lyrics Reproduction Case

The original lawsuit, filed in October 2023, focuses on Claude's output. When users prompt Claude for song lyrics, it sometimes reproduces copyrighted lyrics verbatim, or closely enough to constitute infringement. Concord owns publishing rights to songs by artists including Beyoncé, Mariah Carey, and The Weeknd, among thousands of others.

Concord's theory of liability is twofold. Direct infringement: Claude reproduces copyrighted lyrics in its outputs, and Anthropic is liable because it operates the system that generates those outputs. Contributory infringement: Anthropic knows users request copyrighted lyrics, has the ability to prevent reproduction, and fails to adequately do so — making it liable for facilitating its users' infringement.

Anthropic has implemented filters to refuse lyrics requests, but Concord's complaint includes dozens of examples where those filters failed. The case is in active discovery, with trial not yet scheduled. Standing alone, Concord I is a significant but manageable copyright case. Combined with Concord II, it becomes part of an existential threat.

Concord II: The $3.1 Billion Torrenting Allegations

Filed in January 2026, Concord II is a different beast entirely. The complaint alleges that Anthropic employees used BitTorrent protocols to download 20,517 copyrighted musical compositions — complete recordings and published sheet music — and used them as training data for Claude. The alleged statutory damages: $3.1 billion ($150,000 maximum per willful infringement, multiplied across the catalog).
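The damages arithmetic is straightforward multiplication under the statutory cap for willful infringement (17 U.S.C. § 504(c) allows up to $150,000 per work). A rough sketch of the ceiling, assuming the maximum rate applies to every alleged composition:

```python
# Statutory damages ceiling: up to $150,000 per work for willful
# infringement, multiplied across the alleged catalog.
WILLFUL_MAX_PER_WORK = 150_000   # dollars, per 17 U.S.C. § 504(c)(2)
compositions = 20_517            # works alleged in Concord II

max_exposure = compositions * WILLFUL_MAX_PER_WORK
print(f"${max_exposure:,}")      # $3,077,550,000 — i.e., the ~$3.1B ask
```

The $3.1 billion figure in the complaint is simply this ceiling rounded up; an actual award would depend on a jury's per-work assessment, which can run anywhere from $750 to $150,000 per work.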

Concord's evidence reportedly includes server logs, torrent metadata, and internal Anthropic documentation linking specific downloaded files to training datasets. The complaint describes a systematic acquisition program, not isolated downloads — employees allegedly maintained torrent clients on company infrastructure and organized downloaded music files for training ingestion.

This isn't a "did the AI memorize some lyrics" case. This is a "did the company pirate an entire music catalog" case. The distinction matters because of Bartz: Judge Alsup already ruled that pirated training data receives no fair use protection. If Concord proves the torrenting allegations, Anthropic has no copyright defense.

How Bartz v. Anthropic Amplifies Concord II

The timing of these cases is devastating for Anthropic. In Bartz v. Anthropic, Judge Alsup established that training on pirated material doesn't qualify for fair use. That ruling was about books. Concord II is about music. But the legal principle is identical: if you pirated the training data, you can't claim fair use on the output.

Concord's attorneys have already cited the Bartz ruling in their Concord II briefing. Their argument: the court doesn't need to conduct a full fair use analysis because Bartz already resolved the threshold question. If Anthropic torrented the compositions (a factual question), then fair use is off the table (a legal conclusion Bartz already reached). The only remaining question is damages.

Anthropic can't argue that music training data is different from book training data — the Bartz ruling applies to any pirated copyrighted material. The company's best defense is to dispute the factual allegations: argue the torrenting didn't happen, or that the downloaded files weren't actually used in training. But Concord claims to have the server logs.

Combined Exposure and Anthropic's Financial Position

Add up the numbers: $1.5 billion from the Bartz settlement, $3.1 billion sought in Concord II, plus unquantified damages in Concord I and other pending copyright cases. Anthropic raised $7.3 billion in its latest funding round at a $61.5 billion valuation — but that valuation assumed these cases were manageable risks, not existential ones.

If Concord II goes to trial and Concord wins anything close to its $3.1 billion ask, Anthropic faces a financial crisis. The company burns through cash rapidly — estimated at $2-3 billion annually on compute alone. A multi-billion dollar judgment on top of the Bartz settlement could force Anthropic to seek emergency funding, sell itself, or restructure.
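Putting the reported figures side by side makes the scale of the problem concrete (a back-of-the-envelope tally only; Concord I damages are unquantified and the burn rate is an outside estimate):

```python
# Rough combined-exposure tally, in billions of dollars,
# using the figures reported above.
bartz_settlement = 1.5   # Bartz settlement
concord_ii_ask = 3.1     # statutory damages sought in Concord II
combined = bartz_settlement + concord_ii_ask

burn_low, burn_high = 2.0, 3.0   # estimated annual compute spend range
print(f"Known exposure: ${combined:.1f}B")
print(f"Roughly {combined / burn_high:.1f}x to "
      f"{combined / burn_low:.1f}x annual compute spend")
```

Even at the low end, the known exposure exceeds a full year of estimated compute spending, which is why a multi-billion dollar judgment would force the restructuring scenarios described above.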

Anthropic's investors — including Google, Salesforce, and Amazon — are reportedly pressuring the company to settle both Concord cases before trial. A settlement would likely cost less than the full $3.1 billion but would still represent a massive financial hit and an admission that the training data acquisition program was legally indefensible.

Concord's lawsuits establish that music publishers will aggressively litigate AI training data claims — and they have the resources to do it. Concord is backed by a consortium of major music publishers and has retained top-tier IP litigation firms. The RIAA has filed amicus briefs supporting Concord's position in both cases.

For AI companies, the lesson is clear: music training data is the highest-risk category. Music publishers are organized, well-funded, and experienced at large-scale IP litigation (they've been suing for decades, from Napster to Spotify). Any AI company that trained on unlicensed music is a target.

For lawyers advising AI companies, Concord creates an immediate action item: audit your client's music training data. If they used copyrighted music (downloaded, scraped, or otherwise acquired without licenses), statutory exposure runs up to $150,000 per composition for willful infringement. At scale, that's billions. The AI case law trajectory points toward a world where unlicensed training data is the single biggest legal risk in AI.

The Bottom Line: Two Concord lawsuits plus the Bartz piracy ruling create existential risk for Anthropic — $3.1 billion in alleged damages for torrenting 20,517 compositions, with no fair use defense available for pirated training data.

AI-Assisted Research. This piece was researched and written with AI assistance, reviewed and edited by Manu Ayala. For deeper takes and the perspective behind the research, follow me on LinkedIn or email me directly.