A California federal judge is leaning toward finding Anthropic PBC violated copyright law when it made initial copies of pirated books, but that its subsequent uses to train its generative AI models qualify as fair use.
“I’m inclined to say they did violate the Copyright Act but the subsequent uses were fair use,” Judge William Alsup said Thursday during a hearing in San Francisco. “That’s kind of the way I’m leaning right now,” he said, but concluded the 90-minute hearing by clarifying that his decision isn’t final. “Sometimes I say that and change my mind.”
The hearing comes three weeks after a summary judgment hearing in a similar case, also in the US District Court for the Northern District of California, in which authors accused another AI developer of infringing their copyrights to train its models.
The first judge to rule will provide a window into how federal courts interpret the fair use argument for training generative artificial intelligence models with copyrighted materials. A decision against Anthropic could disrupt the billion-dollar business model behind many AI companies, which rely on the belief that training with unlicensed copyrighted content doesn’t violate the law.
Anthropic has argued the four factors of the fair use analysis—purpose of the use, nature of the copyrighted work, reasonableness of the use, and market effects—weigh in its favor, calling its use “transformative in the extreme.” The authors countered that courts have repeatedly ruled fair use cannot justify downloading copyrighted works from pirate websites to avoid paying fees.
Anthropic’s use of notorious digital piracy websites raised concerns for Alsup. In response, the company’s counsel, Joseph Richard Farris of Arnold & Porter Kaye Scholer LLP, argued the Supreme Court has been skeptical about whether bad faith has any bearing on the fair use analysis.
Alsup pushed back, saying “I have a hard time seeing that you can commit what is ordinarily a crime, but get exonerated because you end up using it for a transformative use.”
There could be a scenario where the court holds Anthropic should pay for the initial copies it acquired, Alsup said, suggesting the cost could be determined using Amazon book prices. But he noted the relatively low cost of legitimate alternatives, saying he’s found books for $1 at library sales.
“They’re good deals, if you look through them long enough,” he said.
The authors pressed further on the piracy issue. Authors’ counsel, Rohit Dwarka Nath of Susman Godfrey LLP, argued Anthropic’s use of so-called “shadow libraries” weighs against the fair use defense because these illicit networks harm the market for the authors’ books. He pointed to a post by one of the piracy libraries saying shadow libraries were dying before AI came along and large language model developers began approaching them to train on their materials.
Farris called the argument a “Hail Mary,” saying it’s inconceivable Anthropic is sustaining shadow libraries, given that it never paid them.
Alsup also questioned authors’ argument about AI’s effect on the market for books, noting guardrails Anthropic has in place to prevent verbatim outputs.
When Nath said AI can dilute the market by flooding it with cheap knockoffs, Alsup seemed unsure about the harms of dilution, pointing out common similarities in mystery novels involving murder.
He asked the parties to submit additional briefing about a 2015 Second Circuit decision relied on heavily by AI companies, Authors Guild v. Google Inc., and whether the decision said Google’s original copying of books in that case was protected by fair use.
Lieff Cabraser Heimann & Bernstein LLP and Cowan Debaets Abrahams & Sheppard LLP also represent the authors. Anthropic is also represented by Latham & Watkins LLP and Lex Lumina LLP.
The case is Bartz v. Anthropic PBC, N.D. Cal., No. 24-cv-05417, hearing 5/22/25.