Anthropic expert accused of using AI-fabricated source in copyright case

Reuters | May 13, 2025 9:37 PM

By Blake Brittain

A federal judge in San Jose, California, on Tuesday ordered artificial intelligence company Anthropic to respond to allegations that it submitted a court filing containing a "hallucination" created by AI as part of its defense against copyright claims by a group of music publishers.

A lawyer representing Universal Music Group UMG.AS, Concord and ABKCO in a lawsuit over Anthropic's alleged misuse of their lyrics to train its chatbot Claude told U.S. Magistrate Judge Susan van Keulen at a hearing that an Anthropic data scientist cited a nonexistent academic article to bolster the company's argument in a dispute over evidence.

Van Keulen asked Anthropic to respond by Thursday to the accusation, which the company said appeared to be an inadvertent citation error. She rejected the music companies' request to immediately question the expert but said the allegation presented "a very serious and grave issue," and that there was "a world of difference between a missed citation and a hallucination generated by AI."

Attorneys and spokespeople for Anthropic did not immediately respond to a request for comment following the hearing.

The music publishers' lawsuit is one of several high-stakes disputes between copyright owners and tech companies over the alleged misuse of their work to train artificial-intelligence systems.

The expert's filing cited an article from the journal American Statistician to argue for specific parameters for determining how often Claude reproduces copyrighted song lyrics, which Anthropic calls a "rare event."

The music companies' attorney, Matt Oppenheim of Oppenheim + Zebrak, said during the hearing that he confirmed with one of the supposed authors and the journal itself that the article did not exist. He called the citation a "complete fabrication."

Oppenheim said he did not presume the expert, Olivia Chen, intentionally fabricated the citation, "but we do believe it is likely that Ms. Chen used Anthropic's AI tool Claude to develop her argument and authority to support it."

Chen could not immediately be reached for comment following the hearing.

Anthropic attorney Sy Damle of Latham & Watkins complained at the hearing that the plaintiffs were "sandbagging" them by not raising the accusation earlier. He said the citation was incorrect but appeared to refer to the correct article.

The relevant link in the filing directs to a separate American Statistician article with a different title and different authors.

"Clearly, there was something that was a mis-citation, and that's what we believe right now," Damle said.

Several attorneys have been criticized or sanctioned by courts in recent months for mistakenly citing nonexistent cases and other incorrect information "hallucinated" by AI in their filings.

The case is Concord Music Group Inc v. Anthropic PBC, U.S. District Court for the Northern District of California, No. 3:24-cv-03811.

For the music publishers: Matt Oppenheim of Oppenheim + Zebrak

For Anthropic: Sy Damle of Latham & Watkins

Read more:

Music publishers sue AI company Anthropic over song lyrics

Anthropic wins early round in music publishers' AI copyright case
