Artificial intelligence company Anthropic has been hit with a class-action lawsuit in a California federal court by three authors who claim the company misused their books and hundreds of thousands of others to train its AI-powered chatbot Claude, which generates text in response to user prompts.
The complaint, filed Monday by writers and journalists Andrea Bartz, Charles Graeber and Kirk Wallace Johnson, said Anthropic used pirated versions of their works, among many others, to teach Claude to respond to human prompts.
“Anthropic presents itself as a public benefit enterprise, designed to improve humanity. However, for the holders of copyrighted works, Anthropic has already caused massive destruction,” the complaint reads. “It is no exaggeration to say that Anthropic’s model seeks to profit from the exploitation of the human expression and ingenuity behind each of those works.”
The suit adds to several other high-stakes complaints filed by copyright holders, including visual artists, media outlets and record labels, over material used by tech companies to train their generative artificial intelligence systems.
Several groups of authors have sued OpenAI and Meta Platforms over the companies’ alleged misuse of their work to train the large language models underlying their chatbots.
The case filed Monday is the second against Anthropic following a lawsuit filed by music publishers last year over its alleged misuse of copyrighted song lyrics to train Claude. Anthropic did not immediately respond to a request for comment Tuesday. A lawyer for the authors declined to comment. Amazon has invested $4 billion in Anthropic, which is itself an offshoot of OpenAI, the creator of ChatGPT.
The authors claimed in their complaint that Anthropic had “built a multi-million-dollar business by stealing hundreds of thousands of copyrighted books.” Anthropic has received financial backing from sources including Amazon, Google and former cryptocurrency billionaire Sam Bankman-Fried.
According to the complaint, the authors’ works were included in a dataset of pirated books that Anthropic used to train Claude. The suit seeks an unspecified amount of monetary damages and an order permanently barring Anthropic from misusing the authors’ works.