Anthropic agrees to pay at least $1.5 billion in AI copyright settlement


Anthropic has agreed to pay at least $1.5 billion to settle a lawsuit brought by a group of book authors alleging copyright infringement, which works out to roughly $3,000 per work. In a motion filed Friday, the plaintiffs stressed that the settlement is an "important victory" and that going to trial would have carried "huge" risks.

This is the first class action settlement in the United States centered on artificial intelligence and copyright, and its outcome may shape how regulators and creative industries approach the legal debate over AI and intellectual property. Under the settlement agreement, the class action covers approximately 500,000 works, though that number could rise once the list of pirated materials is finalized. For each additional work, the AI company will pay an extra $3,000. The plaintiffs intend to submit the final list of works to the court by October.

"This landmark settlement far surpasses any other known copyright recovery. It is the first of its kind in the AI era. It provides meaningful compensation for each class work and sets a precedent requiring AI companies to pay copyright owners," said Justin Nelson of Susman Godfrey LLP, counsel for the authors.

Anthropic does not admit to any infringement or liability. "Today's settlement, if approved, will resolve the plaintiffs' remaining legacy claims," said Aparna Sridhar, Anthropic's deputy general counsel.

The lawsuit, originally filed in 2024 in the US District Court for the Northern District of California, was part of a larger wave of copyright suits against technology companies over the data used to train artificial intelligence systems. The authors Andrea Bartz, Kirk Wallace Johnson, and Charles Graeber allege that Anthropic trained its large language models on their books without authorization, in violation of copyright law.

In June, Senior District Judge William Alsup ruled that Anthropic's AI training was protected by the "fair use" doctrine, which permits unauthorized use of copyrighted works under certain circumstances. It was a victory for the technology company, but it came with a major caveat. In gathering material to train its AI tools, Anthropic had relied on books from so-called "shadow libraries," including the notorious LibGen site, and Alsup found that the authors could still take Anthropic to trial over the pirated copies in a class action. (Anthropic stated that it did not actually train its products on the pirated works, having instead decided to buy the books.)

"Anthropic downloaded over seven million pirated copies of books, paid nothing, and kept these pirated copies in its library even after deciding it would not use them to train its AI (or not ever again)," Alsup wrote in his summary judgment. The authors argue that Anthropic should have paid for these library copies.
