AI Giant Anthropic Hit with Massive $3 Billion Piracy Lawsuit Over 20,000 Stolen Songs
A scathing lawsuit has been filed against AI company Anthropic, alleging that the firm engaged in "flagrant piracy" by downloading over 20,000 copyrighted songs without permission. The music publishers behind the suit, including Concord Music Group and Universal Music Group, claim that Anthropic's alleged actions could result in damages totaling more than $3 billion – making it one of the largest non-class action copyright cases in US history.
The lawsuit accuses Anthropic of illegally downloading iconic songs by The Rolling Stones, Neil Diamond, Elton John, and many others. These songs were allegedly then used to train the company's chatbot, Claude. Works by artists such as Common, Killer Mike, and Korn, whose publishers have joined the suit, are also among the material allegedly pirated.
The lawsuit's authors claim that Anthropic's actions demonstrate a clear disregard for copyright law and that the firm has built a "multibillion-dollar business empire" on stolen content. This is not an isolated incident, they argue; the music publishers point to findings from the discovery process in last year's Bartz v. Anthropic case, where similar piracy was uncovered.
The implications of this lawsuit are significant, particularly in light of the earlier ruling in that case, which found that training AI models on copyrighted content can be lawful – but that acquiring such content through piracy is not. The court drew a distinction between lawfully purchasing copyrighted materials and downloading them illegally, noting that licensed copies could have cost as little as $1 per song; Anthropic, the publishers allege, chose instead to download the content without permission.
The value of Anthropic itself is estimated at over $350 billion, but the consequences of its alleged actions are a different matter. The music publishers' lawsuit paints a picture of an AI giant built on the back of intellectual property theft, and it remains to be seen how this will play out in court.