
gayvillian
12244
974
8

Link: https://arstechnica.com/tech-policy/2025/08/ai-industry-horrified-to-face-largest-copyright-class-action-ever-certified/
TLDR: A class action lawsuit by copyright holders whose work was used without permission by AI companies is moving forward in the US. Anthropic, an AI company that used copyrighted materials to train its software, is fiercely trying to get the suit thrown out, as it could face fines of $150k for each of the millions of works at issue.
I remember when families were getting sued into oblivion because their kid downloaded a song on Limewire and didn't know it was illegal. Big companies got the DMCA passed in Congress in retaliation. Now, giant companies are saying copyrights don't matter and they can do whatever they want. A group of researchers recently trained an AI model on public domain works and proved it's viable without using copyrighted works. There is no reason AI should steal copyrighted work. They either need to pay up or stop. It's completely illegal, but laws don't matter for corporations. It's absolutely amazing to me that corporate personhood is a legal concept in the US, but corporations can never be held accountable when they break the law.
If I were an artist, I'd be using the bullshit "lost revenue" argument that companies throw around against pirating. ChatGPT used my work, so I lost out on five billion dollars in sales. They owe me five billion dollars for stealing my work.
jackal12345
mrthewhitee
I genuinely don't understand how "we can't afford to pay" has been an acceptable argument so far. They need to pay for what they consume just like the rest of us.
Goldneyes
Good. I hope it ruins them all.
iamnatmann
I hate to be the bearer of bad news, but AI is very much a Pandora's Box situation. This tech isn't going to disappear, because it's not stemming from a single entity able to be squelched. You can generate and train any text, image, song, video, voice, etc locally on your own PC. When you get open source tech that versatile to that stage, it becomes impossible to get rid of. Regulation for the larger companies and their public models is the most that will happen. Beyond that, it's here to stay.
Goldneyes
I never thought it would go away. I do expect the fever to die down, though, and a lot of the startups to go under when the shine starts to wear off. Though the most horrible ones will probably be the ones to survive, unfortunately.
EchoPMIM
More like "regulation too onerous for hobbyists and small businesses so that only the worst megacorps can use it" is the most that will happen. That's the likely end result of everything anti-AI folks are cheering for, anyway.
Goldneyes
yes, yes, the billionaires against regulation have been sending that smoke signal up in the hope it will fool common people for a while now.
EchoPMIM
Going DARVO won't get the taste of boots out of your mouth, bro. Hope you're at least getting paid well to astroturf!
SquirrelWithATophat
I am SOOO fucking sick of AI and how it is shoved down our throats everywhere we turn. I hope this lawsuit is so devastating to the AI industry that it NEVER recovers and is shut down completely.
VosperOfAntarctica
Remember what big IP did to our boy Aaron Swartz. May AI companies suffer a thousandfold fate.
Jimthebutler
Good. Fuck ai
Sasurau
Fuck theftbot, I think you mean.
Rubyrose99
Agreed. It's not even AI. The name angers me almost as much as its existence, since it requires no intelligence to do what it does, and it's making everyone dumber.
ThanosDei
AI is basically PDAs at this point, and we’re waiting for the iPhone to come out and solidify what exactly this tech is going to end up being. Because as of now, it's shit.
masiakasaurus
Came here to post this
Jimthebutler
Brother. https://media0.giphy.com/media/v1.Y2lkPWE1NzM3M2U1dWFpaGtpNHZvdzd2eTh0azRzYmZweGhvbXR4Yzh3eWN6enY5bnBlcyZlcD12MV9naWZzX3NlYXJjaCZjdD1n/pHb82xtBPfqEg/200w.webp
BayazTheBenevolent
It is not a bad technology by itself, but it desperately needs to be reined in and refocused towards actually useful things instead of stealing people's work and producing mediocre content. Capitalism has gone absolutely crazy on the shit generators and it's doing a lot of harm to productive AI research.
Jimthebutler
It's poorly designed, based off stolen work, with poor oversight and compounding errors. This tool costs resources to use; there is a price to pay for every use of AI, and everyone is paying it. You want to talk about ethical AI? It needs to repair any ecological damage it does, pay for its sourced material, have responsible limitations and reductions of specific harmful information (like bomb making), and have specific use cases with catered AI versions that best fit the description of the tool's job.
BarryARief
Fuck BAD use of AI. It's never the tool's fault how someone wields it. I recently did an electrocardiogram test (I'm getting old and it was due) and my cardiologist looked over the many charts that the machine printed out and said "everything looks OK, but I'll just run the results through AI". AI said "89% certainty of coronary disease". Full angiogram undertaken. Yep. Open heart surgery in the near future to prevent a major heart attack. AI has probably just extended my life by 20 years.
RiskIt4ABiscuit
Nah, LLMs are specifically all bad.
tastelesstastyfreeze
AI that detects disease and the one that selects bread off the shelf for customers in that Japanese bakery are def okay
SalmonTheWise
It's the same software that does both these things.
NaughtButOne
No, machine learning did that and it's been around for decades. The "AI industry" they're talking about is a techbro product manifested from thin air and commercialized in order to sell plagiarism and theft as innovation, and pilfer the entire human history of creative endeavour to further enrich the already wealthy.
NeurodivergenceMedley
ChatGPT and other similar ones (thinking Claude.ai by Anthropic, based on the OP) are a specific subtype of pAI (pseudo-AI) called LLMs (Large Language Models). LLMs are almost exclusively used for bad purposes. Machine learning and neural networks - which are what do things like your EKG analysis - are much more properly used, and can't even converse in any meaningful way. Also, I've run neural networks on my home computer. They don't cause *near* the same level of environmental damage.
ThisNameUnavailable
LLMs are neural networks and are a part of the field of machine learning.
NeurodivergenceMedley
Yes, yes, and neural networks are a class of machine learning. But not all neural networks are LLMs and not all machine learning is neural networks (or LLMs)
ThisNameUnavailable
Yes also true
duktayp
OH SHIT THE ECONOMY
Jarjarthejedi
Ah, some actual good news for a change. I hope the con artists whose entire job is stealing other people's hard work in order to steal from stupid investors get sued for every penny they have. I know they won't, because the supreme court will step in with some nonsense from the 1600s about how businesses stealing from people is a sacred right or something, but them panicking is good to hear. :)
sleepinggreenidea
Don't get your hopes too high. Copyright-based lawsuits against AI companies for training on copyrighted works don't have a good track record so far, because the Copyright Act was not written to include a right against use as AI training data and the existing protections it DOES grant don't really line up with what AI training actually does.
ConfusedConda
Those companies deserve to go bankrupt.
It will NOT crush the industry. As mentioned in the post, researchers have shown it can be done on nothing but public domain information. New companies will pop up doing it that way.
The big problem is not AI. The big problem is the wealthy and the "human" corporations.
The big problem behind almost all the problems in the world is the wealthy, how they are taking control of the US, and how their sights are lined up on other countries.
bippityboppitybuttsex
Burn these career criminals (every AI company) to the ground.... they are criminal enterprises stealing everyone's work to make money, they not only need to be destroyed, they need to be held civilly liable for actual damages.
mafiacarstarter
Actual damages, theoretical lost potential revenue, emotional damages, and legal fees.
bippityboppitybuttsex
They won't be able to pay the Actual Damages... as soon as the lawsuit wins, AI will be worth less than a Cybertruck for sale in San Francisco
mafiacarstarter
Being able to pay, and being subject to a judgement are two very separate things. Their inability to pay should in no way affect the size of the judgement against them.
This case should push the boundaries of the liability limits in limited liability companies, as the founders knew going in that their business model was entirely dependent on criminal activity. They and all of the shareholders should be held personally financially liable for the damages.
bippityboppitybuttsex
Once you declare Chapter 7, civil judgements are basically moot since you are already in liquidation....
I would go after them criminally for the billions of thefts they committed... either Copyright matters or it doesn't... why is an 11 year old girl getting fined $500k for sharing a Metallica song, but these AI companies can steal everything and it is ok?
angelyric
Never thought I'd cheer for DMCA, the recording industry, and movie studios, but here we are
AI, as it is currently being utilized, is anti-human cancer
Onlyhereforthelaughs
Good, AI can die.
Josh25AP
AI companies can use any copyrighted work they legally own according to the law, but chose to pirate everything. If even 1% of the books results in a $150k fine, the total comes to $10.5 billion in fines.
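For anyone checking that figure: the 1% → $10.5 billion math implies roughly 7 million books at issue, a number assumed here only because it makes the comment's arithmetic work out. A quick back-of-envelope sketch:

```python
# Back-of-envelope check of the fine estimate above.
# The 7 million figure is an assumption implied by the comment's own math.
total_books = 7_000_000
fine_per_book = 150_000   # statutory maximum cited for willful infringement
fraction_fined = 0.01     # "even 1% of the books"

total_fines = total_books * fraction_fined * fine_per_book
print(f"${total_fines / 1e9:.1f} billion")  # → $10.5 billion
```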
TektronixTDS360
Can't wait to see the conservative justices' heads explode when they need to pick between the two conservative priorities of stronger intellectual property rights and licking the boots of the rich.
KingORedLions
The hilarious part of it is that it's the same companies on both sides. Everyone wants to stop everyone else training on their data (e.g. record labels, Disney, etc.), but they also simultaneously want carte blanche to pursue their own version of it.
VibratingNipples
GreaseMonkeyOfLove
GOOD!
bippityboppitybuttsex
We need to change corporate law so the C-suite and board of directors are criminally and civilly liable for the unlawful activity of their companies (with an exception if an employee violates both the law and the rules of the company; this would apply to policy decisions, e.g. Wells Fargo's board of directors, and Tesla's and GM's, would die in prison, esp if a death resulted).
Also, no one from Union Carbide, Nestle, Purdue Pharma, or any tobacco company should be allowed to walk free; they are mass murderers
ThisNameIsMaybeTaken
For others wondering, Union Carbide killed people through asbestos and mining practices.
https://en.m.wikipedia.org/wiki/Union_Carbide
bippityboppitybuttsex
That and Bhopal...
https://en.wikipedia.org/wiki/Bhopal_disaster
marsilies
I mean, they already can be found criminally liable, but the federal government rarely goes after them anymore. John Oliver just did a segment on Deferred Prosecution Agreements that goes over how corps and their execs wriggle free from criminal prosecution. https://www.youtube.com/watch?v=xNo8Ve-Ej6U
bippityboppitybuttsex
I saw his piece and when was the last time an individual was charged, criminally, for their sanctioned corporate actions?
I am not talking about rogue employees being charged, I am talking about someone ordering unlawful policies, at a high level, and spending time in jail.
marsilies
My point was that the problem isn't that we need to change the law, as you stated, but that we need to change enforcement.
SavageDrums
It'll never be profitable, sucks to be every techbro asshole who bet their fortune on this shitty tech.
Chadly5046
Burn those companies to the ground.
NKato
This is literally going to destroy the genAI industry.
Good. Fuck them.
lausianne
One problem is, it might destroy the Western AI industry, but not other countries'. Specifically China's. Who's going to sue them?
NKato
China has actually instituted some degree of regulations on AI. Unlike here.
sleepinggreenidea
It's not going to destroy the genAI companies unless the courts decide to overturn the precedents that have been developing over the last decade. Caselaw ATM does not support the idea that training on copyrighted works is in and of itself a violation of copyright. That precedent isn't set in stone, and SCOTUS is always a loose cannon when it comes to IP law (they don't understand it, so they rule on vibes), but based on where it stands now, copyright law needs to be amended to make AI training illegal.
Ivalicenyan
Meanwhile Facebook downloaded just terabytes and terabytes of Pornhub cause reasons
Raventhief
We built our entire industry on illegal copyright infringement! A lawsuit over the legality of copyright infringement could break our industry!
sleepinggreenidea
There's been a number of lawsuits based on AI training, and the consensus developing from those cases is that unless there's some additional offending acts being performed, the training itself is not a copyright violation. So don't get your hopes up too high.
SalmonTheWise
mondeca
"But all my money comes from robbing banks! If bank robbing is illegal, I'll be ruined!"
gayvillian
Replace "banks" with "poor people," and you've explained capitalism.
iguessihaveto
Too big to fail! Too big to fail! >:-(
Stefnos
* "We built our entire industry on stealing" xD
bippityboppitybuttsex
Good.
DukeDarkwood
To which I have only one thing to say:
Eff. Ay. & Eff. Oh.
gayvillian
The thing that worries me is there are entire industries, particularly in finance, based on things that were completely illegal until politicians were bribed to make them legal. Since people made tons of money on the stuff that used to be illegal, it only reinforced this loop. They're gonna throw so much money at politicians to make stealing work legal, just for AI, and that'll be that.
sleepinggreenidea
The status quo is that there's not clearly any right to not have your works used for AI training. Copyright is a limited grant of specific rights, under specific circumstances, for specific purposes, with specific exceptions. AI training does not a priori comprise an activity protected by copyright law. Now, politicians could amend that law to include it... but right now, the law - as currently written and interpreted - does not generally protect works from being trained on.
MrAristo
That argument fails when you remember that these companies made illegal copies of copyrighted work so they would then have something to use in the training of their LLMs. And now they're crying that if they are held liable for making those illegal copies (a violation of copyright) then the punishment might destroy them. They were happy to FA, but now upset they're in the FO stage.
sleepinggreenidea
Actually, no, the courts disagree. The argument only fails if they *retain* the copyrighted works after training on them. The courts have been increasingly clear and consistent on this point - training models on a copyrighted work is transformative fair use based on transitory copying, and saying that it's not would be reading language out of the Copyright Act. IF they retain a library of the copyrighted works after using them, the courts have deemed it to be illegal... but just training? No.
Raventhief
Yeah, there is that. I'm guessing we'll end up with a copyright law that basically says only companies worth a billion or more can hold a copyright.
TanithRosenbaum
Not just in finance. Look into what today's hyper-litigious media houses got their start with in 19th century Britain. Hint: it wasn't properly licensed content...
Stefnos
and then we steal all their AI work and turn it into a massive super hacker AI that just constantly crashes their AI and all their systems.... what could go wrong.... >.>.... <.< .... >.>
alcamar
They already were doing so, basically tried to give AI carte blanche from (State) law for 10 years. Looks like it was defeated, for now at least.
But basically huge money is already behind that effort, with musk and zuck and their literal tonnage of money in particular.
KingORedLions
Money spent on building new datacenters for AI outstripped consumer spending in the US this year - I can't stress how ridiculously astronomical the amount of money invested into this is, and when it fails, it's going to bring the entire US economy down with it in truly spectacular fashion.
The powers that be know that and will do whatever they can to keep propping those business up.
alcamar
Prop them up until they extract everything they can. It's a fire sale.
Enoan
That is insane. I'd love to see a source on this.
TresusIbor
:D
ToSisPoS
That would please me.
damion20xx
https://media4.giphy.com/media/v1.Y2lkPWE1NzM3M2U1YTEyM2FvbjBsbXRjYXBoMTRpaXpiemZwOWV4Y3JyeThhMDV0NG93ZCZlcD12MV9naWZzX3NlYXJjaCZjdD1n/7k2LoEykY5i1hfeWQB/200w.webp
wazeewa
When I saw "financially ruin the entire AI industry" this GIF came to mind
CorgisButtsDriveMeNuts
AI is not a fucking industry, it's reshuffled garbage posed as something of value.
errantcompass
I love what it can do but all of the companies that stole shit must have known this was coming
armagetz
“Big companies got DMCA passed in retaliation.” You must have been very young or not watching world affairs then, because you are stretching reality quite a bit to make a false equivalency. While the RIAA (a trade organization, not big companies going after people, if you want to split hairs) was enthusiastically supportive of the DMCA, the US laid the framework and committed to the bill's passage 3 years before Napster even existed. To say the DMCA was “in response” is a flat-out lie.
gayvillian
Also, media companies like Disney (might be a big company?) and others were definitely behind the DMCA, not just the RIAA.
gayvillian
DMCA passed in 1998. Napster showed up in 1999. Sorry I got one year backwards. But file sharing (warez) were around long before 1998.
armagetz
Yeah but the US was a signatory to 2 international treaties that laid the general framework within the World Intellectual Property Organization in 1996. Napster was founded in 1999 but didn’t get mainstream attention till mid 2000. European law actually modeled a novel introduction by the US in offering protections to ISPs.
armagetz
And you are crawfishing hard, with a lot of revisionist history if you are trying to claim now that file sharing was on the music industry’s radar in the mid 90s
gayvillian
So the DMCA was signed into law in 1996? Or something that started with treaties and eventually turned into it after 2 years of debate and industry comment was signed into law in 1998?
armagetz
No. But we can sign commitments in a treaty to make modifications and hammer out the details internally. 80-90% of the DMCA was agreed to in international digital IP treaties in 1996. The point I called you out on (which you are trying to smokescreen) is claiming the DMCA was a punitive response by the big music industry to music file sharing. It objectively and empirically wasn't. The ball was rolling a half decade before the music industry began to care. Stop goalpost shifting.
gayvillian
I referenced limewire but was it possible to illegally download copyrighted material online or copy a CD, tape, or VHS before the DMCA? Or did all that technology appear after 1996? Peer to peer existed but was solidified by Napster in 1999, yes. Other forms of media copyright infringement by consumers existed before 1996.
JustAPileOfCats
Good, I hope the suit destroys the AI industry. The AI bubble bursting will be the new Dot Com Crash.
gayvillian
The problem with the dot com bubble is that nobody had to use websites, and many overestimated the consumer need and valuation of their website. This time, AI is being forced on people. It's online, it's on your phone, your computer, it's everywhere. You have to go out of your way to not use it. I really want the bubble to pop, but when you put your product on everyone's phone without their permission, you can say they're all users even if they don't use it.
LordHosk
And then, just as easily as they wrote the code to put it on, they can remove the code.
That's how writing code works. They can say it's impossible, but if they wrote the code in, they can take the code out, they just don't want to put in the hours.
Bonana
While I agree, I suspect we'll hear "we have to rewrite everything" excuses because they hard-coded thisShit instead of making it easily commented out.
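For what it's worth, the difference the two comments above are pointing at is basically a feature flag versus hard-coding. A minimal hypothetical sketch (all names here, like `ai_assistant_enabled` and `show_ai_button`, are made up):

```python
# Hypothetical sketch: with a feature flag, "removing" the AI feature is a
# one-line config change; hard-coding it means hunting down every call site.
CONFIG = {"ai_assistant_enabled": False}  # flip to True to ship the feature

def show_ai_button(config: dict) -> bool:
    # The UI checks the flag instead of baking the feature in unconditionally.
    return config.get("ai_assistant_enabled", False)

print(show_ai_button(CONFIG))  # prints False until the flag is flipped
```

If it were built this way, turning the feature off really would be trivial; the "we'd have to rewrite everything" excuse only holds when the flag was never there.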
supervillin
Over/under on whether lawyers used AI to write up the brief?
truthputer
Using the tools of your enemy to destroy them is just common sense.
sleepinggreenidea
It really isn't, because if you're using AI to generate legal arguments right now, your motions will be garbage that is thrown out, and you'll very possibly face sanctions. Even dedicated closed-universe legal LLMs aren't good at generating novel arguments right now... though they're getting better rapidly.