this post was submitted on 25 Nov 2025
268 points (98.2% liked)

Fuck AI

4635 readers
1696 users here now

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

founded 2 years ago
[–] pineapplelover@lemmy.dbzer0.com 18 points 20 hours ago

If Chat Control was supposed to save the children, then why is this still alive?

It was never about the children.

[–] altphoto@lemmy.today 4 points 18 hours ago (1 children)

Okay, I'm neither for nor against fake nudes. But let's think about why they exist. We want to see people naked having sex. It's a type of entertainment, and the performers are real live actors. That industry as we know it will collapse if AI takes over. Either that, or Nabisco will add sex pictures to their chocolate wrappers, because porn stars will just be another naked person having sex for a few bucks.

I, for one, really would like normal people having sex for my entertainment. If we let AI get the best of us, porn will one day soon be gone forever! Muscle guys will forget how to pose together with another five guys so we can still see their penises going in and out of the actress's vagina. The actresses will not know how to prepare mentally and physically to be stretched in such an extreme way. That requires years of practice. I'm only half joking; it definitely takes some time to stretch out the various orifices in such a way.

I propose we watch only non-AI porn, because that's the only way to preserve our culture. That's the only way to keep the actors practicing their moves, the actresses practicing their orgasms, and the talent acquisition team hard at work! They need to keep those audition couches filled and warm! And what about the support teams?

The camera guys need to remember the traditional ways of getting out of the way of the swinging balls. The make-up girls, the fluffers, the microphone guy, the language translation teams, and the special effects guy all need jobs! I mean... waka waka waka, '70s-style music doesn't write itself! And the cleanup crew? They need the job! And they are ready with brushes, brooms, spatulas, water buckets, stain removers, etc. And the furniture guys, selling the odd couch or bed on consignment? Same for the dressing teams.

It's an industry we must keep supporting. Porn is us; porn is human. Thank you!

I'll take my check when you guys are done fondling each other.

[–] Doomsider@lemmy.world 1 point 11 hours ago (1 children)

That was great, thanks.

It does make me wonder if the traditional criticism of porn will go away with AI porn taking over.

[–] altphoto@lemmy.today 1 point 6 hours ago

It'll be like having a knack for old art forms... and so we went to the theater and watched them do seven different classical porn positions... There was doggy, reverse cowgirl, spitroast. It looked so real! It was a great play!

[–] Lemming6969@lemmy.world 4 points 19 hours ago (1 children)

This new tech means we need to get over nudity, and over what's even real vs. fake in any media. That bodies, let alone fake ones, can cause harm is a societal and cultural failure.

[–] ThatGuy46475@lemmy.world 2 points 9 hours ago (1 children)

This is why Futurama and Starship Troopers both have mixed showers

[–] atx_aquarian@lemmy.world 5 points 9 hours ago

No, Fry, you're just in the women's shower.

[–] phoenixz@lemmy.ca 35 points 1 day ago (2 children)

I fully understand this girl and I wish her well, but I'm afraid this is a genie that has left the bottle and will never be put back in.

[–] kittenzrulz123@lemmy.dbzer0.com 2 points 7 hours ago (1 children)

We can push this so far underground that the only people who use it are 4chan creeps on the dark web. We can't destroy AI, but we can push it to the fringes.

[–] luciferofastora@feddit.org 1 point 3 hours ago

It's like cleaning: you'll never get your space 100% spotless, and trying will yield diminishing returns. Thus, the aim is to be "good enough" for the given context (an operating theatre has higher standards than a storage room in your basement, for instance).

Likewise, pushing harmful things to the fringes has to be "good enough".

[–] Bane_Killgrind@lemmy.dbzer0.com 19 points 1 day ago* (last edited 1 day ago) (2 children)

Push it onto the credit card processors, webhost operators, domain registrars, etc.: their revenue from this is proceeds of crime.

Edit: they are getting money from somewhere to run the computers these models run on.

[–] WolfLink@sh.itjust.works 5 points 10 hours ago* (last edited 10 hours ago)

I don't want credit card processors being the judge of what I can spend my money on, or domain registrars being the judge of what websites I can visit.

The person who committed a crime should be taken to court, and all the intermediaries should be able to claim neutrality.

[–] Clent@lemmy.dbzer0.com 17 points 1 day ago (2 children)

There is no going back on that.

I can run these models on my local machine. They're not even complex models.

This lawsuit is targeting the profiteers because that's the only reasonable recourse for an individual.

The criminal side of things is something a prosecutor needs to handle. Making this a priority becomes a political situation because it requires specific resources.

[–] Son_of_Macha@lemmy.cafe 3 points 21 hours ago (1 children)

Maybe we need to start pointing out that it didn't make people naked; it just fit a naked body it saw in training under the person's head. It's Photoshop, but faster.

[–] Clent@lemmy.dbzer0.com 2 points 21 hours ago

Not exactly. Head swaps have been a thing for a while.

These models match the body shape; they are essentially peeling back the layers of clothing. The thinner those layers, the more accurate the result can be.

[–] Bane_Killgrind@lemmy.dbzer0.com 3 points 1 day ago (1 children)

Right, and the people disseminating and hosting tools tailored to criminal harassment should be held accountable, as should the people hosting the resulting images. All of these people have their own revenue streams that can and should be disrupted.

[–] ozymandias@lemmy.dbzer0.com 6 points 1 day ago (1 children)

they're not really tailored to that… also, it wasn't hard to photoshop naked pictures long before this.
but now these "tools" are neural net models… there are thousands of them hosted on dozens of source code repositories… and like op said, you can run them on any high-end gaming gpu.
you can't outlaw source code like that.
you could sue this one app maker and try to require that they prove consent and detect underage photos… totally a good idea, but it would do little to stop it…
they'll just use a different app.
i think they could prosecute the other people making and distributing the pictures, though.

[–] DonPiano@feddit.org 3 points 23 hours ago (1 children)

I can easily get a rock from the woods, but if I use one to break someone's fingers, I should be prosecuted.

Except that in this case we should go after the entire chain: AI trainers, hosts, users, and their supplementary industries.

[–] ozymandias@lemmy.dbzer0.com 0 points 22 hours ago

not trainers…
people who host services that advertise “make any photo naked!” and have little to no safeguards against non-consensual or underage images, yes….
i mean, if they train the model ON child porn, then yes… but otherwise it’s not the model trainer’s fault.

[–] EgoNo4@lemmy.world 97 points 2 days ago (8 children)

Billions of dollars poured into AI for... nudes. The stupidity of humanity knows no bounds...

[–] TriangleSpecialist@lemmy.world 57 points 2 days ago

Techbros desperately trying to solve the very real problem of not getting any by removing that little obstacle in their way: consent.

[–] bitjunkie@lemmy.world 11 points 1 day ago (1 children)

One of the first pieces of recorded video was porn. The new is the same as the old.

[–] EgoNo4@lemmy.world 10 points 1 day ago

I'm fairly certain that one was consensual, though.

[–] brianpeiris@lemmy.ca 30 points 2 days ago (4 children)

There ought to be a legal fund for these deepfake lawsuits so we can sue every one of these scummy companies out of existence. I'd donate to it.
