this post was submitted on 09 Jan 2026
224 points (98.3% liked)

Technology


Grok has yanked its image-generation toy out of the hands of most X users after the UK government openly weighed a ban over the AI feature that "undressed" people on command.

In replies posted to users on X, seen by The Register, the Grok account confirmed that "image generation and editing are currently limited to paying subscribers," a change from the previous setup in which anyone could summon the system by tagging it in a post and asking for a picture.

That access helped fuel a grim trend: users uploading photos of clothed people – sometimes underage – and instructing the bot to remove their clothes or pose them in sexualized ways. Grok complied.

The rollback comes as governments openly float the idea of banning or boycotting X altogether if it fails to rein in the abuse enabled by its AI tools. In the UK, screenshots of Grok-generated images quickly drew the attention of ministers and regulators, who began questioning whether X is complying with the Online Safety Act.


Update

The Verge and Ars seem to be claiming otherwise. However, I don't know for certain since I left Twitter ages ago.

top 25 comments
[–] Almacca@aussie.zone 3 points 1 day ago* (last edited 1 day ago)

I don't see how limiting it to paid subscribers is much of an improvement. Any amount of this material should be a punishable offense, shouldn't it?

Someone needs to start making images of Elon fucking Starmer in the arse. Maybe then he'll do something about it.

[–] CarbonatedPastaSauce@lemmy.world 97 points 2 days ago (1 children)

You would get thrown in jail, rightfully, for doing this once.

A corporation does it by the truckload and gets politely told to please stop, if they don't mind.

Not even the trivial, meaningless fines we're used to reading about.

This world is broken.

[–] RaoulDuke85@piefed.social 19 points 2 days ago

The FBI is usually the biggest distributor of CSAM.

[–] ragepaw@lemmy.ca 46 points 2 days ago (1 children)

Limited to paying subscribers. So the only thing stopping people from creating CSAM and fake sex pics of famous people is $8 a month?

Yeah... that'll show them.

[–] A_norny_mousse@feddit.org 14 points 2 days ago

But it will certainly drive up subscriptions 😉 ... 🤮

[–] Rekall_Incorporated@piefed.social 38 points 2 days ago (3 children)

With US oligarchs, "threats" and discussions do not work. Only action, and the type of action that hurts them personally.

[–] fonix232@fedia.io 14 points 2 days ago

Threats do work when delivered properly. See how the EU forced Apple into moving to USB-C or opening up sideloading/alternate app markets.

You just need to have weight to throw around behind your threats. The EU has weight. The UK alone? Hah.

[–] A_norny_mousse@feddit.org 5 points 2 days ago

Cynical take: They only did it openly to draw attention to it & pull in more subscribers. Because you can be sure Xhitter's clientele loves that feature.

"image generation and editing are currently limited to paying subscribers," a change from the previous setup in which anyone could summon the system by tagging it in a post and asking for a picture.

[–] Imgonnatrythis@sh.itjust.works 5 points 2 days ago (1 children)

Seems like this has some teeth at least. Elon usually doesn't budge on these sorts of legislative changes, but this clearly has him a little nervous.

Perhaps, but if anything they should have used this as cover to ban Twitter. I do not find American-style ostentatious "freedom polemics" convincing, and on top of that Musk has been found to meddle in local politics to benefit the far right and promote UK criminals.

[–] jmankman@lemmy.myserv.one 14 points 2 days ago (1 children)

Still waiting for an actual use case of image/video gen that isn't just for cheap assets or pure malice

[–] mycodesucks@lemmy.world 7 points 2 days ago (1 children)

But... those ARE the only two use cases.

[–] Deestan@lemmy.world 11 points 2 days ago (1 children)

Don't be so negative! It's also found a huge market in scams. Both for stealing celebrity likenesses, and making pictures and video of nonexistent products.

[–] mycodesucks@lemmy.world 3 points 1 day ago

I'm gonna say that still counts as "cheap assets"

[–] mannycalavera@feddit.uk 12 points 2 days ago (2 children)

Imagine being on the team at Twitter that had to work on this tool. Fuck me. You must hate every waking minute.

Ok so we need to define the acceptance criteria. GIVEN I am a user WHEN I click the button THEN Grok should remove clothes from the image. Great. Any non-functionals? Nope. Ok cool. Ship it.

[–] Almacca@aussie.zone 2 points 1 day ago

Anyone with a conscience or morals has surely quit or been fired from that hellhole by now.

[–] Damage@slrpnk.net 8 points 2 days ago

You assume they have morals, yet they work for X on an advanced feature

[–] m3t00@piefed.world 2 points 2 days ago

rule 34. yeah not her

[–] A_norny_mousse@feddit.org 0 points 2 days ago* (last edited 2 days ago) (1 children)

Sheesh. So far I thought it was just one of those things Xhitter's AI image generator can be told to do. But to have a specific ~~button~~ feature ... 🤮 Was it even point-and-click?

Nothing says "we're for male sexists only, and we know it" like adding a specific ~~button~~ feature.

[–] clean_anion@programming.dev 3 points 2 days ago (2 children)

Is there a specific "undress" button? I tried looking for proof that it exists but couldn't find any (my searching skills clearly need work). Could you please share a screenshot or point me to a place where I can confirm that it exists?

It's not a physical button, but you could reply to any photo of any person (usually women, sometimes underage) and say "@grok put her in a micro bikini," and Grok would happily dress down the person in the original image.

[–] A_norny_mousse@feddit.org 0 points 2 days ago* (last edited 2 days ago)

Sorry, I meant the 'AI feature that "undressed" people on command' (from the article). No idea if it's an actual button; I don't use Xhitter. But I edited my comment now.