this post was submitted on 10 Jan 2026

Pravda News!

As we’ve reported, people have been using Elon Musk’s Grok AI to generate sexualised images of real women without their consent. Users have generated images of women in bikinis, women in see-through bikinis, and even children in bikinis. And if that wasn’t bad enough, within hours of a federal agent shooting a woman to death, users generated images of the murdered woman in a bikini.

The term for non-consensual sexualised images of real people is ‘revenge porn’. The term for sexualised images of children is Child Sexual Abuse Material (CSAM). Both are illegal, which is why the UK government has demanded Musk rid his disgusting website of this filth.

This is how Elon Musk was talking about the matter as of 4 hours ago:

Exactly https://t.co/XSDUkzPfqq

— Elon Musk (@elonmusk) January 10, 2026

There are several massive problems with this

So let’s go through these questions one by one. First up:

So what if Grok can put people in bikinis?

As noted, the problem is that Grok is doing this to real women and children — both crimes. It is possible to put guardrails on AI generators like Grok, and recent reporting suggests we’re now seeing revenge porn run rife because Musk removed these protections:

Obviously no one wanted to see the images with children and xAI has said they've taken action to fix the problem. But meanwhile, Grok's public replies on X are still full of it complying with image requests to remove clothes or put people (mostly women) in sexually suggestive…

— Hadas Gold (@Hadas_Gold) January 8, 2026

The issue isn’t that Grok can put women in bikinis; it’s that it shouldn’t, and it’s possible to stop it.

Lets have it right Nigel – if any other online platform was generating sexually explicit images of children and women, you'd want it restricted, but this one is MAGA endorsed and pays you thousands to post on it. Honestly, this man would sell his granny for a tenner. https://t.co/uUarYpflow

— Oliver Ryan MP (@OliverRyanUK) January 9, 2026

Next question:

So can photoshop? So can millions of apps already?

Those apps should be deleted, and their creators should face legal action where appropriate. We’re facing a ‘white collar crime’ epidemic right now, in which laws simply do not apply if you break them in a corporate capacity. This needs to end.

Regarding ‘so can Photoshop’, the issue is scale, speed, and ease of use. For a person to create a perfect Photoshop of a real woman in a bikini, they would have to spend considerable time learning the tool; they would also need to spend time generating each image. In the time that would take, the greasiest loser on the planet could generate billions of images of your child using these wretched AI tools (because Grok isn’t the only one with the problem).

The new

Second to last sentence:

This isn’t a new problem, it’s a new tool.

Yes, it’s ‘new’, which means we can’t tackle it with the ‘old’ mindset. The response to these AI generation tools needs to match their capabilities, as well as the scale of their potential. If the problem is rife, unfettered revenge porn, then there are two possible actions:

  • Musk and other AI owners stop their tech from generating revenge porn.
  • These tools go offline until they can be made safe.

And finally:

If a user does it someone unconsentually [sic], the user should be punished, not the platform.

If someone invented a ‘remote murder machine’ that could kill with a click of a button, we would not simply be blaming the clickers; we would be dismantling the machine and arresting the owner. The fact that Elon Musk stands to profit from this hideous technology does not mean we should wave it through.

AI image generators are not vital progress in the same way that the wheel or the steam engine were; they’re a novelty with few useful applications and many negative ones.

Maddening

The state of play right now is that the UK government is talking about blocking X/Twitter (the social media site where most people access Grok):

🚨 NEW: The Government says Ofcom now has its "full support" to block access to X in the UK if it doesn’t take action over Grok

— Politics UK (@PolitlcsUK) January 9, 2026

UPDATE: Ofcom says it has received a response from X about Grok

“We urgently made contact on Monday and set a firm deadline of today to explain themselves, to which we have received a response. We’re now undertaking an expedited assessment as a matter of urgency and will…

— Politics UK (@PolitlcsUK) January 9, 2026

Musk posted the message quoted at the top of this article after Ofcom said it was speaking to X. In other words, it looks like Musk cares more about giving people unfettered access to revenge porn than he cares about not being seen as the revenge porn guy.

Although — to be fair — you can only generate revenge porn if you pay Musk for the pleasure now:

First he monetized disinformation, now he’s monetizing your desire to be a pervert. Now you gotta be a paying customer before you can order grok to put her in a bikini or turn her around. pic.twitter.com/yei6exQR9s

— Biggest Mack (@Big_Mck) January 9, 2026

While many doubt the government will actually do anything, we’re getting to a point where it’s hard to justify to the wider public why we allow this disgusting pervert to run his social media platform as if it were Jeffrey Epstein’s paedo island.

Featured image via X

By Willem Moore


From Canary via This RSS Feed.

top 3 comments
[–] ivanafterall@lemmy.world 2 points 1 day ago

"But nobody will let me see them nekkid otherwise!?"

[–] hesh@quokk.au 1 point 1 day ago* (last edited 1 day ago)

Apparently free users can still generate images with Grok without paying; they just can't do it in an @grok reply.

https://www.theverge.com/news/859309/grok-undressing-limit-access-gaslighting

[–] Spacehooks@reddthat.com 1 point 1 day ago

I know an NSFW filter is possible, but the fact that he refuses to allow it is weird.