Once again, Elon Musk is forcing this warning at the top of an article.
Warning: Sensitive topics including non-consensual sexual imagery and child sexual abuse material (CSAM).
Also I’m gonna write “fuck” a lot, because this whole situation is fucked. If you’re more concerned with that than the CSAM and non-consensual porn on X coming from Grok, fix your priorities.

If you’ve been on the internet this week, you’ve likely read or watched enough horrible things to give you a strange form of digital PTSD that will fascinate future generations. I, for one, cannot wait to tell my grand-niblings about the horrors I witnessed on LiveLeak, 4chan, and then Instagram in 2026. However, if you’re on Twitter (or X, as it insists on being called), you’ve seen a different kind of disturbing image: the non-consensual AI “stripping” of mostly women, as well as children. You’ve likely scrolled past pornographic content made of women who didn’t consent and children who absolutely cannot consent. Illegal CSAM. Normally, a platform would get kicked off the App Store and Google Play for hosting CSAM, let alone generating it. There would be fines and arrests. Instead, X has started charging users for access to its non-consensual porn and CSAM generator, Grok, and Apple and Google seem to be okay with this. I suppose they’re both in favor of exploiting children now. Neat!
As long as they’re not pissing off Dear Leader Trump’s buddy, Elon Musk, I suppose.
What the Fuck is Going On Here?
Section 230 is the backbone of the internet. Because of it, platforms, companies running servers, your internet service provider, and anyone else who handles data online cannot be held liable for what is in that data. If someone uses Instagram to discuss a crime, for example, Instagram isn’t an accomplice in the crime. Section 230 does have a few notable carve-outs, which are, unfortunately, thanks to KOSA and other bills, on their way to becoming state-sponsored censorship. One carve-out is entirely reasonable, though: no company can knowingly distribute CSAM.
The National Center for Missing and Exploited Children (NCMEC) maintains a hashing system that allows companies to quickly identify known CSAM when it’s uploaded. Platforms let users report sexual material of minors; when it’s found, they remove it and flag it to NCMEC, which hashes it so the same abusive material can be identified wherever it resurfaces. NCMEC must be busy today, because every non-consensual disrobing of a child for this disgusting purpose is abusive material, and since Musk made sure Grok stayed on X, the platform is now littered with it.
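To make the mechanics concrete, here’s a deliberately simplified sketch of the hash-matching flow described above. Real systems use perceptual hashes (such as Microsoft’s PhotoDNA) that still match after resizing or re-encoding; the version below uses a plain SHA-256 purely to show the lookup logic, and the database contents and function names are my own illustrative assumptions, not anything NCMEC or any platform actually ships.

```python
import hashlib

# Hypothetical set of hashes of known abusive images, as distributed
# to platforms by a clearinghouse like NCMEC. (Real deployments use
# perceptual hashes; SHA-256 only matches byte-identical files.)
# This entry is the SHA-256 of the bytes b"test", used for the demo.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_hash(data: bytes) -> str:
    """Hash the raw bytes of an uploaded image."""
    return hashlib.sha256(data).hexdigest()

def should_block(upload: bytes) -> bool:
    """Return True if the upload matches a known-abusive hash."""
    return image_hash(upload) in KNOWN_HASHES

print(should_block(b"test"))      # True: matches the demo entry
print(should_block(b"harmless"))  # False: no match
```

The point of the design is that platforms never need to store or exchange the images themselves, only their hashes, which is why the system scales across companies. It also explains the gap in this story: hash matching only catches *known* material, and does nothing about a generator producing brand-new abusive images on demand.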
Why Would Apple and Google Allow This?
Apple and Google both ban this. Why the fuck are we still letting that rich bastard Elon Musk, previously photographed with Ghislaine Maxwell, Jeffrey Epstein’s child trafficking accomplice, and a friend of Donald Trump, who is in the Epstein files almost as much as Epstein, do this? His platform is being used to generate and proliferate abusive materials of children, along with non-consensual generated images of adults, primarily women. It’s easy to block these types of requests, and even to use Grok itself to determine whether the subject is a child and refuse. But Grok was set up specifically not to consider “girls” or “teenagers” as underage, just in case someone wants to generate images of that.
Musk hangs out with Donald Trump and is a billionaire. That’s likely why his platform is allowed to break rules, and seemingly numerous countries’ laws, without concern. After all, Apple and Google have gone along with law-breaking on Trump’s behalf before, when it came to trampling on the First Amendment rights of their users. What’s ignoring a ring of pedophiles sharing images of children in their underwear among creepy friends?
Apple and Google are refusing to comment. I can’t imagine why.
Lawmakers and Prosecutors Working to End This
“It makes the battle against violence against women and girls much harder when platforms such as X are enabling abuse on such an easy and regular basis.”
– Claire Waxman, London’s Victims’ Commissioner
Ofcom, the U.K.’s media and communications regulator, has been investigating X and says it will accelerate that investigation; it’s giving X days to act. France has made a similar promise, with prosecutors in Paris, already looking into Grok and X, now stepping up their efforts. Indonesia announced it will block Grok nationwide temporarily until the issue is resolved.
“All X’s changes do is make some of its users pay for the privilege of producing horrific images on the X app, while Musk profits from the abuse of children.”
– Senator Ron Wyden (D-OR)
In the U.S., we have some senators asking angry questions. They’ll probably write a stern letter about it and beg constituents for donations so they can write more stern letters. No laws have been proposed, the FBI has not announced an investigation, and Trump’s goons are more interested in shooting mothers in the face anyway.
Musk’s Solution is to Charge for This?!
For now, all X has done to fight non-consensual porn of adults and minors alike is make people pay for it. When Musk took over Twitter, he turned verification checkmarks, once used to ensure only the real accounts of businesses and celebrities were “Verified,” into something anyone could pay for. That became part of a greater paywall scheme. Now image generation has been placed behind Grok’s paywall in the X app. The standalone Grok app, however, can still do it for free.
The kind of creeps doing this are the same creeps who had no issue with giving Elon Musk—suspected white supremacist—large sums of money. The same creeps creating non-consensual images of women and children are likely Musk’s fans. Always have been. It may still lead to an increase in subscriptions. Non-consensual porn and sexual harassment might be what a few users were waiting for before signing up.
Get This the Fuck off Platforms
If it seems like I’m only enraged by the generation and proliferation of abusive materials of children, I’m not. The victims of this non-consensual stripping are predominantly women. It’s never okay to sexually harass anyone like this. We need better laws that treat this as sexual harassment, and potentially even assault.
Numerous governments are investigating X and Grok, including the U.K. and France, and Indonesia has temporarily banned Grok outright. The United States, meanwhile, has just a few strongly worded statements from Democratic senators. The usual from the party that prefers letters over action.
You can report X and Grok on the App Store and Google Play: at the bottom of the app’s page on iOS, via the 3-dot menu on Google Play.
You can take action, though. Report X’s app, as well as Grok’s, on both Apple’s App Store and Google Play, for abusive material. Flood Apple and Google with reports until they realize their users demand action. We won’t stand by while people are sexually harassed with non-consensual images of themselves, as well as CSAM. These companies could pull X and Grok from their stores, which would force Elon Musk to actually put a stop to the behavior. He’s already hemorrhaging money on the service he drove so many people from. Getting pulled from mainstream app stores would be the final straw; he’d have to comply with laws and common decency.
Also, if you’re still on X… why? What is wrong with you that you like being surrounded by these creeps? Get off the platform now, before you become associated with these basement-dwelling racists, sexists, and now, child porn aficionados.