Elon Musk's xAI sued for turning three girls' real photos into AI CSAM
Class action spans thousands of kids
A Discord user reached out to a victim, prompting law enforcement to get involved.
https://arstechnica.com/tech-policy/2026/03/elon-musks-xai-sued-for-turning-three-girls-real-photos-into-ai-csam/
A tip from an anonymous Discord user led cops to find what may be the first confirmed Grok-generated child sexual abuse materials (CSAM) that Elon Musk's xAI can't easily dismiss as nonexistent.
As recently as January, Musk denied that Grok generated any CSAM during a scandal in which xAI refused to update filters to block the chatbot from nudifying images of real people.
At the height of the controversy, researchers from the Center for Countering Digital Hate estimated that Grok generated approximately three million sexualized images, of which about 23,000 images depicted apparent children. Rather than fix Grok, xAI limited access to the system to paying subscribers. That kept the most shocking outputs from circulating on X, but the worst of it was not posted there, Wired reported. Instead, it was generated on Grok Imagine.
snip
In a proposed class-action lawsuit filed Monday, three young girls from Tennessee and their guardians accused Musk of intentionally designing Grok to profit off the sexual predation of real people, including children. They estimated that at least thousands of minors were victimized and have asked a US district court for an injunction to finally end Grok's harmful outputs. They also seek damages, including punitive damages, for all minors harmed.
