The plaintiffs include two minors and an adult who was underage when the events in the lawsuit occurred. One of the victims, identified as “Jane Doe 1,” alleges that last December, she discovered that explicit, AI-generated images of herself and at least 18 other minors were available on Discord. “At least five of these files, one video and four images, depicted her actual face and body in settings with which she was familiar, but morphed into sexually explicit poses,” the lawsuit claims.
The perpetrator, who has since been arrested, allegedly used Jane Doe 1’s AI-generated CSAM “as a bartering tool in Telegram group chats with hundreds of other users, trading her CSAM files for sexually explicit content of other minors.” The lawsuit claims the perpetrator generated the explicit images of Jane Doe 1 and the two other victims using Grok. It also alleges that xAI “failed to test the safety of the features it developed” and that Grok is “defective in design.”
Although X has tried making it tougher for customers to edit photos with Grok, The Verge has discovered that it’s nonetheless potential to control photos uploaded to the platform. X has maintained that “anybody utilizing or prompting Grok to make unlawful content material will undergo the identical penalties as in the event that they add unlawful content material.” X didn’t instantly reply to The Verge’s request for remark.
“These are children whose school photos and family pictures were turned into child sexual abuse material by a billion-dollar company’s AI tool and then traded among predators,” one of the victims’ attorneys, Annika K. Martin of Lieff Cabraser, said in a statement. “We intend to hold xAI accountable for every child they harmed in this way.”
The lawsuit seeks damages for victims impacted by Grok’s “illegal images.” It also asks the court to stop xAI from producing and distributing alleged AI-generated CSAM.

