Elon Musk’s artificial intelligence company, xAI, which makes the Grok chatbot, is being sued by teenagers who say the company’s AI models were used to create nonconsensual nudes of them.
Nicolas Tucat/AFP via Getty Images
Three Tennessee teenagers have filed a class action lawsuit against Elon Musk’s artificial intelligence company, xAI, alleging its large language model powered an app that was used to make nonconsensual nude and sexually explicit images and videos of them when they were girls.
“Like a rag doll brought to life by the dark arts, this [AI-generated] child can be manipulated into any pose, however sick, however fetishized, however unlawful. To the viewer, the resulting video appears entirely real,” reads the complaint. “For the child, her identifying features will now forever be attached to a video depicting her own child sexual abuse.”
While the perpetrator did not use xAI’s chatbot, Grok, or the social media platform X (also owned by xAI), the lawsuit claims that the perpetrator relied on an unnamed app that used xAI’s algorithm, citing law enforcement.
The plaintiffs accused xAI of deliberately licensing its technology to app makers, often outside the U.S. “In this way, xAI could attempt to outsource the liability of their highly dangerous application,” said the complaint.
The lawsuit is the first in which xAI has been sued by underage people depicted in child sexual abuse material its model allegedly generated. xAI’s image generation tools have been implicated in the production of millions of sexualized images of people over the past year. Influencer Ashley St. Clair, who has a child with Musk, sued the company earlier this year over AI-produced images on X depicting her nude when she was a teenager.
According to the class action complaint, the perpetrator who made the sexualized images had a “close and friendly relationship” with one of the plaintiffs, and used photos the plaintiff sent to him, as well as photos he gathered from a yearbook and from social media, to make the images and videos. One video depicted one plaintiff “undressing until she was fully nude,” the complaint alleged. The plaintiffs were disturbed by how lifelike the images and videos were. What’s more, the material was not labeled as AI-generated, according to the complaint.
The perpetrator also made sexually explicit material of 18 other people, and traded it for images of other people online, the complaint alleged. He was arrested, according to the complaint.
The plaintiffs’ lawyer, Vanessa Baehr-Jones, said the teenagers, identified as Jane Does 1, 2 and 3 in the complaint, want to change how AI companies make business decisions about sexually explicit content. “We want to make it one [a business decision] that doesn’t make any business sense anymore,” she said.
The plaintiffs are asking the court for damages for emotional distress and other harms caused by the images.
Apps with so-called nudifying features have existed for years in the shadows of the internet. But last year, major AI companies including Google, OpenAI and xAI updated their image generation tools in a way that allows users to strip people down to bikinis. However, the images made by Google and OpenAI include digital watermarks that disclose their AI origin. So far, xAI has not adopted such a standard.
xAI did not respond to a request for comment.