A new report from the nonprofit research group Tech Transparency Project (TTP) claims that Google and Apple's app stores go beyond merely hosting harmful "nudify" and "undress" apps that remove women's clothing in photos, and actually encourage users to download these apps.
In January, TTP published research showing how the app stores host dozens of "nudify" and undressing apps. This new research, released on Wednesday and first reported by Bloomberg, shows how the stores don't just passively host these apps, but push them toward users through search and advertising.
💡
Do you have experience to share about nudify or undress apps being used in schools, or by teens? I'd love to hear from you. Using a non-work device, you can message me securely on Signal at sam.404. Otherwise, send me an email at sam@404media.co.
TTP conducted a series of searches in the Apple App Store and Google Play Store, according to their writeup of the research, using terms like "nudify," "undress," and "deepnude."
After testing the apps that appeared in the top 10 search results, they found that "roughly 40 percent of the apps that came up in both the Apple and Google Play search results could render women nude or scantily clad," and that "Apple and Google ran ads for nudify apps in some of the search results—including, in Google's case, a carousel of ads for some of the most sexually explicit apps encountered in the investigation." They also found that the stores can lead users to more and different nudify apps through autocomplete search suggestions.
"TTP found that ads for nudify apps came up as the top result in three of the Apple searches. Apple, which controls all of the advertising in its app store, is selling and placing these ads," the researchers wrote. "Apple says it prohibits ad content that 'promotes adult-oriented themes or graphic content.' But TTP's findings suggest Apple isn't always enforcing that policy." The first result for an App Store search for "deepfake," they found, was for an app that simply replaces women's clothed photos with nude versions.
In 2024, 404 Media covered how Google surfaced apps through searches for "undress apps," "best deepfake nudes," and similar terms with promoted results, despite Google's ad policies against this kind of content.
Nudify apps have been a popular market for years, but today they're extremely easy to access and are advertised on social media. In schools, children use nudify apps to bully classmates, with disastrous results for both the bullies and the victims, and school administrators are often unprepared for how to deal with students using these wildly popular apps.
Google and Apple did not immediately respond to 404 Media's request for comment. TTP wrote that Apple declined to comment, while Google spokesperson Dan Jackson told them many of the apps identified by TTP have been suspended. "When violations of our policies are reported to us, we investigate and take appropriate action," Jackson told TTP.
About the author
Sam Cole is writing from the far reaches of the internet, about sexuality, the adult industry, online culture, and AI. She's the author of How Sex Changed the Internet and the Internet Changed Sex.