📰 Full Story
A Tech Transparency Project (TTP) investigation published April 15–16 found that Apple’s App Store and Google Play not only host so-called “nudify” apps—AI tools that strip clothing from photos or generate pornographic deepfakes—but actively steer users to them through search results, autocomplete suggestions and paid ads.
TTP identified about 18 such apps on Apple’s App Store and roughly 20 on Google Play; together, the apps have been downloaded an estimated 483 million times and generated some $122 million in lifetime revenue.
Nearly 40% of top search results for terms like “nudify,” “undress” or “AI NSFW” returned apps with nudifying capabilities, and some were rated suitable for minors.
Both platforms ran sponsored results for such apps, and TTP said autocomplete suggestions further amplified their discoverability.
Apple removed 15 apps after the report and Google suspended several, while both companies say they enforce policies banning sexual content.
The findings come amid growing legal scrutiny in the US, UK and elsewhere over explicit deepfakes and non-consensual sexual imagery, and follow earlier enforcement tussles involving high-profile AI services.
💬 Commentary