Apple and Google are helping users find apps that create deepfake nude images, according to a new investigation.
So-called “nudify” apps use AI to alter photographs of real people, making them appear naked, placing them in pornographic videos, or turning them into sexually explicit chatbots. According to a report published on Wednesday by the Tech Transparency Project (TTP), Apple and Google play a significant role in the spread of these tools.
TTP first reported in January that both the Apple App Store and Google Play hosted dozens of apps designed to digitally remove clothing from photographs of women. The latest investigation found that the platforms’ own search and advertising systems direct users toward these apps, increasing their visibility.
Both Apple and Google have policies that prohibit apps enabling the creation of nonconsensual sexualized images. However, the TTP report found that not only do such apps remain available, but the search tools on both platforms actively point users to them. According to the investigation, Apple and Google displayed ads for nudify apps within search results and suggested related terms through autocomplete.
Bloomberg News reports that searching for terms like “nudify” and “undress” in the companies’ app stores provides access to software that can alter photographs of celebrities and others to make them appear nude or partially undressed. The companies also run ads for similar apps in those results.
TTP identified 18 nudify apps on Apple’s platform and 20 on Google Play. Some were marketed with sexual images, while others were not explicitly advertised that way but could still be used to create deepfakes. Data compiled by a mobile analytics firm shows the apps identified by TTP have been downloaded 483 million times and have generated more than $122 million in lifetime revenue. The investigation also found 31 nudify apps rated as suitable for minors, a notable finding amid a rise in sexual deepfake incidents in schools.
After TTP and Bloomberg News shared the report’s findings, Apple removed 15 apps and Google removed seven.
The report comes as lawmakers in Minnesota are reportedly close to banning AI nudification apps outright. In the U.K., the Children’s Commissioner has also called for an immediate ban on such apps, citing concerns that they enable “deepfake sexual abuse of children.”
Image credits: Header photo licensed via Depositphotos.