Apple removes apps after report raises concerns over App Store search, ads boosting 'nudify' tools

Newspoint
A fresh investigation by the Tech Transparency Project ( TTP ) suggests that Apple’s own search and advertising systems may be directing users towards apps that generate deepfake nude images of women. The report points to promoted results and autocomplete suggestions that surface such apps, sometimes right at the top, as per a report by 9to5Mac.

In some cases, basic searches appear to do the heavy lifting. A query for “deepfake,” for instance, returned a sponsored listing for a face-swapping app. While the feature itself is not new, testing showed it could place the face of a clothed woman onto the body of a topless one after a short in-app ad.

Search suggestions also appear to play a role. Typing "AI NS" prompted the App Store to recommend "image to video ai nsfw," a query that in turn surfaced several nudify-style apps among its top results. The pathway isn't hidden; it's built into the search flow, step by step.

Another app flagged in the report, surfaced through a “face swap” query, allowed users to upload images and swap faces without visible restrictions during testing. The process was straightforward, with no clear barriers at the point of use.

TTP also contacted some developers. In one case, a developer said they were using Grok for image generation but claimed they were unaware it could produce explicit outputs. The developer said moderation settings would be tightened.

Nudify apps are a problem on both Apple and Google app stores

Apple declined to comment on the findings. However, following the report, most of the apps identified by TTP were removed from the App Store, according to the report. The broader issue, though, remains.

The report adds that similar patterns were observed on the Google Play Store. As per the report, Apple and Google are still failing to prevent nudify apps from appearing in their app stores, some of which are listed as suitable for minors. The group found that nearly 40% of the top 10 apps returned for searches such as "nudify," "undress," and "deepnude" could "render women nude or scantily clad."

Some of these apps were also listed with age ratings that suggested suitability for younger users.

TTP’s findings indicate that despite periodic removals, app store safeguards are still allowing such apps to surface through standard search and discovery features.

"Another App Store search for the term 'face swap' yielded an ad for an app called AI Face Swap. The app offers preset face swap templates and allows users to swap faces on images they upload themselves. TTP uploaded a photo of a woman in a blue sweater standing in a living room and an image of a topless woman, and the app swapped their faces with no restrictions," said the report.