Apple removes AI nude apps from App Store

Apple took a firmer stance against harmful apps by removing several from the App Store. These apps, powered by AI image generation, allowed users to create nonconsensual nude images.

Generative AI is a powerful tool for creative fields, but it can also be misused. This technology has become a breeding ground for deepfakes and revenge porn.


Apple had previously been criticized for its inaction on these apps. A 2022 report exposed apps that disguised their true capabilities on the App Store. More recently, 404 Media informed Apple of a number of AI image generation apps available in the App Store, specifically apps marketed as able to create nonconsensual nude images.

While advertised innocently in the app stores, these apps promoted deepfake porn features on adult websites. Despite warnings, Apple and Google initially only requested that the apps stop advertising on those sites, allowing them to remain in their stores.

According to a report in AppleInsider, the apps offered features such as face-swaps onto adult images. Others were marketed as "undress" apps, virtually stripping the clothing off the subjects of otherwise innocuous photos.

After being alerted to the apps and their related advertising, Apple removed three of them from the App Store, and Google similarly removed apps from the Play Store. The decision to finally remove the offending apps is the latest in Apple's efforts to keep its AI dealings as above board as possible.

The investigation also found that Instagram advertised the apps, with the ads appearing in Meta's Ad Library. Once the ads were flagged, Meta deleted them.

This action comes alongside Apple's other efforts to ensure responsible AI practices. The company prioritizes privacy-preserving methods for training its AI and seeks to avoid copyright infringement. This contrasts with companies like Microsoft and OpenAI, which are facing lawsuits over the use of copyrighted materials without permission.