Instagram is profiting from ads promoting nonconsensual nude image creation
23 Apr 2024


Instagram, owned by Meta, is profiting from advertisements for AI-powered apps that create nonconsensual nude images of real people.

These ads are not hidden but openly promoted on social media platforms.

Although Meta's Ad Library shows that several such ads have been removed, many remain active, luring users into generating explicit content.

One disturbing ad featured a photo of Kim Kardashian with the caption "Undress any girl for free. Try It."


Clicking on the ad redirects users to a nude image generator
Functionality


Clicking on the aforementioned ad redirects users to a popular web-based app that generates nonconsensual nude images.

Users can upload a photo of a real person and pay for the app to create an image depicting the person as nude.

Multiple versions of this ad were active on Instagram and Facebook between March 13 and 19, according to Meta's Ad Library.


The apps were labeled as 'art generators' on app stores
Loophole


Another ad showcased an AI-created split image of a woman, clothed on one side and nude on the other, with the text "Try it now. Any clothing DELETE. 1. Erase anywhere. 2. Output image! This APP can delete..."

These ads and their variations were active across Facebook, Instagram, and Facebook Messenger between April 10 and 20.

Clicking these ads leads users to the Apple App Store, where the apps are listed as "art generators," a label that sidesteps policies against promoting adult content.


Meta has only removed some of these ads
Enforcement


Although these apps openly advertise their ability to create nonconsensual deepfake porn on various sites, Meta has removed only some of the ads, and only after journalists reported them.

The company continues to struggle with enforcing its policies regarding who can purchase ads on its platforms.

"Meta does not allow ads that contain adult content and when we identify violating ads we work quickly to remove them, as we're doing here," stated Meta Spokesperson Daniel Roberts.


The usage of these apps is spreading
App proliferation


Despite negative reviews, and despite some apps failing to generate nude images as advertised, the use of these "undress" apps has spread.

This trend has even led to the arrest of two middle school students in Florida, who allegedly used such an app to create nude images of their classmates.

These apps are readily available in the Google and Apple app stores, making nonconsensual AI-generated porn easier to create.

Ads for these types of apps have also reportedly appeared on other social media platforms, including TikTok and X.
