AI nude apps: Apple cracks down on nudify services

Several apps that undress people in photos at the tap of a button made it into the App Store undetected. According to a report, Apple has now responded decisively.

The App Store on the iPhone (Image: tre / Mac & i)

This article was originally published in German and has been automatically translated.

Five years ago, an app called DeepNude made headlines: it used a machine-learning algorithm to virtually undress clothed people in photos. Given the general advances in artificial intelligence, such nudify apps are no longer remarkable from a technical perspective; their socially harmful effects are all the greater. Apple has now removed three of these apps from the App Store.

The online outlet 404 Media says it prompted the move: it alerted Apple and Google to the existence of several AI nude apps, which were subsequently removed. It remains unclear why Apple did not reject the programs during App Review in the first place. According to the report, the apps' providers left little doubt about what they could be used for, at least in ads placed on Instagram. The App Store descriptions submitted to Apple, however, were apparently considerably tamer, which may be why the reviewers did not realize what they were actually dealing with.

For their providers, such apps are apparently a lucrative business. In December 2023, the analysis company Graphika examined a total of 34 providers and found that they drew over 24 million visitors in September of that year alone. Most services use a freemium model: a few images can be generated free of charge as a trial, and further images then cost money.

Unsurprisingly, such apps are usually used against the will of the people depicted. The harm is particularly severe when the resulting images, which are sometimes not even immediately recognizable as deepfakes, are distributed. In a small Spanish town, for example, AI-generated nude images of schoolgirls, created by teenagers, were circulated.

With minors in mind, Apple itself builds protective functions into its operating systems that prevent incoming nude images from simply being displayed and that intervene when such images are about to be sent via iMessage. The App Review Guidelines, which app developers must adhere to, offer several grounds for banning AI nude apps, as they prohibit false information, overtly sexual material, and discriminatory content.
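Since iOS 17, the on-device nudity classifier behind these protections is also exposed to third-party developers through Apple's SensitiveContentAnalysis framework. A minimal sketch of how an app might use it, assuming the required com.apple.developer.sensitivecontentanalysis.client entitlement is granted (the helper name shouldBlur is hypothetical):

```swift
import SensitiveContentAnalysis

// Hypothetical helper: asks Apple's on-device classifier whether an image
// contains nudity, so the app can blur it before display.
// Requires iOS 17+/macOS 14+ and the
// com.apple.developer.sensitivecontentanalysis.client entitlement.
func shouldBlur(imageAt url: URL) async throws -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // The policy reflects the user's Sensitive Content Warning or parental
    // settings; if analysis is disabled, don't attempt classification.
    guard analyzer.analysisPolicy != .disabled else { return false }

    // Analysis runs entirely on device; the image never leaves it.
    let analysis = try await analyzer.analyzeImage(at: url)
    return analysis.isSensitive
}
```

Notably, the analysis happens locally rather than on a server, which is how Apple reconciles such scanning with its privacy stance.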

(mki)