Creating nude images of people with AI apps without their permission is illegal, yet doing so is remarkably easy. Experts see people falling victim to it and are calling for a stricter approach.
There is a worrying increase in the number of so-called nudify apps, something Instagram's parent company Meta recently warned about. These apps erase clothing from photos, leaving behind fabricated nude images. The results are often very convincing, with all the consequences that entails.
These kinds of apps can do real harm to people, says Robert Hoving of Offlimits, the expertise center for online abuse. "You can use them to put people under pressure. Boys are often blackmailed with the threat that the photo will be sent to their family if they don't pay. With girls, malicious actors often demand real sexual images, threatening to leak the fake photos otherwise."
It often doesn't matter that the images are fake. According to Victim Support, the impact of an incriminating deepfake can be severe. "You know the images are fake, but the people around you and others who see them do not," says a spokesperson. "It can be a major violation of your sense of security."
Fake nude images are not new; they have been made with programs like Photoshop for years. But AI tools make it extremely simple, and they are easy to find. A quick web search turns up dozens of programs, and on services like Telegram there are several bots that can undress people in photos. Using them often costs a few euros (and your email address); sometimes they are completely free.
The exact extent of the problem remains unclear. Victim Support does not keep separate figures on it, and Offlimits can only say that reports are coming in more and more often.
AI invents a body that doesn't exist
An AI model naturally does not know what a person looks like naked. Nevertheless, removing clothing is child's play for these programs. "They draw on an underlying database containing thousands to millions of photos of naked people," explains trend analyst and AI expert Sander Duivestein. "Based on that, the model can suggest what someone looks like without clothes. It fantasizes based on the images it has been trained on."
Hoving describes the rise of these apps as a slow-motion accident. "A cat-and-mouse game is emerging," he says. "If one app is removed from an app store, a new one appears. They are too easily available, especially for young people who do not yet have a fully developed moral compass."
This is already happening in schools, says Duivestein. According to him, the genie is out of the bottle. "It is always the same story with new technology: something is invented, then the impact becomes clear, and the regulations lag behind."
Victims do not always know their photos are being misused
Victim Support, Offlimits and Duivestein agree that something must be done. Duivestein would like to see the government raise more awareness. Victim Support sees merit in making nudify apps a separate criminal offense. "Although we realize this is a complex issue," says the spokesperson. "Because making something punishable unfortunately does not automatically mean it will never happen again."
Victims often don't even know that their photos have been edited. "That's the painful part," says Duivestein. "Someone can pick a photo off the internet, have it edited and keep it in their own collection. You may never find out."
Even then, it is not always clear how the nudify apps themselves handle those images or where they are stored. There are also shady forums where these kinds of photos are shared. Edited photos can thus circulate on the internet without the victim ever knowing. "And that's just photos," says Duivestein. "These days you can also have videos made that are hardly distinguishable from real ones."