The rise and fall of a controversial app developed to completely "undress" women

Someone thought it was a good idea to allow men to virtually undress women.

The app in question was called DeepNude, and it allowed its users to “undress” anyone based on an uploaded image of the person.

While the developers realised the error of their ways quickly enough, and the app has since been removed from all platforms, it is more important now than ever to speak about the dangers of creating platforms that breed men who cannot control their desire or lust toward women.

READ MORE: Stellenbosch gym masturbator shows how easily men get away with sexual harassment

What is perhaps worse is that DeepNude was developed specifically to undress women; it did not work on images of men.

This fact alone forces one to consider the kinds of thoughts going through certain men's minds when they objectify women. Clearly, a woman's best interests, let alone her safety, are not being taken into consideration in such instances.

READ MORE: Google keeps an app that allows guardians to track Saudi women's whereabouts and limit their travel

A woman's privacy, and her obvious wish not to be seen naked by a complete stranger, regardless of whether the image was real or not, was completely disregarded.

And the thought is quite scary.

The nude picture may not have been real, but it displayed a woman’s naked body in vivid detail.

What happens when the image alone is not enough? Does this not prompt the viewer to go after the “real thing”?

While it is a strong assumption to make, we cannot ignore the harsh truth about the rate at which women are being raped, abused and even murdered in today’s society. And this was not even the only concern.

The Washington Post reports that another concern raised by these DeepFake apps is that people will now be able to make false claims about certain activities.

READ MORE: This woman developed an app that can help locate users who may be in danger of sexual assault

Before technology was as advanced as it is today, it was easier to spot a photoshopped or edited picture or video. DeepFakes, however, were designed to create the most realistic impressions possible using AI, which is a growing concern.

“Deep-fake technologies will enable the creation of highly realistic and difficult to debunk fake audio and video content,” Danielle Citron, a law professor at the University of Maryland, testified before a House committee on the dangers of DeepFakes this month. 

“Soon, it will be easy to depict someone doing or saying something that person never did or said. Soon, it will be hard to debunk digital impersonations in time to prevent significant damage,” Citron said, according to the Washington Post.

READ MORE: The 'anti-groping app' has become a hit in Japan - could it be something that would work in South Africa?

Meanwhile, the app's developers issued a statement after the app was shut down, saying that “the world was not ready” for DeepNude.

And as a woman who lives in constant fear of being objectified and hypersexualised, physically abused, raped or murdered, I don’t think the world will ever be ready for such a distasteful app.

