Controversial DeepNude app shuts down

By Staff Writer | 30 Jun 2019 at 18:01hrs
A controversial app that uses neural networks to create fake nude images of women has shut down shortly after gaining widespread public attention.

The app, named DeepNude, takes any photo of a woman as input and runs it through a neural network to generate a realistic-looking fake nude image.

The free version of the app places a large watermark over generated images, clearly marking them as fake, while the paid version adds only a small watermark in the corner of the processed image.

Motherboard first reported on the application yesterday, and the story was soon picked up by other publications.

Following public outcry, DeepNude's creators shut down the service, citing the high potential for misuse of the application.

"Despite the safety measures adopted (watermarks) if 500,000 people use it, the probability that people will misuse it is too high," the app's developer said in a tweet.

"Surely some copies of DeepNude will be shared on the web, but we don't want to be the ones who sell it."

The developer added that downloading the software from other sources or sharing it would violate the DeepNude website's terms.

"The world is not yet ready for DeepNude," the developer said.
