DeepNude: An AI App That “Undressed” Women Shows How Harmful Deepfakes Are

We all know how harmful deepfakes, an AI-based technology used to alter video and image content, can be. A recent application called ‘DeepNude’, an app that “undressed” photos of women, proves it once again.

The DeepNude software takes a photo of a clothed person and, using generative adversarial networks (GANs), creates a new, realistic nude image of that same person.


The app was priced at $50 (£40) and was distributed through its website for the Windows and Linux operating systems. The program was available in two versions: a free one and a paid one.

In the free version of the app, the output images are partially covered by a large watermark. In the paid version, which costs $50, the watermark is removed, but a stamp that says “FAKE” is placed in one corner.

DeepNude only works on images of women: if given a picture of a man, it still generates a naked female body, presumably because the software was trained exclusively on images of women.

DeepNude was trained on more than 10,000 nude photos of women readily available on the internet, teaching the software to detect where the clothes are in an image, mask them while matching skin tone, lighting, and shadows, and then fill in estimated physical features.

In an email, the anonymous creator of DeepNude, who asked to go by the name Alberto, said that DeepNude is based on pix2pix, an open-source algorithm developed by University of California, Berkeley researchers in 2017.

Pix2pix uses generative adversarial networks (GANs), which are trained on a large dataset of images. These networks learn a mapping from an input image to an output image, and also learn a loss function to train this mapping.
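To make the idea concrete, here is a minimal pure-Python sketch of the combined objective described in the pix2pix paper: the generator is trained to minimize an adversarial term (fooling the discriminator) plus a weighted L1 term (staying close to the target image). The function names and the tiny pixel lists are illustrative placeholders, not code from pix2pix or DeepNude.

```python
import math

# Weight on the L1 reconstruction term; the pix2pix paper uses lambda = 100.
LAMBDA = 100.0

def l1_loss(output, target):
    """Mean absolute difference between generated and target pixel values."""
    return sum(abs(o - t) for o, t in zip(output, target)) / len(output)

def adversarial_loss(d_score_on_fake):
    """Binary cross-entropy pushing the discriminator's score on the
    generated image toward 1 ('real')."""
    return -math.log(d_score_on_fake)

def generator_loss(output, target, d_score_on_fake):
    """Combined pix2pix-style generator objective: adversarial + lambda * L1."""
    return adversarial_loss(d_score_on_fake) + LAMBDA * l1_loss(output, target)

# An output that fools the discriminator (score near 1) and matches the
# target closely receives a much lower loss than one that does neither.
good = generator_loss([0.5, 0.6], [0.5, 0.6], d_score_on_fake=0.9)
bad = generator_loss([0.1, 0.9], [0.5, 0.6], d_score_on_fake=0.2)
print(good < bad)  # True
```

The L1 term is what keeps conditional GAN output tied to the input: the adversarial term alone only demands a plausible image, not one that corresponds to the specific photo being translated.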

Pix2pix is effective at synthesizing photos from label maps, reconstructing objects from edge maps, and colorizing images, among other tasks. DeepNude uses these capabilities of pix2pix to create fake nude images.

DeepNude uses multiple networks, each with its own task: locating the clothes, masking them, speculating on anatomical features, and rendering the final image.
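The staged design described above can be sketched as a simple function pipeline. The real DeepNude networks are not public, so each stage below is a hypothetical placeholder standing in for a trained model; only the composition structure reflects the article's description.

```python
# Hypothetical sketch of a locate -> mask -> infer -> render pipeline.
# Every function here is a stand-in for a separate trained network.

def locate_clothes(state):
    # Stage 1: predict the region of the image that contains clothing.
    state["clothing_region"] = (10, 10, 50, 80)  # placeholder bounding box
    return state

def mask_clothes(state):
    # Stage 2: mask the clothing region so it can be repainted.
    state["masked"] = True
    return state

def infer_anatomy(state):
    # Stage 3: estimate plausible body features under the masked region.
    state["anatomy"] = "estimated"
    return state

def render(state):
    # Stage 4: synthesize the final image, matching skin tone and lighting.
    state["rendered"] = True
    return state

def run_pipeline(image):
    state = {"image": image}
    for stage in (locate_clothes, mask_clothes, infer_anatomy, render):
        state = stage(state)
    return state

result = run_pipeline("input_photo")
print(result["rendered"])  # True
```

Splitting the job into specialized networks, rather than one end-to-end model, is a common way to keep each sub-task learnable, at the cost of the extra inference time Alberto mentions below.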

Alberto said, “All this makes processing slow (30 seconds in a normal computer), but this can be improved and can be accelerated in the future.”

DeepNude's results vary considerably depending on the input image; for example, it appears to work best on images where the person is already showing a lot of skin.


This helps the software build the color and tone of the body accurately. Image quality also matters, as it helps the algorithm work with shading.

According to Motherboard, the results vary dramatically, but when fed a well-lit, high-resolution image of a woman in a bikini facing the camera directly, the fake nude images are passably realistic.

The software's algorithm predicts and fills in details where clothing used to be: the angle of the breasts beneath the clothing, nipples, and shadows.

But it's not flawless. Most images, and especially low-resolution ones, produced visual artifacts.

In many cases DeepNude failed completely: photographs with unusual angles, lighting, or clothing seemed to throw off the neural networks it uses.


When Motherboard fed it an image of the cartoon character Jessica Rabbit, it distorted and destroyed the image altogether, throwing stray nipples into a blob of a figure.

DeepNude launched on June 23. Alberto, its creator, said he was inspired to create DeepNude by ads for gadgets like X-ray glasses that he saw while browsing magazines from the 1960s and ’70s, to which he had access during his childhood.

“Like everyone, I was fascinated by the idea that they could really exist and this memory remained,” he said. “About two years ago I discovered the potential of AI and started studying the basics.

When I found out that GAN networks were able to transform a daytime photo into a nighttime one, I realized that it would be possible to transform a dressed photo into a nude one.

Eureka. I realized that x-ray glasses are possible! Driven by fun and enthusiasm for that discovery, I did my first tests, obtaining interesting results.”

On Thursday afternoon, the DeepNude Twitter account announced that the app was dead: no other versions will be released and no one else will be granted use of the app.

“We created this project for users’ entertainment months ago,” he wrote in a statement attached to a tweet. “We thought we were selling a few sales every month in a controlled manner… We never thought it would become viral and we would not be able to control traffic.”

Alberto said that he had grappled with questions of morality and ethical use of this app. “Is this right? Can it hurt someone?” he said he asked himself.

“I think that what you can do with DeepNude, you can do it very well with Photoshop (after a few hours of the tutorial),” he said. If the technology is out there, he reasoned, someone else would eventually create it.

Since then, according to the statement, he’s decided that he didn’t want to be the one responsible for this technology.

“We don’t want to make money this way,” the statement said. “Surely some copies of DeepNude will be shared on the web, but we don’t want to be the ones to sell it.” He claimed that he’s just a “technology enthusiast,” motivated by curiosity and a desire to learn.
