This Horrifying App Undresses a Photo of Any Woman With a Single Click

Credit to Author: Samantha Cole | Date: Thu, 27 Jun 2019 11:48:42 +0000

A programmer created an application that uses neural networks to remove clothing from images of women, making them appear realistically nude.

The software, called DeepNude, takes a photo of a clothed person and creates a new, naked image of that same person. It swaps clothes for naked breasts and a vulva, and only works on images of women. When Motherboard tried using an image of a man, it replaced his pants with a vulva. While DeepNude works with varying levels of success on images of fully clothed women, it appears to work best on images where the person is already showing a lot of skin. We tested the app on dozens of photos and got the most convincing results on high-resolution images from Sports Illustrated Swimsuit issues.

Since Motherboard discovered deepfakes in late 2017, the media and politicians have focused on the dangers they pose as a disinformation tool. But the most devastating use of deepfakes has always been against women: whether experimenting with the technology on women's images without their consent, or maliciously spreading nonconsensual porn on the internet. DeepNude is an evolution of that technology, one whose results are easier and faster to create than deepfakes. DeepNude also dispenses with the idea that this technology can be used for anything other than claiming ownership over women's bodies.

“This is absolutely terrifying,” Katelyn Bowden, founder and CEO of revenge porn activism organization Badass, told Motherboard. “Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo. This tech should not be available to the public.”

This is an “invasion of sexual privacy,” Danielle Citron, professor of law at the University of Maryland Carey School of Law, who recently testified to Congress about the deepfake threat, told Motherboard.

“Yes, it isn’t your actual vagina, but… others think that they are seeing you naked,” she said. “As a deepfake victim said to me—it felt like thousands saw her naked, she felt her body wasn’t her own anymore.”


An image of Tyra Banks, before (left) and after (right) using the DeepNude app. Censoring via Motherboard

DeepNude launched on June 23 as a website showing a sample of how the software works, along with downloadable Windows and Linux applications.

Motherboard downloaded the application and tested it on a Windows machine. It installed and launched like any other Windows application and didn’t require technical expertise to use. In the free version of the app, the output images are partially covered with a large watermark. In a paid version, which costs $50, the watermark is removed, but a stamp that says “FAKE” is placed in the upper-left corner. (Cropping out the “FAKE” stamp or removing it with Photoshop would be very easy.)

Motherboard tested it on more than a dozen images of women and men, in varying states of dress, from fully clothed to string bikinis, and in a variety of skin tones. The results vary dramatically, but when fed a well-lit, high-resolution image of a woman in a bikini facing the camera directly, the fake nude images are passably realistic. The algorithm accurately fills in details where clothing used to be: the angles of the breasts beneath the clothing, nipples, and shadows.


An image of Natalie Portman, before (left) and after (right) using the DeepNude app. Censoring via Motherboard

But it’s not flawless. Most images, and low-resolution images especially, produced some visual artifacts. DeepNude failed entirely with some photographs that used weird angles, lighting, or clothing that seemed to throw off the neural network it uses. When we fed it an image of the cartoon character Jessica Rabbit, it distorted and destroyed the image altogether, throwing stray nipples into a blob of a figure.

In an email, the anonymous creator of DeepNude, who requested to go by the name Alberto, told Motherboard that the software is based on pix2pix, an open-source algorithm developed by University of California, Berkeley researchers in 2017. Pix2pix uses generative adversarial networks (GANs), which train on a huge dataset of images (in the case of DeepNude, more than 10,000 nude photos of women, the programmer said) and then pit a generator network against a discriminator that tries to spot its fakes, each improving against the other. Similar algorithms are used in deepfake videos, and by self-driving cars to “imagine” road scenarios.
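To give a sense of how pix2pix-style training works in general, here is a minimal, hypothetical PyTorch sketch of one adversarial training step on a benign paired task, such as the day-to-night photo translation Alberto mentions. This is not DeepNude’s code; the network sizes, names, and the stand-in tensors below are simplified assumptions for illustration only.

```python
# Minimal conditional-GAN (pix2pix-style) sketch. Illustrative only;
# all architecture choices here are simplified assumptions.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Small encoder-decoder: maps an input image to a translated image."""
    def __init__(self, channels=3):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(channels, 64, 4, stride=2, padding=1),   # 64x64 -> 32x32
            nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, stride=2, padding=1),        # 32x32 -> 16x16
            nn.BatchNorm2d(128),
            nn.LeakyReLU(0.2),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1),  # 16x16 -> 32x32
            nn.BatchNorm2d(64),
            nn.ReLU(),
            nn.ConvTranspose2d(64, channels, 4, stride=2, padding=1),  # -> 64x64
            nn.Tanh(),  # output pixels in [-1, 1]
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

class Discriminator(nn.Module):
    """PatchGAN-style critic: scores (input, output) pairs as real or fake."""
    def __init__(self, channels=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels * 2, 64, 4, stride=2, padding=1),
            nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, stride=1, padding=1),  # per-patch logits
        )

    def forward(self, src, tgt):
        return self.net(torch.cat([src, tgt], dim=1))

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce, l1 = nn.BCEWithLogitsLoss(), nn.L1Loss()

src = torch.randn(1, 3, 64, 64)  # stand-in for a daytime photo
tgt = torch.randn(1, 3, 64, 64)  # stand-in for the matching nighttime photo

# Discriminator step: learn to tell real pairs from generated ones.
fake = G(src).detach()
d_real, d_fake = D(src, tgt), D(src, fake)
d_loss = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
opt_d.zero_grad()
d_loss.backward()
opt_d.step()

# Generator step: fool the discriminator while staying close to the target.
fake = G(src)
pred = D(src, fake)
g_loss = bce(pred, torch.ones_like(pred)) + 100 * l1(fake, tgt)
opt_g.zero_grad()
g_loss.backward()
opt_g.step()
```

The L1 term keeps the generator’s output close to the paired target image, while the patch-based discriminator pushes it toward sharp, locally realistic textures; that combination is what lets pix2pix-style systems hallucinate plausible detail where the input gives only indirect cues.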

The algorithm only works with women, Alberto said, because images of nude women are easier to find online—but he’s hoping to create a male version, too.

“The networks are multiple, because each one has a different task: locate the clothes. Mask the clothes. Speculate anatomical positions. Render it,” he said. “All this makes processing slow (30 seconds in a normal computer), but this can be improved and accelerated in the future.”

Deepfake videos, by comparison, take hours or days to render a believable face-swapped video. Even for a skilled editor, manually using Photoshop to realistically change a clothed portrait to a nude one would take several minutes.

Why DeepNude was created

Alberto said he was inspired to create DeepNude by ads for gadgets like X-ray glasses that he saw in magazines from the 1960s and 70s, which he had access to as a child. The logo for DeepNude, a man wearing spiral glasses, is an homage to those ads.

“Like everyone, I was fascinated by the idea that they could really exist and this memory remained,” he said. “About two years ago I discovered the potential of AI and started studying the basics. When I found out that GAN networks were able to transform a daytime photo into a nighttime one, I realized that it would be possible to transform a dressed photo into a nude one. Eureka. I realized that x-ray glasses are possible! Driven by fun and enthusiasm for that discovery, I did my first tests, obtaining interesting results.”

Alberto said he continued to experiment out of “fun” and curiosity.

“I’m not a voyeur, I’m a technology enthusiast,” he said. “Continuing to improve the algorithm. Recently, also due to previous failures (other startups) and economic problems, I asked myself if I could have an economic return from this algorithm. That’s why I created DeepNude.”

Unprompted, he said he’s always asked himself whether the program should have ever been made: “Is this right? Can it hurt someone?” he asked.


An image of Gal Gadot, before (left) and after (right) using the DeepNude app. Censoring via Motherboard

An image of Kim Kardashian, before (left) and after (right) using the DeepNude app. Censoring via Motherboard

“I think that what you can do with DeepNude, you can do it very well with Photoshop (after a few hours of tutorial),” he said, noting that DeepNude doesn’t transmit images itself, only creates them and allows the user to do what they will with the results.

“I also said to myself: the technology is ready (within everyone’s reach),” he said. “So if someone has bad intentions, having DeepNude doesn’t change much… If I don’t do it, someone else will do it in a year.”

Better, and much worse, than deepfakes

In the year and a half since Motherboard discovered deepfakes on Reddit, the machine learning technology it employs has moved at breakneck speed. Algorithmic face-swaps have gone from requiring hundreds of images and days of processing time in late 2017, to requiring only a handful of images, or even just text inputs, and a few hours of time, in recent months.

Read more: It’s Getting Way Too Easy to Create Fake Videos of People’s Faces

Motherboard showed the DeepNude application to Hany Farid, a computer science professor at UC Berkeley who has become a widely cited expert on the digital forensics of deepfakes. Farid was shocked by the development, and by the ease with which it can be done.

“We are going to have to get better at detecting deepfakes, and academics and researchers are going to have to think more critically about how to better safeguard their technological advances so that they do not get weaponized and used in unintended and harmful ways,” Farid said. “In addition, social media platforms are going to have to think more carefully about how to define and enforce rules surrounding this content. And, our legislators are going to have to think about how to thoughtfully regulate in this space.”

Deepfakes have become a widespread, international phenomenon, but platform moderation and legislation have so far failed to keep up with this fast-moving technology. In the meantime, women are victimized by deepfakes and left behind by a more US-centric political narrative. Though deepfakes have been weaponized most often against unconsenting women, most headlines and political fear have focused on their fake news potential.

Even bills like the DEEPFAKES Accountability Act, introduced earlier this month, aren’t enough to stop this technology from hurting real people.

“It’s a real bind—deepfakes defy most state revenge porn laws because it’s not the victim’s own nudity depicted, but also our federal laws protect the companies and social media platforms where it proliferates,” attorney Carrie Goldberg, whose law firm specializes in revenge porn, told Motherboard. “It’s incumbent on the public to avoid consumption of what we call at my office humili-porn. Whether it’s revenge porn or deepfakes, don’t click or link or share or like! That’s how these sites make money. People need to stop letting their Id drive internet use and use the internet ethically and conscientiously.”

DeepNude is easier to use, and more easily accessible, than deepfakes have ever been. Whereas deepfakes require a lot of technical expertise, huge datasets, and access to expensive graphics cards, DeepNude is a consumer-facing app that is easier to install than most video games and can produce a believable nude in 30 seconds with the click of a single button.

Emanuel Maiberg contributed reporting to this article.

This article originally appeared on VICE US.
