



The results vary dramatically, but when fed a well-lit, high-resolution image of a woman in a bikini facing the camera directly, the fake nude images are passably realistic. The algorithm accurately fills in details where clothing used to be: the angles of the breasts beneath the clothing, nipples, and shadows. But it's not flawless.
Motherboard downloaded the application and tested it on a Windows machine. It installed and launched like any other Windows application and didn't require technical expertise to use. In the free version of the app, the output images are partially covered with a large watermark. In a paid version, which costs $50, the watermark is removed, but a stamp that says "FAKE" is placed in the upper-left corner. (Cropping out the "FAKE" stamp or removing it with Photoshop would be very easy.) Motherboard tested it on more than a dozen images of women and men, in varying states of dress, from fully clothed to string bikinis, and a variety of skin tones.
Since Motherboard discovered deepfakes in late 2017, the media and politicians have focused on the dangers they pose as a disinformation tool. But the most devastating use of deepfakes has always been in how they're used against women: whether to experiment with the technology using images without women's consent, or to maliciously spread nonconsensual porn on the internet.

DeepNude is an evolution of that technology that is easier to use and faster to create than deepfakes. DeepNude also dispenses with the idea that this technology can be used for anything other than claiming ownership over women's bodies. It launched on June 23 as a website that shows a sample of how the software works, along with a downloadable Windows and Linux application.

"This is absolutely terrifying," Katelyn Bowden, founder and CEO of revenge porn activism organization Badass, told Motherboard. "Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo. This tech should not be available to the public."

This is an "invasion of sexual privacy," Danielle Citron, professor of law at the University of Maryland Carey School of Law, who recently testified to Congress about the deepfake threat, told Motherboard. "Yes, it isn't your actual vagina, but… others think that they are seeing you naked," she said. "As a deepfake victim said to me, it felt like thousands saw her naked; she felt her body wasn't her own anymore."
