Deepfake Nudes: Everybody's naked (if you're a woman)

A new web site has made it easy to remove clothing from women in just a single digital photo. The results are supposedly amazingly realistic. No special knowledge, hardware, or software is needed, just a web browser and photos cribbed from anywhere on the internetz or your own camera gallery. The fakes can be made in bulk, for free, and shared easily.

I know HuffPo isn’t the most credible site, but the story details are corroborated on other news sites. The article also goes into the details of the referral program the site used to gain popularity.

“This is a really, really bleak situation,” said U.K.-based deepfake expert Henry Ajder. “The realism has improved massively,” he added, noting that deepfake technology is typically weaponized against everyday women, not just celebrities and influencers.

This reminds me of when music file sharing became an issue. Previously, bootleg tapes were certainly around but it took time and a bit of money to make them. Selling them was not a cakewalk either. But with the ability to rip CDs in bulk and then share those files via Napster, the issue blew wide open, and action was taken.

Women aren’t MP3s, so the issue is a much more sensitive one. There can be real-world consequences when nude photos, faked or not, are shared out on the Wild Wild Web: loss of jobs, loss of relationships, even suicide.

Personally, I don’t understand the obsession with seeing women naked without their consent, whether they’re women you know personally or complete strangers. There are billions of photos and videos depicting nudity and/or porn that are freely available. Why the need to apply some technology to make every woman naked?

Regardless, I don’t know that any laws address this in a way meaningful to the average citizen. I know that revenge porn is targeted, but I don’t know that something like deepfake nudes falls into that category.

What are your thoughts? Is it harmful? Should it be legal, or criminalized? Is it a passing fad that will fade on its own? What would you do if you found a deepfake nude of yourself out in the wild?

What fresh hell can this be?

Considering that the software can’t know what a woman really has underneath (i.e., does she have a scar on her abdomen? a birthmark? etc.), this is just software that slaps on what it assumes a woman looks like underneath.

It would most likely be used for political or social savagery, for harassment, or to get some celebrity, ex, neighbor, or politician in trouble, not really for porn, since people can always see the real thing by viewing actual porn.

Maybe this process can restore the former glory of my left boob, which has been majorly disfigured by a lumpectomy?

Nah, still a really bad idea.

This reminds me of a 50-year-old episode of That Girl, when Ann’s father sees a cut-and-paste photo of her in a skin mag.

You have it backwards: since photos and videos of anyone doing anything can be faked, you will not lose your job, relationship, or be driven to suicide, because you can now plausibly dismiss such evidence, which no longer serves as effective blackmail material.

I had been ignoring the deepfake trend for a while, until someone recently sent me a Tom Cruise deepfake that was insanely realistic. Now, I find the trend horrifying, especially when I imagine one of my kids getting deepfaked. It’s certainly going to take school bullying to new, ugly heights.

Even before this site came along, this was a huge problem. Deepfakes would replace a porn actress’s face with some famous person’s face, and then any context would get lost, meaning people would think it was real. I know there were a lot of actresses fighting this, saying that it could harm their careers and social standing.

I’d actually argue photos are less of a problem, as we’re all used to the idea of photoshop. Sure, this site makes it easier, but it’s not like someone couldn’t look for a similar-looking woman and photoshop someone else’s head onto her. Fake nudes of famous people have been a thing forever.

That’s not to say we shouldn’t have a discussion on whether sites like this should exist. Making it so quick and easy that you can do it to any woman can definitely be argued to make things worse. I can easily imagine this sort of thing being used in “revenge porn” stuff.

On the other hand, I was also going to make @DPRK’s point. If it becomes known that these fakes can be so easily made, then that makes that sort of thing actually more difficult to do. Even if it is real, people can claim it is fake. So I could see an argument that a site like this disempowers those who would engage in revenge porn.

On the other other hand, that is kinda scary, too. Not with nudes or porn specifically, but the more general trend where photographic and video proof is no longer usable, because it can be faked. I don’t think we’re to the point yet where fakes can’t be detected, but we could get there.

I’ve actually seen a proposal that we need to start training AI to be able to detect fakes. The problem I see with that is that AIs do tend to get things wrong a certain percentage of the time. However, if the AI could be trained to show why it thinks something is fake, I could see that working. And perhaps having it give out confidence levels would be useful. It would just need to not be considered definitive.
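To make the “confidence level” idea concrete, here’s a hand-wavy sketch (PyTorch; the tiny untrained CNN and the function name are just placeholders, not a real forensics model) where the detector reports a probability rather than a flat real/fake verdict:

```python
# Sketch only: a binary "is this image fake?" classifier whose sigmoid output
# is surfaced as a probability, so downstream users see a score, not a verdict.
import torch
import torch.nn as nn

detector = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 1),
)

def fake_probability(image: torch.Tensor) -> float:
    """Return P(fake) for a 3xHxW image tensor; meant as evidence, not proof."""
    with torch.no_grad():
        logit = detector(image.unsqueeze(0))
    return torch.sigmoid(logit).item()

print(fake_probability(torch.rand(3, 224, 224)))  # ~0.5-ish for this untrained toy
```

The “show why it thinks so” part is the hard bit; a score like this is only useful if it comes with some explanation and is never treated as the last word.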

I have to say, I’m going to try this site out. I’m not going to share pictures without consent, but the machine learning part of it is fascinating, and I will want to see how good it actually is.

That said, the best way to test it would be to use photos of people who I can then also get nudes of, so I may be sidestepping the main issues.

Not to mention that ads for this sort of software have been around since at least '97.

That is kind of the idea behind training a generative adversarial network: the generative network’s objective is to produce images that the discriminative network thinks are part of the true data distribution. So the same type of training that enables an AI to reliably detect fakes should also work to train it to generate increasingly undetectable fakes.
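For anyone curious what that looks like in practice, here’s a minimal sketch of a GAN training loop (PyTorch, with tiny placeholder networks and made-up 1-D “data” just to show the alternating objectives; a real image GAN would use convolutional nets and an actual dataset):

```python
# Minimal GAN sketch: the discriminator learns to separate real from generated
# samples, and the generator learns from that same signal to fool it.
import torch
import torch.nn as nn

latent_dim, data_dim = 16, 64
G = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, data_dim))
D = nn.Sequential(nn.Linear(data_dim, 128), nn.LeakyReLU(0.2), nn.Linear(128, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(1000):
    real = torch.randn(32, data_dim)      # placeholder for a batch of real samples
    fake = G(torch.randn(32, latent_dim))

    # Discriminator step: call real samples 1, generated samples 0.
    d_loss = bce(D(real), torch.ones(32, 1)) + bce(D(fake.detach()), torch.zeros(32, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: try to make the discriminator label its output as real.
    g_loss = bce(D(fake), torch.ones(32, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

The point is the symmetry: the very loss that teaches the discriminator to catch fakes is exactly the signal the generator uses to learn to beat it.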

That’s right: the basic version of this kind of software uses its imagination, as it were, to fill in the image.

^^^ This. The impact of such photos will be minimized. Katie Hill of the future can just roll her eyeballs if asked, instead of being driven from office by this kind of shit.

However, I would assume its imagination is trained based on seeing both clothed and unclothed versions of the same person. Thus it will be trying to learn what outlines and what visible-when-clothed physical characteristics correspond with what visible-when-nude characteristics.

So, on one hand, it may do better than a human would do. But, on the other hand, I would suspect the training data would be skewed towards those who are okay with being nude in public, which tends to be those with certain body types. It might be much more inaccurate on others.

That said, there is the fact that, if you take the IR filter off of a camera, the sensor can see into near infrared, which has been used by “x-ray” apps to see through some clothing. I doubt this AI is using that data, but not all such things are “imaginary.” I could see an app that uses the “x-ray” and then enhances it with AI.

I don’t think it will have this effect. People will still be mortified and outraged at having fake nudes of them distributed; it’s just human nature. And it’s likely that this sort of thing would befall few enough people (perhaps only a tiny percentage of women) that those women wouldn’t be able to fall back on social “everyone has it happen to them” to have any consolation.

Deepfake nudes aren’t that concerning to me. The human body is nothing to be ashamed about, and it’ll be nice for the puritanical strain in the US to be diminished. I’m more concerned with deepfake video and audio making evidence less and less reliable with regard to criminal behavior. At some point you might not be able to trust any evidence.

I disagree. We’re living in woke times (I don’t mean that sarcastically or negatively), and we’ve seen the outpouring of sympathy toward celebs who have had their actual nudes leaked to the press.

I work for a large company. Some idiot disgruntled ex-bf decided to share a police mug shot of his ex-gf, who worked for the same company he did. Not only did she NOT get fired, the asshole ex-bf did, along with about 7 other people who shared the mugshot on company email.

Oh, and the police mug shot? Yeah, she was falsely arrested AND brutalized by police. Charges were dropped, but she decided not to sue because she didn’t want to relive the nightmare.

I don’t think F-500 companies are so quick to fire anymore just to avoid a PR nightmare. I think the trend is they are jumping on the woke bandwagon. At least I HOPE that’s the trend.

I don’t know what “nudifier” HuffPost is referring to but I just tried the one mentioned by Metro, using the same image as in the ad from '97 mentioned above. The result was most definitely not realistic.

I expect at some point it will be common for cameras to cryptographically sign images and videos. It will be provable that this camera captured that exact video at what the camera believed was the time and location stored in the metadata. (“Provable” to the extent of the security of the implementation of the cryptographic mechanism.)
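If you want a concrete picture, here’s a rough sketch of the idea in Python using Ed25519 from the cryptography library (the key handling, metadata fields, and function names are just placeholders; a real camera would keep the private key in a secure element and distribute the public key through the manufacturer):

```python
# Sketch: bind an image's hash to the metadata the camera believes is true,
# sign the bundle with a per-device key, and let anyone verify it later.
import hashlib, json, time
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

device_key = Ed25519PrivateKey.generate()   # stand-in for a key baked into the camera
public_key = device_key.public_key()

def sign_capture(image_bytes: bytes, lat: float, lon: float) -> dict:
    record = {
        "sha256": hashlib.sha256(image_bytes).hexdigest(),
        "timestamp": time.time(),
        "lat": lat,
        "lon": lon,
    }
    payload = json.dumps(record, sort_keys=True).encode()
    return {"record": record, "signature": device_key.sign(payload)}

def verify_capture(image_bytes: bytes, capture: dict) -> bool:
    record = capture["record"]
    if hashlib.sha256(image_bytes).hexdigest() != record["sha256"]:
        return False                         # image was altered after capture
    payload = json.dumps(record, sort_keys=True).encode()
    try:
        public_key.verify(capture["signature"], payload)
        return True
    except InvalidSignature:
        return False
```

Of course, as noted below, this only proves what the camera saw and believed, not that what it saw wasn’t itself staged or fed to it.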

At some point in the future faked nudes may be so easy to produce and so common that they are a non-event. We’re not there yet. We’re heading into an ugly period of overlap, where fakes are easy to produce but there is still significant stigma associated with them.

And this doesn’t have to be limited to just nudes, of course. Replace nudes with pictures of people in black-face or attending the “wrong” type of political rally, and the results could be just as damaging.

Yep, this is a great worry down the road. The same world where you can say anything embarrassing is likely faked is the one where anything actually incriminating can be faked…

Eh, yes and no. For as long as society has existed, eyewitness testimony has been a major form of evidence for crimes. Even though witnesses can both lie and be honestly mistaken. The witness says “I saw him do it!”, and the officers of the court ask, essentially, “Are you telling the truth? And are you sure?”, and then judge how credible the answers are.

Well, nowadays, a security camera can capture footage of a person committing a crime. And the owners of that camera just might be able to fake the footage so it looks like someone else, but that’s still a lot harder than the lying (or honest mistakes) that a human witness can do. And you can still call the owners of the camera to the stand, and ask them “Are you lying?”, and judge their credibility. That said…

This wouldn’t work. If nothing else, you could fake your video, show it on a screen, and point the camera at the screen and re-record it. Or if you wanted higher quality, you could take apart the camera and feed your faked video in directly, in place of the sensor. And that’s all without getting a leaked private key for the camera, which is bound to happen in at least a few cases.

I guess those who pay for the service – between $10 & $40 – get access to higher quality nudification.