What to do about the Deepfake technology?

To be fair, assuming the worst is about 95% accurate in most applications. And nobody has ever been proved wrong saying “it’ll all end in tears”.

Christ, I think I need a drink.

Not in its current state. The really convincing videos are only a few seconds long, and they're of popular celebrities for whom there are thousands of clear, well-lit, high-definition images available, plus tons of high-def video to pull more images from at every angle and facial expression. And it only really works if the subject is the only face in the frame. Trying to make a ten-minute Deepfake video of your next-door neighbor buying illegal panda pelts will be a lot less convincing.

Eventually, video used as evidence may require forensic investigation to determine whether it's legitimate or has been altered, if someone challenges its authenticity. But photos are still considered evidence, and we've been doctoring those since the dawn of photography and have only gotten better at it.

Is there Deepfake Beefcake?

It seems as if there ought to be Deepfake Beefcake. It is porn, after all.
I just like saying Deepfake Beefcake

Deepfake Beefcake
Deepfake Beefcake
Deepfake Beefcake
Deepfake Beefcake

Is there a way to tamper-proof a video?

I can’t imagine how. You take a target video and extract a bunch of images of the target’s face. Then you take your thousands of images of the celebrity (for the sake of example) from different angles, lighting, etc., and let the computer “train” for 12-24 hours to figure out how to mask the celebrity’s face over the target’s. It’s much easier if the two people involved look similar to begin with. Then the software creates a new video with the celebrity’s face masked over the target’s, with varying degrees of success. I can’t imagine a way to prevent this for video in any of the standard formats.
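For a rough sense of the first step described above, here’s a minimal sketch of pulling frames out of a target video and cropping out the faces to build a training set. It assumes OpenCV is installed; the file names and output directory are placeholders, and the actual deepfake tools bundle their own face detectors and training pipelines, so treat this as an illustration of the data-prep idea only.

```python
# Sketch only: extract frames from a (hypothetical) target video and save
# cropped faces that a face-swap model could later train on.
import os
import cv2

os.makedirs("faces", exist_ok=True)                  # output dir for crops
cap = cv2.VideoCapture("target_video.mp4")           # placeholder input file
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break                                        # end of video
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for i, (x, y, w, h) in enumerate(faces):
        # Save each detected face as a separate training image.
        cv2.imwrite(f"faces/face_{frame_idx:06d}_{i}.png",
                    frame[y:y + h, x:x + w])
    frame_idx += 1
cap.release()
```

The actual “masking” step is the part that takes the 12-24 hours of training mentioned above; this only shows how the pile of face images gets produced in the first place.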

I’ll note that I’m not advocating for the technology and I’ve never used it. I don’t even have an Nvidia GPU (which is required; AMD cards won’t work because of the CUDA-based framework the software uses). In “Streisand Effect” fashion, I was hit one morning with multiple news articles about how Discord shut down a channel related to it, found it interesting, and so I read up a bit about it.

All this reminds me of the “Running Man” movie with Arnold Schwarzenegger.

Then someone owes you an apology.

What’s the source of the video? Did you download it off the internet somewhere? Or was it taken by cops? Or by a store surveillance camera?

Then whatever video image is shown in court is assessed the same way any other evidence is assessed.

Did the cops fake the video? Is there any evidence that they faked the video? Cops can testify, “I saw the defendant pull out a gun and shoot his wife.” Evidence like that is very easily faked: all you have to do is say it happened. So easily created fake videos are just more of the same.

All that changes is that juries won’t take videos at face value anymore. But they don’t take still pictures at face value, because photos have been fakeable for years. Just like documents and eyewitness testimony.

Reddit has banned the Deepfakes subreddit for content policy violations. I imagine now they’ll wind up on 4chan or some other less reputable corner of the internet.

I’m not sure there is a silver bullet. We have to understand that it’s part of the world now, like any other kind of forgery, and adjust our levels of trust accordingly. It will become easier for innocent people to be tricked, so people will need to approach life with less innocence.

Right; it’s like trying to ban copied music. It’s swimming against the tide.
Even if we could block videos like this for now, there’ll be a point where they can modify the faces enough that there’s plausible deniability about whether it was sourced from celebrity photos.

Better we just understand that soon video will be like photos: a single, out-of-context video may not be proof of much.

A pertinent and disturbing take from Lawfare. It begins with “We are truly fucked.”

Deep Fakes: A Looming Crisis for National Security, Democracy and Privacy?

:eek: How long until they make it here?

I’ve seen real-time video replacement, and I think I’ve seen real-time audio replacement as well.

Here’s one example:

My god, what if they start messing with our Smilies?

Lawfare is an excellent site and that article is fantastic.

I remember the Motherboard story it cribbed the phrase from, tho: We Are Truly Fucked: Everyone Is Making AI-Generated Fake Porn Now, which was the follow-up to an earlier article, AI-Assisted Fake Porn Is Here and We’re All Fucked; both are worthwhile reading IMO.

Could something like blockchain be used to provide continuity or chain of evidence?
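To illustrate the idea behind that question, here’s a toy sketch of the underlying mechanism: hash a video file in chunks and chain the hashes, so altering any chunk after the fact breaks every hash that follows it. This is only the concept, not a real chain-of-custody system; the file name and chunk size are assumptions, and an actual scheme would need the final digest published or timestamped somewhere third parties can check.

```python
# Toy sketch: chained SHA-256 digests over a video file, so later edits to
# any chunk invalidate the final digest.
import hashlib

def chained_hashes(path, chunk_size=1 << 20):
    """Return the list of chained SHA-256 digests for a file, in order."""
    digests = []
    prev = b""                                   # "genesis" link
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            prev = hashlib.sha256(prev + chunk).digest()
            digests.append(prev.hex())
    return digests

# Publishing this final digest at recording time would let anyone later
# verify the footage hadn't been re-encoded or edited since that moment.
print(chained_hashes("surveillance_clip.mp4")[-1])   # placeholder file name
```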

I read the article, and the more serious problem to me, by far, is that people will start to distrust genuine evidence. I just can’t see fake video being absolutely trusted indefinitely. Yeah, yeah, there’s fake news, but I can definitely see liberals using it to troll conservatives much more than I can with news. Besides, as I believe has been pointed out before, Photoshop exists, and I don’t think people generally unquestioningly trust photographs these days. Whether video is different is, of course, the point of this thread, but I don’t see it as much of a slam dunk as Lawfare does, at least in that aspect.