This is Jim Carrey’s face digitally overlaid on Jack Nicholson’s performance in The Shining. It’s entirely computer generated aside from the input data.
Here’s more information about Deepfake, the technology used to create this.
Wow. All I can say is — the porn world just changed forever.
The Corridor Crew did something similar (but even more impressive IMO) just last month; I posted about it in the What’s new, Atlas? thread.
Many states are outlawing this use of technology, primarily for the revenge porn angle.
This is better than the previous gen that was reported in the last ~2 years or so. Those were easier to spot because they avoided scenes where the angle of the face suddenly changed. You can see that here if you know what you’re looking for, but I don’t think I could have spotted it if I hadn’t been told. I mean, if I didn’t know it should have been Jack Nicholson.
It’s pretty damn clever, is what I’m saying.
The 2020 election is going to be fun. Facebook is going to be filled with deepfake videos of the candidates doing and saying whatever their opposition believes will rile up their voters. I already know that my FIL will be sharing every single one.
I assume that Jack Nicholson and Jim Carrey were chosen because they have similar facial structure, and this wouldn’t work nearly as well for any random two people. So if you wanted a fake video of your political opponent, you’d have to start by locating a similar-looking person and having them act out your scene.
I think they’re way ahead of you.
Hasn’t SnapChat been doing this for quite a while already?
The Snapchat filter. This one cracks me up.
Nope. It helps because the computer has fewer alterations to make, but that won’t be an issue with better machines and better-trained AIs. The Tom Cruise video was made after just 3 days of training with a very limited number of reference images. Imagine a machine that trained for 300 days on a virtually unlimited number of images: you won’t be able to tell it isn’t real.
OK, nothing hugely new, but I’m not going to turn down a chance to use that title.
It’s Bruce Lee’s face deepfaked onto Neo during the Neo-Morpheus sparring scene in The Matrix.
It’s old news there. This story is from eighteen months ago. But there’s a huge difference between a 5-15 second clip with a bajillion sources for data and limited angles versus faking someone across several minutes and multiple angles, expressions, etc.
The flip side is that legitimate video evidence will become completely useless absent the ability of ordinary folks to analyze the video for tampering. So 45 really could shoot someone on 5th Avenue in front of a video camera, and later just call it fake.
This world gets scarier every day.
Just remember.
Still with us ZonexandScout? Any thoughts on the current state of the art?
ZonexandScout just posted today, but there’s one point he made I want to amplify and elaborate on: Video forensics is just like document forensics in that provenance matters. Where did the file come from? What can we deduce about where and when it would have been made? Is it consistent with other known facts? If someone is alleged to have signed a document in Paris two weeks ago when they’ve been in Berlin for the past month, we know, beyond any reasonable doubt, that the Paris document is a forgery no matter how good it looks.

Similarly, if a video just pops up out of nowhere and can’t account for its own existence, that’s a big strike against it. Every thing, every piece of media and every physical object, has a story attached to it, brief as it may be, and unless a good, solid, consistent story can be created for a video, the video is likely fake even if it looks very real.
The problem with Deepfake is the increased ease of creating fakes which will stand up to minimal scrutiny. You can’t Deepfake a consistent story which will stand up to investigation of the people alleged to have been involved, but you don’t need to if your fake video makes some political faction yell “I KNEW IT!” and share it because it’s too good not to. Some groups, like QAnon, are so deep in that kind of thinking that they’ll put faith in absolute idiocy as long as it tickles their preconceptions. Others might need more of an excuse to believe, but are ultimately just as credulous.
The lesson is to be extra willing to disbelieve things you want to be true. Wait for good news sources to report on it, and attempt to poke holes in any good story.
[quote=“Wesley_Clark, post:17, topic:836803”]
I personally like this video the guy did of a deep fake of Stallone playing the terminator.
[/quote]

Stallone ain’t that tall.