What to do about the Deepfake technology?

But that is not applying critical thinking skills; that is just giving up and assuming the worst.

All I know is, it is not considered an acceptable substitute for the Prime Minister having full unsimulated sexual intercourse with a pig.

I basically agree with watchwolf49’s approach, with the caveat that rather than saying “I assume both are fake” I would say “I make no judgment as to which is fake without further information.”

That’s critical thinking–learning to have awareness enough of one’s information sources and thought processes to know when to withhold judgment and how to get into a better position such that judgment can finally be passed.

I’m not sure what a couple of blurry low-res pictures have to do with anything. Heck, people can hand paint more realistic images than those. I think by “critical thinking” the intent is (in part) “Hey, Emma Watson probably didn’t participate in a professionally filmed gang bang with ten dudes, therefore this video probably isn’t really her.” On the news side, there may be more effort put into using technology to validate a video’s authenticity before publishing it, or more reliance on corroborating accounts, video of the same event from different angles, etc. Part of me thinks this could be dangerous in “hidden video” situations where no one else was present, but then we already have half the country still convinced that ACORN was dealing with pimps based on heavily-edited and doctored videos, so I’m not sure what changes.

On the sex end of things, maybe the change will have to be social and we stop clutching our pearls at the idea of someone having sex, masturbating, etc. Not saying that it mitigates the issues of consent or right to your own images but I don’t think there’s a technological fix that’ll stop people from making hyper-realistic porn of celebrity figures and it’ll only keep getting easier as technology advances.

Well, actually, it was more like “critical thinking suggests this image of the president handing the nuclear codes over to the Russian president is probably a fake.”

The “be less obsessed with celebrities” part relates to suggestions that Emma Watson hosted a gangbang and such. Even if it turned out to be true… why should anyone other than the participants care, let alone someone who’s never met and will likely never meet Watson?

A dead pig, mind.

So if there was squealing, it wasn’t coming from the pig?

I think it’s the one on the right. Where the sky meets the top of the trees just doesn’t look right to me.

Point is, soon you won’t be able to tell. Technology is advancing to the point where it will be impossible to differentiate between images that are real and those that are entirely fictional (but plausible) works of a computer, and there’s nothing that can reasonably be done to stop that development from taking place.

There will be no “I can tell from the pixels”. It will be possible to fake any image or video.

Which means that we will simply not be able to trust video evidence to the extent we do currently.

Invest heavily.

So what do we do about it? Do we have to trust journalistic integrity or do we demand police-style chain of custody level evidence?

In porn? :slight_smile:

I think a person should have the right to sue if someone uses their image in a way they don’t agree to.

Well, maybe, but the way I was depicted in Dopers Do Dallas was kinda flattering.

Which is a reason why this particular program is arguably a good thing: the technology to do it is going to exist one way or another, so making it commonly known and available that it can be done is good.

How to combat it? Depends on what you’re combatting. Random video of a celebrity’s head put into a porn vid just for people’s sexual fantasies? Something in metadata somewhere is fine.

A deliberate attempt to falsify evidence for nefarious purposes? Much more difficult, obviously. But if I have a photocopy of a typewritten letter, how do I know whether that’s legitimate or not?
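The simplest version of the “something in metadata” idea is a cryptographic fingerprint published at capture time: if anyone later tampers with the file, the digest won’t match. Here’s a minimal Python sketch (the function names and the workflow of publishing a digest alongside a video are my own illustration, not any existing standard); note it only proves a copy hasn’t changed since the digest was published, not that the original footage was honest.

```python
import hashlib

def fingerprint(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 hex digest of a file, reading it in chunks
    so large video files don't have to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_published(path: str, published_digest: str) -> bool:
    """Check a local copy against a digest the source published
    at capture time (e.g. alongside the original upload)."""
    return fingerprint(path) == published_digest
```

Real chain-of-custody schemes go further by having the camera or publisher cryptographically *sign* the digest, so the check also tells you who vouched for the file, but the hash comparison above is the core of it.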

Indeed. When I was in college, I lived in an area that was heavily agricultural. Every year, the TV news would have a piece on the “plight of the poor migrant workers” complete with film of ragged people in shacks. And they were technically migrant workers. The more accurate term is “winos”. Right behind the film crews were the houses of the actual migrant pickers, where they spent the off season - large, nicely kept ones, with shiny new trucks and travel trailers in front. :rolleyes:

A few years later, my wife and I were going to a Salvadorean restaurant frequently. This was during a period of insurgency in El Salvador, and the news showed shots of some ragged dirty kids to show the hardships - and a street sign. I remembered the street names the next time we were in the restaurant and looked up the location on a map they had on the wall. (LONG before Google Maps) Turns out the crew had their backs to a city park while filming. Then I realized the picture next to the map was that park. Folks, I have not seen many US city parks that were that nice. :dubious:

Journalistic integrity is an oxymoron, at least from my observation.

There are several different technologies all converging on a future where a single video of an event is going to be largely useless for telling whether the event happened.

For many of us, it’s not going to change things very much. Right now, there’s a degree of trust and verification that happens with the news I read. There are a number of sources that broadly get things right most of the time – they tend to agree on events that happened, they include the actual data, they try to find corroborating sources and they issue retractions and corrections if something goes wrong.
Video will eventually be in that mix; it will depend what news sites are running it and why they believe it’s legit.

Or, if you’re a person committed to a particular viewpoint, you now can ignore conflicting video and believe supporting video. But that’s already how some people think, it’s not a new phenomenon.


It’s also interesting to think about the value of one’s likeness. Right now having a well-known face is of course extremely valuable. And in the near term that value may go up, because you could sell your perfect, youthful likeness indefinitely. A model with a strikingly attractive face at 22 could keep saying “I’m worth it” in front of whatever overpriced tat they are selling, forever, and I think our culture will also shift such that no one will care much that it’s a digital immortal. We’ve already seen movies where actors play both their current selves and more youthful versions when that supports the plot.
Longer-term though, likenesses will lose a lot of value once we can make and convincingly animate entirely digital faces that are unique and have character.

Will Deepfake now discredit video evidence in court?

And the answer is (drumroll please) personal ownership of identity enforceable under law! Which the US does not have now, but other countries have to some degree in the form of privacy rights.

Sure, you can now sue someone for harm caused by their use of your personally identifiable information, but you have to prove the harm. It’s not de facto illegal to do so. This opens the proverbial floodgates for bad actors to create products like the ones discussed above for profit and use a portion of that profit for legal defense. That maximizes the potential for harm and creates the highest barrier to corrective action. Not good.

I hope this will be the spur to action on this issue. But advertisers will scream because they’re invested in using your personal information via Big Data to provide marketing analyses, and that’s big bucks. So there’s an industry dedicated to opposing identity ownership laws (in the US).