The adaptation will be people learning that you can’t automatically trust photos or videos. Don’t be so credulous. (Of course, it was already possible to heavily manipulate photos and video; it just required more time and skill. But nobody regulates Photoshop or Maya, and nobody should.)
Who’s being credulous about what?
The adaptation is what happens after people learn you can’t automatically trust photos or videos.
The thing is, I would have thought the same of the Internet in general: that it’s so full of bullshit, people would adapt and not trust everything they read/hear/see, or at least try to verify it.
I’ve been sorely disappointed with that assumption. If anything, it’s paradoxically had the opposite effect.
I’m saying the adaptation to videos and photos that you can’t trust is to learn that you can’t trust photos and videos.
But if, for example as you mentioned earlier, someone used fake product photos on Amazon, you would have the exact same options as you would have with any other type of consumer fraud you are the victim of. Complain to Amazon. Complain to your credit card company. Take it up with police.
(Amazon BTW, is filled with fake products without the need for AI.)
Right. And how much more funding are we giving the police to cope with the rise in fraud claims? Is that coming out of other parts of their budget, or other government spend, or are we raising taxes? Or are they just not going to deal with the bulk of the new claims? How much more are Amazon and credit card companies charging to cover their rising costs? Or are they rewriting their terms to put it all on the buyer? Does this create a market for lemons problem?
Low trust societies are bad. They impose unnecessary transaction costs that benefit nobody. Learning that we can’t trust visual media is not a solution; it is the problem.
Too bad? That cat was out of the bag decades ago, when video editing techniques got convincing enough to fake videos. AI is a difference of scale, not kind.
You already could not trust visual media, for decades now, unless it was authenticated (in which case you are placing your trust in the authenticator, not the video itself). None of that is going to change.
But a difference of scale really matters! That’s why we’re all so excited about this new technology, and why we think it’ll make a huge difference to people’s livelihoods, for good and ill. It can’t simultaneously be an amazing revolution that’s going to democratise and cheapen authentic-looking image creation, and also no big deal compared to the relative minority of highly trained Photoshop experts we have today, labouring painstakingly to eliminate artifacts.
It’s not a big deal in one specific area - it does not make it so we can no longer trust visual media, because in 2016 and even 2000 we also could not trust visual media.
But this is just market for lemons stuff
If the chances of buying a lemon of a car are 1 in 1000000, then I just buy a car.
If they’re 1 in 1000 I do my homework
If they’re in 1 in 10 I don’t buy a car.
I don’t want to not buy a car, but the chances that I get stuck with a lemon are so high that I prefer to stick with my shitty old one.
Similarly, I nowadays trust most visual images unless they’re so astonishing, or so conveniently in line with what I want to see, that it makes me suspicious. But if the ratio of fake images to real ones reaches some threshold, I will switch to not trusting any image by default without doing the work of verification, and that is a net loss.
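The threshold logic in the car example can be sketched as a toy expected-value calculation. Every number here is invented for illustration (the surplus, the cost of a lemon, the cost of "doing my homework", and the rate at which homework misses a lemon are not from the thread):

```python
# Toy model of the market-for-lemons thresholds; all figures are made up.
SURPLUS = 300          # how much a good car is worth to me above its price
LEMON_LOSS = 300_000   # cost of getting stuck with a lemon (exaggerated)
VERIFY_COST = 200      # cost of doing my homework before buying
MISS_RATE = 0.05       # homework still misses 5% of lemons

def best_strategy(p_lemon):
    """Return the cheapest strategy for a given chance of buying a lemon."""
    trust = SURPLUS - p_lemon * LEMON_LOSS
    verify = SURPLUS - VERIFY_COST - p_lemon * MISS_RATE * LEMON_LOSS
    walk_away = 0.0  # keep the shitty old car
    return max([("just buy", trust),
                ("do my homework", verify),
                ("don't buy a car", walk_away)],
               key=lambda kv: kv[1])[0]

for p in (1 / 1_000_000, 1 / 1_000, 1 / 10):
    print(f"{p:.6f}: {best_strategy(p)}")
```

With these particular numbers the three regimes fall out exactly as described: at one in a million you just buy, at one in a thousand verification pays for itself, and at one in ten even verification is too risky, so the market loses a willing buyer. The same structure applies to images, with verification cost standing in for the work of checking a source.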
That’s where we differ. I assume everything I see online is photoshopped or at the very least misleadingly cropped unless I can verify the source.
Good example!
It used to be difficult to fake photos, and nearly impossible to fake videos/film. In the recent past it’s become relatively easy to manipulate photos, and difficult but not impossible to manipulate video. In both cases, though, they’re generally edits to real footage. Now it’s suddenly possible to generate completely made-up images that look real. Completely made-up videos are not far behind. That’s an enormous jump in scale. Previously we could generally trust but verify. Now we’re rapidly heading into a world where we’ll have to distrust everything without rock solid guarantees. How that’s achieved is another discussion.
AI is going to strengthen authoritarianism. It will also strengthen original sources.
A trillion worthless images are still worthless. That you created them with AI? Who cares; they have no value. Anything you can create can be created by someone else. That is what people are missing.
People are going to be looking for value. So they will need to be more trusting of source. This strengthens authoritarianism. Trust only me. Now that source may be lying, but when the world is lying people are going to make choices. AI is going to democratize nothing. Quite the opposite.
I’m thinking here about past instances of technology changes that produced social changes.
Humans have had some ability to destroy their own environment for thousands of years. Honestly, other animals can do the same. Here I am specifically thinking about changes in hunting and trapping.
Hunters and trappers used to have a lot more latitude in what they did. Partially because the limited tech kept them from doing too much. Then, in the 1800s, the tech improved. So hunters and trappers could turn commercial and harvest the wild animal populations for sale at market. This proceeded to decimate wild animal populations to near or actual extinction. So those nascent industries went away, never to return. Hunting and trapping became highly regulated. The tech continued to improve, but you couldn’t use it to do what people used to do, because we need to have a livable world.
There is destruction coming. Not all of it can even be tolerated. We will have to build and maintain a new livable world. I don’t care what your new tech does, if it is destructive you aren’t going to be able to use it. We are entering times when people saying “step on the brakes” are going to be RIGHT more often than the “full speed ahead” people. Some of the people saying “step on the brakes” are going to be doing so for their own bad purposes. I’m seeking to be in a group that does it for the right reasons. There’s going to be a lot to do.
Pray tell, what technology developed in the 1800s made trapping so much more efficient?
The explosive growth of the fur trade and subsequent crash in animal populations was brought about by societal changes.
I’m no expert, but probably improvements in transportation (trains) made it more feasible.
I think with image, video, and audio manipulation there will likely be technology-based increased security. The original, unaltered version will carry security stamping that can’t easily be removed. If you or an AI provide edits, those will be stamped: video gets an abridged stamp, a cropped stamp, a tinted stamp. Insertion or warping of content gets an “art” stamp, aka “this isn’t close enough to real any more”, whether it’s you or an AI making the changes.
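The stamping idea above could work as a signed chain of edit records, where each stamp binds to the previous one so intermediate edits can’t be silently dropped. This is a rough sketch only; the scheme, key handling, and field names are all hypothetical (real work in this direction exists, e.g. C2PA content provenance, but this is not an implementation of it):

```python
# Hypothetical sketch of chained edit "stamps"; not any real standard.
import hashlib
import hmac
import json

SIGNING_KEY = b"camera-or-editor-secret"  # stands in for a real private key

def stamp(record: dict, prev_stamp: str = "") -> str:
    """Sign one edit record, chained to the previous stamp so that
    earlier edits can't be removed without breaking the chain."""
    msg = (prev_stamp + json.dumps(record, sort_keys=True)).encode()
    return hmac.new(SIGNING_KEY, msg, hashlib.sha256).hexdigest()

original = b"...raw image bytes..."
capture = stamp({"op": "capture",
                 "sha256": hashlib.sha256(original).hexdigest()})
cropped = stamp({"op": "crop", "region": [0, 0, 800, 600]},
                prev_stamp=capture)
# Generative insertion pushes the file past "close enough to real",
# so it gets the "art" stamp regardless of who made the change.
art = stamp({"op": "generative-insert", "label": "art"},
            prev_stamp=cropped)
print(capture, cropped, art, sep="\n")
```

A verifier holding the key (or, in a real system, the signer’s public key) can recompute the chain and confirm both what the original was and which classes of edits were applied, which is the whole point of the abridged/cropped/art distinction.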
Multiple curb-high barriers and fraud/defamation laws will take care of most of it.
The true, real versions of things have value. There is a marketplace for them. For historical content, there will be a marketplace for sourcing and vetting content.
How we do things will be changed. But this has happened a million times before and will happen again. Eventually you are left with a few people bemoaning the loss of the “wild west old days” and most people just move on and accept it.
The “let 'er rip” days of technology never really existed. AI will be no different.
At the risk of striving too hard for topicality, this story seems like a useful illustration of some of the issues that arise when we lose sight of the value of an authentic image:
Now, this isn’t an AI generated image, AFAIK. The most likely explanation is that they took a bunch of photos of the family and edited together a composite so everyone was smiling at the same time. Which I’m sure is reasonably normal practice but totally overlooks the purpose of the photo. Imagine the conversation:
Palace Person 1: Princess Kate is recovering from her operation and not doing public appearances.
Palace Person 2: Yes, I hope she feels better soon.
PP1: Me too. But people are so used to seeing pictures and video of her that the fact they haven’t seen any for a few weeks is driving them mad and a lot of silly but also upsetting and annoying conspiracy theories are doing the rounds.
PP2: So what should we do?
PP1: We’ll arrange a family photoshoot and share it. A genuine authentic picture that definitely shows there is nothing funny going on, that people can totally rely on and will put all these silly conspiracy theories to bed once and for all. The camera never lies.
PP2: Great idea! Of course, it’s difficult to take a great pic of three kids at once but that will just go to show…
PP1: Good point. So, given everything I’ve just said, do make sure you create a composite, manipulated image full of artifacts that slavishly obsessed conspiraloons can find, and which are so obvious that even sensible people will have to agree that this isn’t in fact a real photo.
PP2: Won’t that…somewhat undercut the whole point of taking this photo?
PP1: WE HAVE AN IMAGE TO MAINTAIN
For a news photo? No. That is not a “normal practice.” It is 100% considered unethical for such a composite to be released as an editorial photo, if that is what happened. Period. That could and should get you fired. ETA: I worked for AFP, who were mentioned, in the late-90s, and that’s exactly why they killed it. They found evidence of compositing. A no-no if there is one for news agencies like them.
She did it herself.
Yeah, this is the point. It’s unethical for a news professional to use a composite and present it as a genuine capture of reality. It is perfectly normal for a humble mum such as Princess Kate to monkey around with her photos to present the best image. There’s a clash here between different views of what images are for and why we create them that definitely predates AI generation (but doesn’t predate e.g. instagram filters). That clash is only going to become more dramatic the easier it gets to produce inauthentic images and the more we learn to value authentic ones, if only for their comparative rarity.