Today I got fooled by an AI video

Welp, I almost got fooled by a Facebook AI reel this morning, but on closer examination I think it is fake.

No, it’s real. My mother-in-law has the same appearance.

Where toothpaste comes from.

Probably not real.

Indeed - I think it’s probably going to advance to some sort of hyper-reality state where not only do the fake videos look real, but they also include subtle psychological programming that makes humans tend to believe them more than they would believe actual real footage.

ETA: I realise the above maybe sounds far-fetched, but by way of analogy: if I asserted that the above statement was backed by rigorous research, and I linked the words ‘rigorous research’ here to a video of Rick Astley, people might read this post and be more inclined to believe it just because it contains what appears to be a citation, without ever clicking on the supposed citation.

That sort of technique could probably be applied to video - embedding features of some kind that act as a pre-emptive mark of authenticity.

Happy Halloween, everyone.

Ah, I see: the random rectangular blurs are hiding a “Sora” label that’s automatically added to the videos. Except that they didn’t blur out all of them, and some of the clips don’t have the label at all.

Quite possible that at least some of them are real (except, probably, the penguin). I’d guess that someone saw a handful of real ones, and decided to make a bunch more to fill out a long video.

I’d have liked to see one where the raccoon just gives the witch or whatever an annoyed glare and keeps on raiding the candy.

There are already (AI) tools that can cleanly remove the watermark.

Another trend is door-camera videos of dogs and cats firing various types of military weapons while a human tackles them to take the weapon away. I haven’t seen a compilation of those, but here’s one example.

I saw a video of a cat walking with a cane, and yeah, that’s not real.

Slightly OT, but I encountered what might be a case of AI being fooled by AI the other day.

I made an AI-generated song to use as a backing track for a video. I specified “blues rock” as the genre and input some of the lyrics from “Dem Dry Bones”. I tried several different websites, picked the one I liked best, uploaded the video to YouTube…and got a copyright notice. Apparently the AI-generated guitar solo matched the lick from “Schoolboy” by Fenton Robinson. No harm done - it wasn’t a copyright strike, and I don’t monetize my videos anyway.

Here’s the odd part: I uploaded the video to my backup YouTube channel and got a copyright notice again. But…this time it was for “What My Mama Told Me” by Junior Wells. I found it curious that YouTube’s AI algorithm found two different matches for the same AI-generated guitar solo.

Odds are near 100% that every last one of them is from Sora2. That setup is a currently popular meme in the Sora2 feed. Hundreds of different users are doing variations of it, and the platform encourages that with a convenient “Remix” button.