I used to get a lot of photos and memes on Facebook showing homes and/or cabins that are AI generated. They typically feature something “cool and unique”, but AI has apparently not yet figured out things like “scale”, and there aren’t always entry points. I haven’t seen any in quite some time, though. Eventually, if you block stuff, the algorithm figures out that you don’t want to see any more AI-generated videos of houses/cabins.
Of course, I don’t know if this correctly answers the question or not.
Well, it tells me that different people get extremely different suggestions from YouTube. The only stills I see are from Minecraft YouTubers that I follow. And they are rare enough that they often confuse me, as I try to click on them to make them play.
I’ve seen images lately of “survival shelters” which are hilariously overdone AI images depicting a suburban-style home with essentially a midrise building buried below it. Each one then has comments (maybe bots, maybe just idiots) saying “Wow, this is amazing! How much does it cost to build?” There are a million obvious reasons why it’s not a real thing but…
I’ve also seen numerous AI generated images of either “reclaimed ruins” or “natural wonder” with some overgrown shopping mall that looks like it should be filled with Bandar-log from The Jungle Book or a “hidden lake” surrounded by massive crystal structures.
I don’t follow or engage with any AI groups on Facebook, but I’m not surprised to see these. They’re super low effort to make: just prompt “Ruined high school overgrown by nature” and post it. Making fake YouTube videos is also fairly easy if you look up how to do it and use automation tools, but still images are basically effortless, especially if you don’t care much about the quality.
Edit: Now that I think about it, there ARE videos also made from this sort of low-effort image generation. Basically “Top ten places you WON’T BELIEVE!”, where some robot voice describes the wonder of these imaginary places over the posted still images. Or maybe they do some five-second ‘picture to video’ animation of them. They also have them for things like imaginary concept cars that automakers are totally-for-real working on. I don’t get these myself, but I’ve seen other YouTube videos complaining about them.
I think that’s true of the long-form content. For short-form, it’s not necessarily about the quality of the content - it can be just as profitable to have people scrolling continually in hope that the next thing will be interesting. And apparently YT Shorts has overtaken TikTok now.
Just went through my YouTube feed. I do not follow or subscribe to any channel. My videos are primarily Rasslin, music, animal videos and comedians. Only one video (boxing) looked like it was AI.
What really irritates me is that my Google News feed will, once or twice a week, post a Stewartville Star article. The Stewartville Star is an AI news site that seems to be scraping UK media for content.
My LOL-est YT AI content recently was one of those “history” clips that feed you a lot of processed stock footage and a bog-standard text-to-voice narrating a script that says nothing new. This one was about the Spanish Civil War and its aftermath, where all the war images were obvious AI reprocesses of general stock war pictures. The kicker was that when they referred to Franco, someone for whom we have plenty of photographs from life, pre-WW2, for some reason the AI plugged in some Generic Dictator Image, while OTOH accurately picturing ol’ Francisco in the postwar segment. I was going like, I’ve seen slop, but this is sloppy slop. They did not even match the parts of the video to one another.
(BTW the legit history 'tubers are just livid at AI slop. They have to think before posting because YT’s censorbots may dock them for showing real pics of things that really happened or quoting things that were really said, but then any yutz can just put up slop that doesn’t even look like anything period-related.)
A vast number of ads on YouTube break YouTube’s/Google’s supposed guidelines, so I am not expecting them to start policing AI slop within video content itself very well.
Genuinely, just before typing this, I watched an advert where a deepfake Keir Starmer (UK prime minister) told me how he invested just £200 and it now passively earns him £20,000 a month. He also said the government would match whatever profits I make. I’d be mad not to invest!
I’ve been seeing these deepfake ads for months now; they don’t care.
I’d noticed a heavy uptick in two kinds of YT AI slop in my own feed:
AI retellings of Star Wars and other SF stories backed with AI still/short-clip images in an old movie style - I wanna say Panavision? But the women in the thumbnail images and mouseover shorts are all hilariously, exaggeratedly, sexual. Like, fetish-level stuff.
AI readings of “Humanity, Fuck Yeah” fiction from Reddit, backed with AI still images illustrating the stories.
Clicked on a couple of the latter to see what they were on about, never clicked on the former, but blocked them both now.
Yeah, I think some channels have the entire process automated.
I sometimes stumble across videos where it’s a collection of movie clips with the most bland descriptive voiceover like ‘the man points the gun and shouts words at the woman with the red hat’.
I had not been noticing AI fakes in my YT feed (other than videos with fake AI voiceovers). But after reading this thread yesterday, this video shows up in my feed today. That is one extreme ride, alright!
Unfortunately, just by opening and reading this thread I think you’ve opened yourself up to it, just like I did.
I don’t know how many times I’ve read a SDMB thread on a particular topic, only to immediately get YT videos in my feed directly related to that topic.
It’s actually almost charming in its absolute ridiculousness-- a very awkward-looking crane-like ride attached to a very flimsy-looking deck hanging off the side of a canyon very predictably breaks off, because absolutely no one would have thought it could possibly be supported that way, not even if the Three Stooges had been hired as ride engineers, and Charlie from ‘Always Sunny’ had been the inspector; then it plummets down a very deep canyon, to screams of children that sound more like normal amusement park ride screams than those of sheer terror.
I just tried that experiment, opening another tab of this browser session to YT & scrolling around the home page. No sign of anything other than the usual BS. Yet.
I don’t subscribe to anything and my search history is rather wide, not deep.
I wonder if the clip was cut short before the thing hit the water by the length limit of whatever tool they used to generate this, or if they cut it because the water boiled and spurted when the ride hit, as water splashes tend to do in AI video.