Is YouTube’s algorithm biased toward conspiracy theories?

A while back I was trying to find some historical info on the ancient site of Puma Punku. Probably 95% of the search results came back with some wacky conspiracy theory (aliens, melting rocks, etc.).

Very frustrating. But lately these bizarro results show up in many of my searches. Is it me, or is it the algorithm?

And if it is the algorithm, is it deliberate?

I’ll bring up a third option: the content uploaded to YouTube is biased toward the wootastic and whackadoodle. The algorithm may simply not be biased to try to exclude it.

I have seen articles suggesting the YouTube algorithm does tend toward extremes. If you search in a certain area, and then come back to it later, it will feed you more extreme versions of that content, such as conspiracy theories.

Note that this isn’t a left or right bias. The article said it tended toward the more extreme leftist side if you searched routine Democratic topics, and toward the more extreme rightist side if you started searching Republican topics.

One theory is that the algorithm is biased to keep people searching, and more extravagant results are more likely to catch your attention and keep you coming back.

So…YouTube is porn?

YouTube seems to enjoy spamming me with adverts for Grammarly. They’re wasting their clients’ money.

It’s statistical, showing you results based on what patterns other people have followed in the past. All roads lead away from Rome, while they also lead to Rome.

YouTube is showing you a schizotypal road because a lot of people have taken that road before. “A lot” here is relative, but catering to wackos makes financial sense when the content is narrowcast but the advertising is broadcast.
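The pattern-following idea above can be sketched in a few lines. This is a toy illustration, not YouTube’s actual system: the video data, click rates, and `engagement_score` function are all made up. The point is that a ranker scoring only on what held past viewers’ attention has no notion of accuracy, so sensational content floats up.

```python
# Hypothetical sketch of engagement-based ranking: score each video by how
# often past viewers of the same query clicked it and how long they watched.
videos = [
    # (title, past_click_rate, avg_watch_fraction) -- invented numbers
    ("Puma Punku: the archaeology", 0.04, 0.35),
    ("Puma Punku BUILT BY ALIENS?!", 0.12, 0.60),
    ("Tiwanaku site walkthrough", 0.03, 0.40),
]

def engagement_score(click_rate, watch_fraction):
    # Purely behavioral: rewards attention-grabbing content, not accuracy.
    return click_rate * watch_fraction

ranked = sorted(videos, key=lambda v: engagement_score(v[1], v[2]), reverse=True)
for title, _, _ in ranked:
    print(title)
```

With these invented numbers, the alien video scores 0.072 against 0.014 for the archaeology video, so it tops the list even though far more people scrolled past it than clicked it.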

Wait, so you suppose that there is some nefarious intent behind YouTube’s search algorithm to favor conspiracy theory content? That’s… kinda meta.