Internet shows us what it thinks we want to see

A recent article describes how Facebook and Google “help” us by showing us what we “want” to see.

Obviously this exacerbates the bipolarization problem in American politics.

I think that this is the exact problem with the internet. It has quickly become, instead of a Whole General Mish-Mash of information and misinformation, an inescapably marketed and targeted collection of information and misinformation.

Just one more example of how ease-of-use (or, more accurately, speed-of-use) gets treated as synonymous with ‘unquestionably better,’ and blinds us to the fact that monied interests make millions by removing options and choice from our daily lives.

And get off my damned lawn, you whipper-snappers!

You mean just “polarization”.

In any case, I disagree completely. Anybody can read any website they want. NY Times readers can go to the National Review site and vice versa. You can’t really blame a search engine if you don’t. My exposure to the other side has increased dramatically.

What the Internet has done is expose us more to the other side in its raw state, not as filtered by the MSM (not to mention that we now know how much the MSM does filter the news). In other words, the Internet doesn’t create the polarization: it exposes it.

That’s a good thing in the long run. That is, unless you liked the MSM’s filtering.

Well, take this to its logical conclusion and you have a Web that basically knows what you and your friends like and limits your exposure to only that.

How would you even find out about other sources? How would I even know that the sources I’m accustomed to are biased, and that I need to look for other sources at all, if everyone I know and I are going to the same small cabal of sources?

Simple availability isn’t enough; content must be available and popular to have mainstream impact – this is why websites live and die by their Google PageRank.
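(Aside: PageRank is, at its core, the stationary distribution of a “random surfer” walking the link graph. Here’s a minimal power-iteration sketch in Python; the toy graph, damping factor, and iteration count are illustrative assumptions on my part, nothing like Google’s actual production system.)

```python
# Minimal PageRank via power iteration (toy sketch, not Google's real system).
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}            # start from a uniform guess
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                       # dangling page: spread evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# 'a' is linked to by everyone else, so it ends up with the highest rank:
print(pagerank({'a': ['b'], 'b': ['a'], 'c': ['a'], 'd': ['a', 'b']}))
```

The point being: rank flows toward already-popular pages, so popularity compounds.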

And this isn’t happening 20 years from now. It’s already here. My Facebook is limited to updates from the few friends I’ve had lengthy interactions with and feel-good updates from organizations whose posts I’ve offhandedly “Liked” in the past. My Google searches are getting more and more relevant, but also more and more limited and localized, since they implemented a feature that returns results favored by others in my geographic area.

And National Review? I’ve honestly never heard of it before your post. Google News usually leans towards the same few repeat offenders.

It’s not really “blaming search engines”; it’s recognizing a part of human nature that tends to trust what it’s familiar with and what it already knows it likes, even at the expense of the truth. I don’t have cites handy, but there has been psychological research pointing out those effects. Personalized content delivery only amplifies that flaw and makes us even less likely to see the bigger picture, even when it’s necessary.
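To make that amplification concrete, here’s a toy simulation in Python. The click model and numbers are assumptions of mine, not drawn from any study: the feed shows sources in proportion to your past clicks, and clicking reinforces what was shown.

```python
import random

# Toy filter-bubble loop (assumed click model, purely illustrative):
# the feed mirrors your click history, and each click feeds back into it.
random.seed(1)
sources = ['left', 'center', 'right']
clicks = {s: 1 for s in sources}                     # start out indifferent

for _ in range(1000):
    total = sum(clicks.values())
    weights = [clicks[s] / total for s in sources]   # feed mirrors history
    shown = random.choices(sources, weights=weights)[0]
    clicks[shown] += 1                               # clicking reinforces it

total = sum(clicks.values())
for s in sources:
    print(f'{s}: {clicks[s] / total:.0%} of the feed')
```

In a typical run the shares drift away from an even split and then lock in: early, essentially random clicks get frozen into a lasting skew. The loop never deliberately hides anything; it just stops surfacing what you didn’t already engage with.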

On the other hand, I’m not sure that making the algorithms more objective is going to help much. Even if you present people with contrary information, without some sort of way in or a good amount of relevance, they’re just going to ignore it/skim it and forget it/think it’s a conspiracy/dismiss it altogether.

This only tackles one part of the problem (making sure contrary opinions are available), but the second part – the harder part – is getting those contrary opinions past people’s cognitive firewalls.

The internet is exacerbating this, but it’s a trend that reaches back into the late ’60s/early ’70s. Bill Bishop has written an excellent book examining this issue, titled The Big Sort.

Long story short, the increased economic mobility of Americans allowed them to cluster into like-minded communities. One interesting tidbit: because liberals are more likely to move than conservatives, liberals are actually demographically more homogeneous in their information diets.