Over a year ago we asked an open question: can online news aggregators inadvertently create a biased thread of news? At the time, without data, we could only speculate.
Well, we've got some data. For now, let's leave out traditional papers like the Wall Street Journal, cable news like Fox or CNN, and even outlets like the Huffington Post and the Weekly Standard: we already know a lot of folks tend to read the news that confirms their opinions, but most of them at least know the news they're getting has a particular slant. So at least the brain's somewhat engaged.
Social media has made this more sinister. As we discussed in Wedged, we're seeing folks "un-friend" people they disagree with. We can also surmise that people are more likely to share news that represents what they already think, whether to "convince others" or, as likely, to paste on themselves badges of tribal identity. The hypothesis here is that you're going to be exposed, by your friends, to heavily biased news.
The Wall Street Journal finally gives us some data with a live study. They seeded two feeds, one "very conservative" and one "very liberal," to see who shared them and what people's personal feeds would look like. It's very much worth checking out to see what kinds of "red" and "blue" style news stand side by side on issues like transgender people in bathrooms, guns, Trump, and Clinton.
What's particularly insidious about this is that Facebook wants you to click stuff (particularly paid stuff), "like" it, share it, or otherwise react to it. And boy, do they have some good data and algorithms for this. So this means that not only are you probably already in a bubble where friends are less likely to share stuff you disagree with, but even when they do, your lack of positive reaction in the past means you're even less likely to see it--Facebook is out to make money, not be your parent.
What's tough about this is that our friends have natural social credibility. When we see stuff shared by them (whether or not they actually read it and thought about it), we have a natural tendency to trust them, rather than a talking head we don't know or a disembodied newspaper. It's really hard not to dig deeper into the echo-chamber, and we are likely to grow ever more detached from any perspectives that challenge ours.
And echo-chambers are dangerous for democracies, for two very distinct reasons. First, they mean that we don't get exposed to new ideas, so our brains turn off and we become mindless--and overly confident--automatons. We become easily manipulated. But second, and perhaps even more importantly, we become so detached from others' points of view that they become inscrutable and alien. Imagine trying to have a conversation with someone from ISIS: it's too horrid to even contemplate, because we find their value system and morality absolutely repugnant (and of course, they probably find ours similarly so). But if we have no eye into the thinking of other political leanings in the US, those folks become repugnant to us, as well.
The 2016 Election
I've noticed--since I live in Cambridge--lots of vitriol between Clinton and Sanders supporters. The Democrats are at nearly as much risk of tearing themselves apart as the Republicans seem to be. I imagine--and have some anecdotal evidence--that Sanders and Clinton supporters each get bombarded by news that props up their candidate as a saint and bashes the other as a maniac or corrupt monster. They see scathing attacks on the other candidate over and over, and probably very little that's good or even reasonable about them. Far from simply the Republicans and Democrats hating on each other, I think we're seeing internal factions grow in power as our echo-chamber slices grow thinner and we burrow deeper. No longer do we think we mostly have stuff in common, even with our own party: their thoughts are inscrutable and horrifying, and we can't see past that.