Echo Chambers vs Epistemic Bubbles

From P2P Foundation

Discussion

C Thi Nguyen / Sam Dresser:

"Something has gone wrong with the flow of information. It’s not just that different people are drawing subtly different conclusions from the same evidence. It seems like different intellectual communities no longer share basic foundational beliefs. Maybe nobody cares about the truth anymore, as some have started to worry. Maybe political allegiance has replaced basic reasoning skills. Maybe we’ve all become trapped in echo chambers of our own making – wrapping ourselves in an intellectually impenetrable layer of likeminded friends and web pages and social media feeds.

But there are two very different phenomena at play here, each of which subverts the flow of information in very distinct ways. Let’s call them echo chambers and epistemic bubbles. Both are social structures that systematically exclude sources of information. Both exaggerate their members’ confidence in their beliefs. But they work in entirely different ways, and they require very different modes of intervention. An epistemic bubble is when you don’t hear people from the other side. An echo chamber is what happens when you don’t trust people from the other side.

Current usage has blurred this crucial distinction, so let me introduce a somewhat artificial taxonomy. An ‘epistemic bubble’ is an informational network from which relevant voices have been excluded by omission. That omission might be purposeful: we might be selectively avoiding contact with contrary views because, say, they make us uncomfortable. As social scientists tell us, we like to engage in selective exposure, seeking out information that confirms our own worldview. But that omission can also be entirely inadvertent. Even if we’re not actively trying to avoid disagreement, our Facebook friends tend to share our views and interests. When we take networks built for social reasons and start using them as our information feeds, we tend to miss out on contrary views and run into exaggerated degrees of agreement.

An ‘echo chamber’ is a social structure from which other relevant voices have been actively discredited. Where an epistemic bubble merely omits contrary views, an echo chamber brings its members to actively distrust outsiders. In their book Echo Chamber: Rush Limbaugh and the Conservative Media Establishment (2008), Kathleen Hall Jamieson and Joseph Cappella offer a groundbreaking analysis of the phenomenon. For them, an echo chamber is something like a cult. A cult isolates its members by actively alienating them from any outside sources. Those outside are actively labelled as malignant and untrustworthy. A cult member’s trust is narrowed, aimed with laser-like focus on certain insider voices.

In epistemic bubbles, other voices are not heard; in echo chambers, other voices are actively undermined. The way to break an echo chamber is not to wave “the facts” in the faces of its members. It is to attack the echo chamber at its root and repair that broken trust.

Let’s start with epistemic bubbles. They have been in the limelight lately, most famously in Eli Pariser’s The Filter Bubble (2011) and Cass Sunstein’s #Republic: Divided Democracy in the Age of Social Media (2017). The general gist: we get much of our news from Facebook feeds and similar sorts of social media. Our Facebook feed consists mostly of our friends and colleagues, the majority of whom share our own political and cultural views. We visit our favourite like-minded blogs and websites. At the same time, various algorithms behind the scenes, such as those inside Google search, invisibly personalise our searches, making it more likely that we’ll see only what we want to see. These processes all impose filters on information.

Such filters aren’t necessarily bad. The world is overstuffed with information, and one can’t sort through it all by oneself: filters need to be outsourced. That’s why we all depend on extended social networks to deliver us knowledge. But any such informational network needs the right sort of broadness and variety to work. A social network composed entirely of incredibly smart, obsessive opera fans would deliver all the information I could want about the opera scene, but it would fail to clue me in to the fact that, say, my country had been infested by a rising tide of neo-Nazis. Each individual person in my network might be superbly reliable about her particular informational patch but, as an aggregate structure, my network lacks what Sanford Goldberg in his book Relying on Others (2010) calls ‘coverage-reliability’. It doesn’t deliver to me a sufficiently broad and representative coverage of all the relevant information.

Epistemic bubbles also threaten us with a second danger: excessive self-confidence. In a bubble, we will encounter exaggerated amounts of agreement and suppressed levels of disagreement. We’re vulnerable because, in general, we actually have very good reason to pay attention to whether other people agree or disagree with us. Looking to others for corroboration is a basic method for checking whether one has reasoned well or badly. This is why we might do our homework in study groups, and have different laboratories repeat experiments. But not all forms of corroboration are meaningful. Ludwig Wittgenstein says: imagine looking through a stack of identical newspapers and treating each next newspaper headline as yet another reason to increase your confidence. This is obviously a mistake. The fact that The New York Times reports something is a reason to believe it, but any extra copies of The New York Times that you encounter shouldn’t add any extra evidence.

But outright copies aren’t the only problem here. Suppose that I believe that the Paleo diet is the greatest diet of all time. I assemble a Facebook group called ‘Great Health Facts!’ and fill it only with people who already believe that Paleo is the best diet. The fact that everybody in that group agrees with me about Paleo shouldn’t increase my confidence level one bit. They’re not mere copies – they actually might have reached their conclusions independently – but their agreement can be entirely explained by my method of selection. The group’s unanimity is simply an echo of my selection criterion. It’s easy to forget how carefully pre-screened the members are, how epistemically groomed social media circles might be.

Luckily, though, epistemic bubbles are easily shattered. We can pop an epistemic bubble simply by exposing its members to the information and arguments that they’ve missed. But echo chambers are a far more pernicious and robust phenomenon.

Jamieson and Cappella’s book is the first empirical study into how echo chambers function. In their analysis, echo chambers work by systematically alienating their members from all outside epistemic sources. Their research centres on Rush Limbaugh, a wildly successful conservative firebrand in the United States, along with Fox News and related media. Limbaugh uses methods to actively transfigure whom his listeners trust. His constant attacks on the ‘mainstream media’ are attempts to discredit all other sources of knowledge. He systematically undermines the integrity of anybody who expresses any kind of contrary view. And outsiders are not simply mistaken – they are malicious, manipulative and actively working to destroy Limbaugh and his followers. The resulting worldview is one of deeply opposed forces, an all-or-nothing war between good and evil. Anybody who isn’t a fellow Limbaugh follower is clearly opposed to the side of right, and therefore utterly untrustworthy.

The result is a rather striking parallel to the techniques of emotional isolation typically practised in cult indoctrination. According to mental-health specialists in cult recovery, including Margaret Singer, Michael Langone and Robert Lifton, cult indoctrination involves new cult members being brought to distrust all non-cult members. This provides a social buffer against any attempts to extract the indoctrinated person from the cult.

The echo chamber doesn’t need any bad connectivity to function. Limbaugh’s followers have full access to outside sources of information. According to Jamieson and Cappella’s data, Limbaugh’s followers regularly read – but do not accept – mainstream and liberal news sources. They are isolated, not by selective exposure, but by changes in whom they accept as authorities, experts and trusted sources. They hear, but dismiss, outside voices. Their worldview can survive exposure to those outside voices because their belief system has prepared them for such intellectual onslaught.

In fact, exposure to contrary views could actually reinforce their views. Limbaugh might offer his followers a conspiracy theory: anybody who criticises him is doing it at the behest of a secret cabal of evil elites, which has already seized control of the mainstream media. His followers are now protected against simple exposure to contrary evidence. In fact, the more they find that the mainstream media calls out Limbaugh for inaccuracy, the more Limbaugh’s predictions will be confirmed. Perversely, exposure to outsiders with contrary views can thus increase echo-chamber members’ confidence in their insider sources, and hence their attachment to their worldview. The philosopher Endre Begby calls this effect ‘evidential pre-emption’. What’s happening is a kind of intellectual judo, in which the power and enthusiasm of contrary voices are turned against those contrary voices through a carefully rigged internal structure of belief.

One might be tempted to think that the solution is just more intellectual autonomy. Echo chambers arise because we trust others too much, so the solution is to start thinking for ourselves. But that kind of radical intellectual autonomy is a pipe dream. If the philosophical study of knowledge has taught us anything in the past half-century, it is that we are irredeemably dependent on each other in almost every domain of knowledge. Think about how we trust others in every aspect of our daily lives. Driving a car depends on trusting the work of engineers and mechanics; taking medicine depends on trusting the decisions of doctors, chemists and biologists. Even the experts depend on vast networks of other experts. A climate scientist analysing core samples depends on the lab technician who runs the air-extraction machine, the engineers who made all those machines, the statisticians who developed the underlying methodology, and on and on.

As Elijah Millgram argues in The Great Endarkenment (2015), modern knowledge depends on trusting long chains of experts. And no single person is in the position to check up on the reliability of every member of that chain. Ask yourself: could you tell a good statistician from an incompetent one? A good biologist from a bad one? A good nuclear engineer, or radiologist, or macro-economist, from a bad one? Any particular reader might, of course, be able to answer positively to one or two such questions, but nobody can really assess such a long chain for herself. Instead, we depend on a vastly complicated social structure of trust. We must trust each other, but, as the philosopher Annette Baier says, that trust makes us vulnerable. Echo chambers operate as a kind of social parasite on that vulnerability, taking advantage of our epistemic condition and social dependency.

Most of the examples I’ve given so far, following Jamieson and Cappella, focus on the conservative media echo chamber. But nothing says that this is the only echo chamber out there; I am quite confident that there are plenty of echo chambers on the political Left. More importantly, nothing about echo chambers restricts them to the arena of politics. The world of anti-vaccination is clearly an echo chamber, and it is one that crosses political lines. I’ve also encountered echo chambers on topics as broad as diet (Paleo!), exercise technique (CrossFit!), breastfeeding, some academic intellectual traditions, and many, many more. Here’s a basic check: does a community’s belief system actively undermine the trustworthiness of any outsiders who don’t subscribe to its central dogmas? Then it’s probably an echo chamber.

Unfortunately, much of the recent analysis has lumped epistemic bubbles together with echo chambers into a single, unified phenomenon. But it is absolutely crucial to distinguish between the two. Epistemic bubbles are rather ramshackle; they go up easily, and they collapse easily, too. Echo chambers are far more pernicious and far more robust. They can start to seem almost like living things. Their belief systems provide structural integrity, resilience and active responses to outside attacks. Surely a community can be both at once, but the two phenomena can also exist independently. And of the events we’re most worried about, it’s the echo-chamber effects that are really causing most of the trouble." (https://aeon.co/amp/essays/why-its-as-hard-to-escape-an-echo-chamber-as-it-is-to-flee-a-cult?)