Do echo chambers actually exist on social media? By focusing on how both Italian and US Facebook users relate to two distinct narratives, social scientists Walter Quattrociocchi, Antonio Scala and Cass Sunstein found quantitative evidence of how users tend to promote their favorite narratives, form polarized groups and resist information that doesn’t conform to their beliefs.
In a draft paper called “Echo Chambers on Facebook,” the researchers examined how Facebook users interacted with two narratives involving conspiracy theories and science. Users belonging to different communities tended not to interact with one another and tended to be connected only with “like-minded” friends, creating closed, non-interacting communities centered around different narratives — what the researchers called “echo chambers.” Confirmation bias accounted for users’ decisions to share certain content, creating informational cascades within their communities.
Users tended to seek out information that strengthened their preferred narratives and to reject information that undermined them. Alarmingly, when deliberately false information was introduced into these echo chambers, it was absorbed and viewed as credible as long as it conformed to the primary narrative. And even when more truthful information was introduced to correct or “debunk” falsehoods, it was either ignored or it reinforced the users’ false beliefs.
While the findings are cause for concern, they don’t come as much of a surprise — confirmation bias is nothing new, and conspiracy theories have become an increasingly visible part of our political discussion. The question is whether there is anything a responsible media can or should do differently to make it easier for facts to penetrate these echo chambers, and whether news organizations are willing to make the necessary changes.
Confirmation bias doesn’t begin to describe what Facebook offers partisans in both directions: a limitless, on-demand narrative fix, occasionally punctuated by articles grounded in actual world events, when those suit their preferences. But it was the Trump camp more than its opponent that encouraged this social media story time. Trump propagated conspiracy theories plucked from Facebook, and Reddit, and Twitter, and 4chan, and … but he also planted what would become the next social media story.
The science-fiction writer William Gibson, one of the few “futurists” who actually gets the future, reads Trump’s lies in strictly Machiavellian terms. “Trump tells misinformed people what he knows they already believe, and thus ‘he speaks the truth,’” Gibson tweeted.
The truth, then, according to Gibson—or at least, Trump’s truth—is that the real estate mogul lies to confirm a particular truth to his political constituency. So the scheming Trump can both hate and love Hispanics, Gibson suggests. Simultaneously.
And after a Trump supporter logged into Facebook, they were likely greeted by a cascade of contextless, often deliberately falsified Facebook “news” stories, the sort detailed by John Herrman in a New York Times report from August. On Facebook, a meme-ified image promising new details of Hillary Clinton’s brain disease would appear alongside an advertisement, a Wall Street Journal investigation, a video game trailer, a baby picture.
Whether or not Facebook is directly culpable, this much can’t be overstated: the combination of a media-literacy nadir and an unstoppable firehose of untrue media gave Donald Trump the ability to say virtually anything during a presidential election, without consequence. There’s no reason to believe this won’t continue to happen in every election hereafter, to say nothing of the rest of the world, where Facebook is desperate to plant roots.