Take it from the president of the United States: Human beings are creatures of comfort. We're not particularly inclined to seek out contradictory information and would rather believe things that reinforce our worldview.
Sometimes we post those things at 6 a.m. to make a point to our foes, even if our claims are fact-free.
Most of the time, we pay a price for that willful ignorance, but researchers are on the brink of learning just how destructive those impulses can be when combined with the scale and power of Facebook.
"People fall into groups where they reinforce their views and ignore dissenting views."
We already know that people create echo chambers on the social media platform. A new study on how people consume news on Facebook shows how widespread those filter bubbles have become — and suggests that remedies for the spread of so-called fake news, like fact-checking and blacklisting offenders, won't come close to fixing the problem.
So blame the media and Facebook all you like for our dysfunctional politics and the collapse of bipartisanship, but the most formidable enemy appears to be human psychology.
The study, published in Proceedings of the National Academy of Sciences, looked at the news consumption patterns of 376 million users over a six-year period from January 2010 through December 2015. The researchers analyzed how those users engaged with 920 English-language local and global news sources, including outlets like the New York Times, Guardian, Huffington Post, Daily Caller and Associated Press. (The sample was based on a list compiled by the European Media Monitor; it included government agency and nonprofit sites but had notable omissions, such as Breitbart, Buzzfeed, and yes, Mashable.)
Even with hundreds of news sources at their fingertips, users typically engaged with just a handful of outlets by liking their pages and commenting on their posts. The researchers found that the more active people are on Facebook, the more they consume news in clusters, essentially walling themselves off with a single community of news outlets.
"People fall into groups where they reinforce their views and ignore dissenting views," says Walter Quattrociocchi, a principal investigator of the research and head of the Laboratory of Computational Social Science at IMT Lucca in Italy.
Quattrociocchi and his study co-authors did not try to gauge the politics of either the news outlets or the readers. Instead, they argue that polarization dominates news consumption on Facebook because highly active users engage with a limited number of outlets. This dynamic, the researchers say, probably plays a significant role in the way misinformation spreads, though they did not specifically track it in this study.
Basically, Quattrociocchi says, users are searching for narratives, exhibiting what's known in psychology as selective exposure: They tend to favor information that fortifies their pre-existing views and avoid outlets that might challenge those beliefs.
"You look for the ideas [you agree with] and you refuse any kind of contact with something else."
People may have done much the same in far simpler times by sticking to a favorite evening broadcast and the local hometown paper. The difference is that they didn't have access to countless publishers lacking journalistic credibility, or the power to instantly circulate questionable claims among dozens or hundreds of family and friends.
Plus, 20 years ago, news outlets didn't have to compete against, for example, videos of kittens or vacation selfies, a dynamic the researchers note has created new incentives for publishers to present information in conversational or emotional terms.
Facebook, Quattrociocchi says, has "destroyed the architecture of the media," and trying to stem the spread of so-called fake news is next to impossible because of how motivated people are to feel they're right about the state of the world.
"Most of the proposals are related to debunking and fact checking," says Quattrociocchi. "But the problem behind fake news and misinformation is the polarization of users. You look for the ideas [you agree with] and you refuse any kind of contact with something else."
Facebook seems to understand this problem, and has arguably contributed to it by employing algorithms that rarely show users links to news sources they might find disagreeable. Indeed, the more comfortable you feel on Facebook, the more likely you are to return. There's little incentive for the company to make people squirm by positioning provocative news outlets high in their feeds on every visit.
"People will not exit the echo chamber. The problem is how we enter."
Yet, even Mark Zuckerberg, Facebook's CEO, gets that echo chambers on the platform wreak havoc on public discourse. In a recent post, his vision of an "informed community" on Facebook included tamping down on sensationalism and polarization.
The company, for instance, has noticed that some people share content with sensational headlines without ever actually reading the story. The algorithm, it appears, will now take "signals" like that into account to both reduce "sensationalism" in people's news feeds and identify the publishers responsible for that content. Facebook also recently rolled out tools to help users identify and dispute fake news items.
Quattrociocchi, however, isn't optimistic that fact-checking and debunking mechanisms will make a difference when it comes to believing and sharing conspiracy theories and falsehoods.
"People will not exit the echo chamber," he says. "The problem is how we enter."
Though third-party apps and browser extensions designed to burst filter bubbles have cropped up since the election, they require a lot of work from people who are mostly content inside their clusters. Quattrociocchi says that simply being aware of your self-made Facebook echo chamber is an important first step, but he believes it's up to companies and researchers to better understand people's news consumption habits and reach them where they are.
Quattrociocchi is currently exploring the cognitive and psychological traits that drive people to consume certain news narratives. One possibility, he says, is that the impersonal nature of online news sharing lowers personal accountability, making it easier to post and believe fake news. He also wants to test whether social media platforms create a kind of narcissistic distortion, influencing how people present themselves online.
Regardless of what he finds, the message is becoming clearer: The problem with fake news has a lot more to do with our own psychology than we'd like to admit.