WHY does misinformation spread so quickly on social media? Why doesn’t it get corrected? When the truth is so easy to find, why do people accept falsehoods?

A new study focusing on Facebook users provides strong evidence that the explanation is confirmation bias: people’s tendency to seek out information that confirms their beliefs, and to ignore contrary information.

Confirmation bias turns out to play a pivotal role in the creation of online echo chambers. This finding bears on a wide range of issues, including the current presidential campaign, the acceptance of conspiracy theories and competing positions in international disputes.

The new study, led by Michela Del Vicario of Italy’s Laboratory of Computational Social Science, explores the behaviour of Facebook users from 2010 to 2014. One of the study’s goals was to test a question that continues to be sharply disputed: when people are online, do they encounter opposing views, or do they create the virtual equivalent of gated communities?

Del Vicario and her co-authors explored how Facebook users spread conspiracy theories (using 32 public web pages); science news (using 35 such pages); and “trolls,” which intentionally spread false information (using two web pages).

Their data set is massive: it covers all Facebook posts during the five-year period. They explored which Facebook users linked to one or more of the 69 web pages, and whether they learned about those links from their Facebook friends.

In sum, the researchers find a lot of communities of like-minded people. Even if they are baseless, conspiracy theories spread rapidly within such communities.

More generally, Facebook users tended to choose and share stories containing messages they accept, and to neglect those they reject. If a story fits with what people already believe, they are far more likely to be interested in it and thus to spread it.

As Del Vicario and her co-authors put it, “users mostly tend to select and share content according to a specific narrative and to ignore the rest”. On Facebook, the result is the formation of a lot of “homogeneous, polarised clusters”. Within those clusters, new information moves quickly among friends (often in just a few hours).

The consequence is the “proliferation of biased narratives fomented by unsubstantiated rumours, mistrust, and paranoia”. And while the study focuses on Facebook users, there is little doubt that something similar happens on other social media, such as Twitter — and in the real world as well.

Striking though their findings are, Del Vicario and her co-authors do not mention the important phenomenon of “group polarisation”: when like-minded people speak with one another, they tend to end up believing a more extreme version of what they originally thought. Whenever people spread misinformation within homogeneous clusters, they also intensify one another’s commitment to that misinformation.

Of the various explanations for group polarisation, the most relevant involves a potentially insidious effect of confirmation itself. Once people discover that others agree with them, they become more confident — and then more extreme. In that sense, confirmation bias is self-reinforcing, producing a vicious spiral. If people begin with a certain belief, and find information that confirms it, they will intensify their commitment to that very belief, thus strengthening their bias.

Suppose, for example, that you think an increase in the minimum wage is a sensational idea, that the nuclear deal with Iran is a mistake, that Obamacare is working well, that Donald Trump would be a fine president, or that the problem of climate change is greatly overstated. Arriving at these judgements on your own, you might well hold them tentatively and with a fair degree of humility. But after you learn that a lot of people agree with you, you are likely to end up with much greater certainty — and perhaps real disdain for people who do not see things as you do.

On the basis of all the clustering, that almost certainly happened on Facebook. Strong support for this conclusion comes from research from the same academic team, which finds that on Facebook, efforts to debunk false beliefs are typically ignored — and when people pay attention to them, they often strengthen their commitment to the debunked beliefs.

Can anything be done? The best solution is to promote a culture of humility and openness. Some people, and some communities, hold their own views tentatively; they are interested in refutation, not just confirmation. Moreover, those who manage social media (such as Google) can take steps to allow people to assess the trustworthiness of what they are seeing, though these efforts might be controversial and remain in a preliminary state.

In the midst of World War II, a great federal judge, Learned Hand, said that the spirit of liberty is “that spirit which is not too sure that it is right”. Users of social media are certainly exercising their liberty. But there is a real risk that when they fall prey to confirmation bias, they end up compromising liberty’s spirit — and dead wrong to boot.

Cass Sunstein, a Bloomberg View columnist, is director of the Harvard Law School’s programme on behavioral economics and public policy.

—By arrangement with Bloomberg-The Washington Post

Published in Dawn, January 10th, 2016
