SOCIAL media has become ubiquitous in modern life — and so, too, has its propensity to be misused to spread false information with serious real-life consequences. The largest of such platforms, Facebook, has been undergoing a reckoning of sorts over a series of scandals, including revelations of privacy breaches, proliferation of hate speech and election manipulation, most notoriously of the 2016 US presidential election. In Sri Lanka, Myanmar and India, rumours spread largely through Facebook and its subsidiary, WhatsApp, have led to attacks on ethnic and religious minority groups. In a whirlwind campaign in recent months, the company has announced new checks and balances, specifically to protect elections — including Pakistan’s — from interference. In the run-up to July 25, it is now also publishing advertisements in leading local newspapers, including Dawn, informing the public of how to identify false information in Facebook posts and forwarded WhatsApp messages.

That Facebook is attempting to quell the epidemic of misinformation on its platform is, on the surface, a positive move. And the fact that it is specifically working to educate the public on how to discern the credibility of information on social media may, hopefully, have some impact. But, beyond image rehabilitation for the social media giant itself, how effective will such measures be? Fundamentally, Facebook profits from the commodification of its users’ data, and it has a track record of cooperating with state surveillance and censorship across the globe. How will it ensure, for example, that the systems it has introduced to curb fake content cannot be ‘gamed’ by vested interests to their advantage? Who gets to decide what is or isn’t ‘fake’? And what guarantee is there that it won’t be party to mass state censorship in order to avoid being banned in any more countries? These are all questions that Facebook has yet to adequately address. Even if such measures could be made tamper-proof, there is no realistic way to check the millions of fake posts being spread through private channels such as Facebook Messenger, closed or secret Facebook groups, or WhatsApp messages. No amount of digital literacy outreach will quell users’ inherent bias towards information that validates their worldview. In that way, the medium is very much the message. These problems are so systemically entrenched in such platforms that only a complete reassessment of how they operate may make a true difference.

Published in Dawn, July 20th, 2018
