Who removes Kashmir posts on Facebook?

Published July 29, 2016
Kashmiri protestors clash with Indian police. AFP/File

The debate on Facebook’s controversial moderating policies and processes hit Pakistan this month when actor Hamza Ali Abbasi’s public page was suspended for three days following posts on the Kashmir protests and the killing of militant commander Burhan Wani.

Since then, it has been widely reported how Facebook’s policies have effectively censored thousands of users who expressed their opinions on the decades-old conflict plaguing Kashmir.

In a statement to The Guardian regarding Kashmir, the social-media giant said:

“There is no place on Facebook for content that praises or supports terrorists, terrorist organisations or terrorism. We welcome discussion on these subjects but any terrorist content has to be clearly put in context which condemns these organisations and or their violent activities. Therefore, profiles and content supporting or praising Hizbul Mujahideen and Burhan Wani are removed as soon as they are reported to us. In this instance, some content was removed in error, but this has now been restored.”

It isn't all automated

Content on Facebook is filtered by a combination of human moderation and automated removal.

The automated process only detects and flags content that may need moderation. If posted content falls into a category that is explicitly banned under Facebook’s community standards, it is removed.
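The description above suggests a two-stage pipeline: automation detects and flags, explicitly banned material is removed outright, and everything else waits for a human. The sketch below is purely illustrative; the function names, categories and thresholds are invented here and do not come from Facebook.

```python
# Hypothetical illustration of a flag-then-review pipeline.
# Nothing here reflects Facebook's actual systems; names and thresholds are invented.

BANNED_CATEGORIES = {"child_nudity", "terrorist_propaganda"}  # explicitly banned content

def automated_filter(post, classifier):
    """Stage 1: automation detects and flags; only clear-cut violations of
    explicitly banned categories are removed without a human in the loop."""
    category, confidence = classifier(post)        # e.g. a model's best guess and its score
    if category in BANNED_CATEGORIES and confidence > 0.95:
        return "remove"                            # unambiguous policy violation
    if confidence > 0.50:
        return "flag_for_human_review"             # uncertain: queue for a moderator
    return "keep"

def human_review(post, moderator_decision):
    """Stage 2: a human moderator weighs context, such as whether the post
    condemns the violence it shows or celebrates it."""
    return moderator_decision(post)                # returns "remove" or "keep"
```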

An example is the recent footage of child detainees being tortured in Australia. Facebook stated that the video violated its policy on child nudity by showing the bare back of a minor, which is why it was taken down.

‘Context is important’

When it comes to more abstract categories, such as hate speech, Facebook’s head of policy, Monika Bickert, said context is important.

“We look at how a specific person shared a specific post or word or photo to Facebook.”

TechCrunch, which spoke at length with a Facebook spokesperson after citizen journalism content posted by BuzzFeed and TechCrunch was temporarily removed, summarised Facebook’s policy for Live video ─ and other content ─ as follows:

The policy on graphic content is that Facebook does not allow and will take down content depicting violence if it’s celebrated, glorified or mocks the victim. However, violent content that is graphic or disturbing is not a violation if it’s posted to bring attention to the violence or condemn it. Essentially, if someone posts a graphically violent video saying “this is great, so and so got what was coming to them,” it will be removed, but if they say “This is terrible, these things need to stop,” it can remain visible.
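Read as a rule of thumb (and only as a hypothetical sketch, since Facebook has published no such logic, and the phrase lists below are invented for illustration), the decision hinges on the intent signalled by the accompanying text:

```python
# Hypothetical sketch of the quoted rule; phrase lists are invented for illustration.
CELEBRATORY = ("this is great", "got what was coming", "deserved it")
CONDEMNING = ("this is terrible", "these things need to stop", "this must end")

def graphic_content_decision(caption: str) -> str:
    """Graphic violence is removed if the post celebrates, glorifies or mocks the
    victim, but may remain if it condemns the violence or draws attention to it."""
    text = caption.lower()
    if any(phrase in text for phrase in CELEBRATORY):
        return "remove"
    if any(phrase in text for phrase in CONDEMNING):
        return "keep"
    return "flag_for_human_review"  # ambiguous context is left to a moderator
```

In practice, such judgments are far harder than keyword matching, which is part of why moderation remains so labour intensive.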

Whatever the case, the process of moderation seems to be labour intensive, and Facebook is not always able to keep pace with the massive volume of content that is posted.

Context is also important when it comes to posts related to terrorism, as a Facebook spokesman in India told The Washington Post: discussion is welcome, but condemnation is key.

“Our Community Standards prohibit content that praises or supports terrorists, terrorist organizations or terrorism, and we remove it as soon as we’re made aware of it.”

“We welcome discussion on these subjects but any terrorist content has to be clearly put in a context which condemns these organisations or their violent activities.”

Moderation isn't always done by Facebook employees

Facebook and other social media giants outsource much of the work to firms in places like the Philippines.

These curators ─ often young university students ─ are not direct employees of Facebook.

Ultimately, sites are subject to the laws of the countries in which they operate, which can force them to take action at governments’ behest.

In January, Facebook solicited the services of a German company following outrage from the German government over anti-foreigner comments on the platform. A hundred moderators were hired to filter the high volume of content deemed bullying or racist in the wake of the 2015 influx of refugees into Europe.

It's not just Kashmir

Criticism of Facebook’s content policies and sanctioning of posts on the network is not new.

Activists and protesters posting about the conflict in Palestine or those behind the ‘Black Lives Matter’ movement in the United States are no strangers to the fact that posting with a certain point of view can have repercussions ─ including arrest.

Just this month, Israel’s Public Security Minister, Gilad Erdan, called Facebook a “monster”, blaming the website for failing to police content the Israeli government deems hateful.

The Israeli government blamed Facebook for street attacks by Palestinians which it claimed were being incited through online posts.

However, Facebook has also carried out what can be called positive discrimination.

Mark Zuckerberg launched an investigation this year after it was revealed that editors suppressed content with a conservative bent while giving subjects such as ‘Black Lives Matter’ greater prominence in the network’s ‘Trending Topics’.

In conclusion: what is allowed to stay up is determined by a combination of Facebook’s algorithms, its moderators half a world away with limited context or knowledge of the conflict, and the sheer volume of content ─ as well as of those reporting it.

A prime case, perhaps, of a chain of command following orders missing the forest for the trees.
