How powerful is Facebook’s ‘Supreme Court’ for speech?

Published May 12, 2020
Oversight board will issue rulings on what kind of posts will be allowed and what should be taken down. — AP

KARACHI: Last week, Facebook appointed 20 people from around the world to serve on what will effectively be the social media network’s “Supreme Court” for speech, issuing rulings on what kind of posts will be allowed and what should be taken down.

The list features a former prime minister, a Nobel Peace Prize laureate, and several constitutional law experts and rights advocates, including the Pakistani lawyer and founder of Digital Rights Foundation (DRF), Nighat Dad.

The creation of an oversight board by a social media company is a first for internet regulation; Ms Dad’s inclusion also puts Pakistan on the global tech map.

While the board sets a new model for accountability in content moderation, to what extent will it actually shape the company’s policies?

Selection of cases

The board will give users a chance to appeal against wrongful removal of their content on Facebook and Instagram. However, it will review only a fraction of those appeals, as users must first exhaust Facebook’s own appeals process before they can involve the board.

According to the board’s leadership, the panel will focus on the “most challenging” content issues for Facebook, including in areas such as hate speech, harassment and protecting people’s safety and privacy.


Besides user appeals, the board will also be able to hear cases referred directly by Facebook, namely those the company considers significant and difficult.

Significant, as defined in the bylaws, means that the content in question involves real-world impact and issues that are important for public discourse.

Difficult means the content raises questions about current policies or their enforcement, with strong arguments on both sides for either removing or leaving up the content under review. The board has sole discretion to accept or reject cases that are referred through this process.

Facebook has long faced criticism over high-profile content moderation failures, including the removal of pro-Kashmir posts and the spread of hate speech in Myanmar against the Rohingya and other Muslims.

Recently, the company added guidelines on pandemic-related content to its community standards. However, with the platform now relying largely on automated moderation, anti-vaccine activists and conspiracy theorists have already become adept at gaming its rules.

Unless Facebook’s policies and moderation improve, the board is unlikely to bring about major change, as its final decisions must be in accordance with Facebook’s community standards.

Another challenge limiting its efforts is the global scale at which it operates. Facebook said the board members chosen have collectively lived in more than 27 countries and speak at least 29 languages.

Globally, there are 2.5 billion people using the platform in more than 100 languages.

Regulation in Pakistan

It is important to mention that not all content can be submitted to the board for its review.

The board’s decisions will be binding “unless implementation could violate the law”, Facebook said.

This is the main reason the board’s creation is unlikely to change much for internet regulation in countries with repressive cyber laws, such as Pakistan.

During the first half of 2019, Pakistan reported the highest volume of content to Facebook for restriction (31 per cent).

In its transparency report, Facebook said it restricted 5,690 items within Pakistan. None of these items was removed for violating Facebook’s content policies; all were restricted under Pakistan’s cybercrime law.

The government has also introduced the Citizens Protection (Against Online Harm) Rules, 2020.

Under the new rules, social media platforms will be required to remove any ‘unlawful content’ pointed out to them in writing or by electronically signed email within 24 hours, and in emergency cases within six hours. With the rules in effect, if, for instance, the authority specifies 2,000 items to Facebook for removal, the platform will be required to comply fully.

“Ultimately, Facebook has to respect local law in every country it operates in, so governments are free to introduce laws and Facebook, and the board, would have to follow those laws,” a spokesperson for the oversight board told Dawn.

Account suspensions not included

Initially, the board will only review individual pieces of content, such as specific posts, photos, videos and comments on Facebook and Instagram.

The scope will expand in the future to include other kinds of content, for example content that has been left up, as well as pages, profiles, groups or events.

Last year, Facebook removed 103 pages, groups and accounts on both Facebook and Instagram as part of a network that originated in Pakistan. In a blog post on the takedown, Facebook said it had found that the network was linked to employees of the Pakistani military.

The spokesperson said the accounts and pages removed over ‘coordinated inauthentic behaviour’ (CIB) will not be reviewed by the panel for now, as Facebook has already partnered with “independent people” to review and document its CIB enforcement actions, and the removals were the outcome of weeks or months of investigation by its teams.

Dad’s role

According to the board, members do not represent individual countries when making decisions.

Each case identified by the board’s case-selection committee will be assigned to a five-member panel, four picked randomly from the board at large and one “from among those board members who are from the region which the content primarily affects”.

“A five-member panel deliberating over a case of content implicating Pakistan would include at least one board member from Central and South Asia, though this may not necessarily be Nighat Dad,” a board spokesperson told Dawn.

As part of vetting, new board members (including Ms Dad) are required to disclose any potential conflicts of interest, the board added.

Regarding Ms Dad’s advocacy in Pakistan, the spokesperson said the DRF founder will no longer advocate directly with Facebook to take specific policy positions, nor will she have an avenue for escalating content to the company as a digital rights activist.

“That said, others at Digital Rights Foundation (DRF) will remain engaged with Facebook — completely separate from Nighat and her work on the Oversight Board,” the OB representative added.

Published in Dawn, May 12th, 2020
