OVER the last two decades, the accountability of private social media companies has emerged as one of the defining, and most divisive, issues of our times. Powerful tech companies have far-reaching implications for free speech, discrimination, access, and wealth distribution. Lawmakers and civil society across the world have introduced a number of interventions to keep this power in check: stronger anti-competition regulation, demands for greater transparency, and compliance with international human rights standards.
Facebook’s newly created Independent Oversight Board is one such experiment to ensure greater accountability for private companies, designed to tackle some of the most complex and critical content moderation issues on Facebook and Instagram. Consisting of 20 members from across the globe and the spectrum of political opinion, it is an attempt to engage with criticism that the company has faced in the past and to implement UN guidelines on best practices for businesses in upholding human rights.
Some of the board members have, on a number of occasions, been critical of social media companies, and of Facebook in particular. These companies have accumulated a vast amount of power, and as we live through a truly global health crisis, the challenges posed by misinformation on social media platforms are clear for all to see. We are at a crossroads with social media and the place of private companies in our lives. Either we commit to working together, bringing together different voices and perspectives from around the world, to find a new model of content governance on social media that respects freedom of expression and aligns with international human rights norms, including the promotion of the voices of women and minority groups; or we sit by and watch as the online space becomes ever more lawless, enabling harassment, hate speech, and a gradual descent towards chaos.
A new board created by Facebook will tackle content moderation.
The Oversight Board is an attempt to address content moderation in a meaningful way. Sometimes the seemingly simplest questions are the most difficult to answer: what content should stay up? What should come down? Who is best placed to decide? And fundamentally, how do we develop rules and norms that answer these questions while respecting different cultures, values, and discourse?
These are the issues the Oversight Board is tasked with addressing. However, there is understandable scepticism as to whether it can answer these questions or meaningfully change how Facebook moderates content and the decisions it makes. Furthermore, the board seeks to tackle only the issue of content moderation; other criticisms of internet companies, such as the monopolisation of power and discriminatory practices, still need to be effectively addressed. The board is also an exercise in innovation, and it will take time to understand what works and what needs to improve. But, critically, the board does not work for Facebook, and its members are not employed by Facebook. This independence is reinforced by the fact that the board's decisions are binding: Facebook has to implement them.
The board will prioritise cases that are the most significant and have real-world impact, initially hearing appeals against decisions to remove content from Facebook and Instagram; in the near future, it will also hear appeals about content that has been left up by Facebook. The board will also make recommendations on policy, to which Facebook will be bound to respond publicly.
Collectively, as more and more societies become dependent on digital platforms, the task facing us is to build independent mechanisms to adjudicate content moderation disputes. This is no easy task: there is no blueprint, and thoughtful interventions will be needed to maintain the independence of the board as it seeks to speak truth to power rather than rubber-stamp private power.
The global make-up of the board is also reason to be hopeful. While no single body can hope to represent each and every viewpoint of the more than two billion Facebook users, representation and constant reflection can ensure that diversity, respect for the views of others, and the right to have those views heard guide our decision-making.
In the past, Facebook has faced rightful criticism on the issue of women's safety in online spaces, and the board needs to adopt approaches to free speech and content moderation that take structural oppressions and inequalities into account when making critical decisions. The Oversight Board plans to expand its membership to 40, and it is particularly important that the views of people in the Global South, not just the Global North, are heard, considered and respected as we seek to develop a new approach to content moderation and perhaps, in time, set a precedent and a model for other social media companies to follow.
The writer is a member of Facebook’s Independent Oversight Board.
Published in Dawn, May 11th, 2020