
Facebook’s Oversight Board plays it safe

December 3, 2020


This week saw the long-awaited public debut of Facebook’s Oversight Board, a group of twenty eminent lawyers, human rights experts, politicians, and journalists who will superintend an appeals process for users seeking to have posts removed by Facebook reinstated. The Oversight Board announced the first six cases it will hear, a small drop in a vast ocean of millions of moderation decisions taken by the platform. Though the announcement came on the heels of a highly contentious period for Facebook, weeks after the company publicly struggled to adjust its flagging and takedown procedures amid a US election dominated by disinformation, the chosen cases steer clear of US controversies altogether while overlooking far more contentious incidents of political speech suppression around the world.

The Oversight Board is being sold as a brain trust of impressive people who will contemplate the “hard cases” of expunged content. While the daily reality of content moderation is vast numbers of underpaid workers screening out traumatic content at overwhelming scale, the public image of the operation is now one of lofty, well-intentioned intellectual deliberation about nudity and Nazis.

The remit of the Oversight Board is designed to be restrictively narrow: the board can take up only appeals against the removal of content (meaning it cannot look at cases where disputed material is left up rather than taken down), and it cannot review cases that are not appealed. Despite Facebook’s professed commitment to transparency, confidentiality is a key priority for the company when it comes to the inner workings of the Oversight Board. Members of the board cannot discuss their work except through authorized public relations channels.

In fact, rather surprisingly, the board’s code of conduct singles out contact with government officials: “Board members, staff, and their immediate family will not interact with government officials (including the immediate family members of government officials) regarding their service on the board and/or the cases that they are reviewing.” This might be a necessary condition to guarantee independence from government interference, but it also seems to preclude the possibility of Oversight Board members speaking to regulatory committees, elected officials, or members of government who have a legitimate interest in issues of speech and human rights.  

Starting this week, the members of the board have ninety days to reach conclusions in the six inaugural cases (selected out of twenty thousand), providing guidance for similar dilemmas as they go. Among the questions they are contemplating: Should a set of Instagram posts that showed nipples, in contravention of Facebook’s no-nudity policy, be allowed, considering the photos were part of a breast cancer prevention campaign? Should historical quotes from Joseph Goebbels be allowed to circulate even if they violate Facebook’s “Dangerous Individuals and Organizations” policy? Should screenshots of tweets by the Malaysian prime minister stay up in order to raise awareness of his bigotry, or be removed because they violate Facebook’s hate-speech policy? Should posts criticizing the French government be reinstated despite their allusions to “alternative treatments” for COVID-19?

There were no conservative politicians, QAnon conspiracists, or anti-vaccine activists among the selected cases, meaning either that none of them complained about deletions or that the board wants to avoid clickbait cases in its first round of deliberations.


In today’s information ecosystem, technology platforms like Facebook are not just the arbiters of truth; they are also the setters of norms, the weather vanes of taste, and the guardrails of democracy. And, in an increasing number of places, they are the instruments of oppression. As such, perhaps the most striking feature of the board’s first set of cases is the lack of ambition in their subject matter.

If a panel of global experts really needs three months to decide whether it is acceptable to show a naked boob in pursuit of cancer prevention, then the Oversight Board’s hope of creating lasting impact is doomed from the outset. Issues of contextual nuance might make for interesting cases, but they are not “hard” in the way that, say, the mass removal of posts in compliance with repressive speech laws is hard. Yet cases concerning the latter are unlikely ever to reach the Oversight Board. In fact, Article 2 of the board’s charter, which defines the “scope” of the board’s activities, states: “In limited circumstances where the board’s decision on a case could result in criminal liability or regulatory sanctions, the board will not take the case for review.” In other words, if a removal complies with the law of a given country, it will not be reviewed.

On the same day the Facebook Oversight Board launched, Amnesty International published a damning report on how aggressive new censorship laws in Vietnam are stifling citizens, the free press, and activists—with the compliance of technology companies like Google and Facebook. Facebook reports a 983 percent increase in content restrictions in Vietnam since the tightening of laws in April, pushing the number of restricted and deleted posts up from 77 to 834 in the space of a year. 

The disappearance of hundreds of Facebook posts by freelance journalist Truong Chau Huu Danh (who is verified by Facebook and has 150,000 followers) is a “hard case.” But cases like this will not get the benefit of the considerable collective wisdom of the Oversight Board. For Facebook, the alternative to complying with restrictive speech laws would be to withdraw from the countries that enforce them. In Vietnam, this would mean putting $1 billion of annual revenue at risk.

Perhaps the Oversight Board’s members will push the boundaries of their remit to include substantial free-speech dilemmas like this one. However, the hardest decisions for Facebook will not be made by a toothless advisory council, but by the operational board of the company itself. Moderation decisions are at the heart of Facebook’s business. How long they can be held at arm’s length is uncertain.



About the Tow Center

The Tow Center for Digital Journalism at Columbia’s Graduate School of Journalism, a partner of CJR, is a research center exploring the ways in which technology is changing journalism, its practice, and its consumption, as we seek new ways to judge the reliability, standards, and credibility of information online.
