Facebook’s Oversight Board will soon make its most important decision yet: whether or not Donald Trump’s “indefinite suspension” from Facebook and Instagram should be lifted.
The move will be the biggest test yet for the Oversight Board, Facebook’s most ambitious attempt to prove it can self-regulate. The Trump decision will also likely shape public perception of the organization, which has so far issued fewer than a dozen decisions.
But the Oversight Board, which has been widely described as “Facebook’s Supreme Court,” was set up to deal with more than just Trump. The Facebook-funded organization is meant to help the social network navigate its most complicated and controversial content decisions around the world. It could also end up influencing wider Facebook policies – if the company allows it.
Facebook’s “Supreme Court”
The board itself has been functional for less than a year, but the organization actually dates back to 2018. It was then that a Harvard law professor and longtime friend of Facebook COO Sheryl Sandberg reportedly proposed that Facebook create some sort of “Supreme Court” for its most controversial content moderation rulings. That idea formed the basis of what is now the Oversight Board.
According to Facebook, the Oversight Board is supposed to be completely independent. But the social media company provided its initial funding of $130 million – meant to last for six years – and helped choose the board’s first members. Throughout the process, Mark Zuckerberg was “heavily involved” in the creation of the board, according to The New Yorker’s report on the organization’s origins and early days.
For its part, the Oversight Board has gone out of its way to emphasize its independence. Its head of public policy, Rachel Wolbers, even recently suggested that the board may one day weigh in on content moderation decisions for other platforms. “We hope we’re going to do such a good job that other companies might need our help,” she said during an appearance at SXSW.
So far, the board has 19 members from around the world (there were originally 20, but one left in February to join the Department of Justice). Eventually, it will grow to 40 members, although the exact number is allowed to “increase or decrease in size as appropriate.”
Its first members include Alan Rusbridger, former editor of The Guardian; Helle Thorning-Schmidt, former Prime Minister of Denmark; and John Samples, vice president of the libertarian Cato Institute. According to the board, all of its members have experience with, or are advocates for, human rights. And all members are paid for their part-time work with the organization.
However, unlike the actual Supreme Court, the Oversight Board comes with term limits: members can serve a maximum of three three-year terms.
How the Oversight Board works
Facebook removes thousands of posts every day, but only a tiny fraction of those removals will ever become official Oversight Board cases. For those that do, there are several ways a case can reach the board.
When Facebook removes a post for breaking its rules, users have the option to appeal the decision. Sometimes these appeals prompt Facebook to reverse its decision. But in cases where Facebook chooses not to restore a piece of content, users can turn to the Oversight Board as a last resort. Even then, filing an appeal doesn’t guarantee that the board will take up the case: out of more than 300,000 appeals received, only 11 cases have been selected.
This week, Facebook announced that it would expand the types of content the board can weigh in on by allowing users to make a different kind of appeal. Instead of contesting content that Facebook has removed, users will now also have the option to appeal content that the company has chosen to leave up.
As part of this process, users must first go through Facebook’s reporting flow. If the company ultimately decides to leave the reported post up, it will alert the user who reported it and provide a reference ID that allows them to appeal to the Oversight Board. A notable difference from the takedown appeals process is that the same post can be appealed by multiple users in these “leave up” cases.
Finally, Facebook’s policy teams can also refer “important and difficult” decisions directly to the board without waiting for any appeals process to play out. Trump’s suspension was one of these. Earlier, the company brought the board a case involving COVID-19 health misinformation; the board ultimately overturned Facebook’s decision to remove a post criticizing the French government’s stance on COVID-19 treatments.
Once the board has made a decision, Facebook is obligated to implement it. The company has gone to great lengths to stress that no one at Facebook can overrule the board. At the same time, Facebook is only required to implement the board’s rulings on the specific cases it decides, although the company says it will endeavor to apply each decision to “identical content with parallel context.”
Still, the board can exert some influence over the social network’s underlying policies – at least in theory. Alongside each take-down/leave-up ruling, the board also reviews the company’s rules and can make its own policy recommendations. Facebook is obligated to respond to these recommendations but, crucially, is not obligated to follow them.
So while the board may wield considerable power in specific cases, such as the upcoming Trump decision, Facebook still has the final say over its own policies. That has led advocacy groups and other organizations to argue that a board charged with “oversight” should also be able to influence other big issues, like Facebook’s ad policies and algorithms.
What the board has done so far
So far, the board has ruled on just seven cases, overturning Facebook’s initial decisions in five of them. (There has been speculation that the board may be inclined to restore Trump’s account, but so far it has given no indication of how it will rule.)
Tellingly, the board has called some of Facebook’s content policies “inappropriately vague” or not clear enough to users. And many of its initial recommendations encouraged the company to communicate more clearly with users. Likewise, the board has shown some skepticism about Facebook’s use of automation in moderation decisions, and has said users should know when a post is removed as a result of automated detection.
However, the influence the board will have on Facebook’s broader policies is less clear. Facebook recently issued its first set of responses to the Oversight Board’s policy recommendations, and its commitments were somewhat mixed. In a few areas, the company made notable changes: it agreed to clarify its nudity policy for Instagram, for example, and it committed to better explaining its policies on vaccine misinformation.
In other areas, Facebook’s responses were more cautious. The company made several vague commitments to increase “transparency,” but offered few details. In response to other recommendations, the company simply said it was “assessing the feasibility” of the changes.
Regarding the Trump decision, the board has suggested that it will also weigh in on Facebook’s underlying policies. But again, the social network is under no obligation to implement policy changes.
What we do know is that the board is already treating the Trump case differently from its other cases. Days before its original 90-day deadline, the board announced it was delaying its decision, citing the more than 9,000 public comments it had received. The ruling is now expected “in the coming weeks.”