Who decided to uphold Trump’s Facebook ban?

The Facebook Oversight Board deals with free expression on the platform

The Facebook Oversight Board must help Facebook decide what content to take down and what to leave up. Photograph: iStock


Facebook’s Oversight Board has decided Donald Trump’s ban from the social media platform, which was handed down in January, should stand. However, Facebook must review the “indefinite” suspension within six months, and could ultimately delete the accounts.

But who exactly is making these decisions?

What exactly is the Facebook Oversight Board?

The group has a specific purpose: to help Facebook decide what content to take down and what to leave up.

It was announced in 2018 by Facebook chief executive Mark Zuckerberg, and its founding members were named in May 2020.

The board has two main functions. It issues advisory opinions on Facebook’s content policies, and it also issues binding decisions on specific pieces of content. Facebook has to stick by the decisions the board makes on the latter, unless to do so would be illegal.

Who is on it?

Currently, the website for the Oversight Board lists 20 board members, but when fully staffed it will consist of 40 members. Among those making decisions are human rights experts, former heads of media, and legal experts from all over the world. Former editor-in-chief of the Guardian newspaper Alan Rusbridger, PEN America chief executive Suzanne Nossel and Nighat Dad, the founder of the Digital Rights Foundation, are three of the founding members.

How does it work?

Funded by Facebook, the Oversight Board is designed to deal with the questions of free expression and human rights on Facebook and Instagram, dealing with what it refers to as “emblematic cases”. In other words, it is not going to get involved in a dispute between you and your aunt about those awful childhood photos she insists on uploading and tagging you in.

If you feel content has been removed unfairly, you can appeal your case to the board after exhausting Facebook and Instagram’s review process. In all, more than 150,000 cases have been appealed to the board, but it prioritises cases that have the most impact on users, are important to public discourse or raise important questions about Facebook’s policies.

Is it really independent?

The board has its critics, who believe that its powers are too limited and won’t tackle the real issues at the heart of social media.

That may be true – the board can only work with cases that are referred to it, for example, rather than proactively searching out incidents. But neither has it backed all of Facebook’s decisions on content. In its first five decisions, the board overturned four of Facebook’s original judgments. So take from that what you will.

What decisions has it made in the past?

Among the cases appealed to the board was the removal of photographs from a breast cancer awareness campaign, which Facebook later reinstated but the board ruled on anyway. Another case involved content criticising the lack of authorisation for hydroxychloroquine and azithromycin to treat Covid-19. Although Facebook removed the post under its rules on spreading misinformation around Covid-19, the board reversed that decision and said Facebook should correct the misinformation rather than remove it.

The board has also dealt with appeals over content removed for hate speech, violence and incitement, and dangerous individuals and organisations.

And now?

The board published its decision on the indefinite suspension of Donald Trump from the social media platform on Wednesday, upholding the social network’s decision to suspend his accounts. Facebook took action against the 45th president of the US after the storming of the Capitol on January 6th, removing his accounts from Instagram and Facebook. That case was referred to the board by Facebook.

However, there was also a slap on the wrist for the social network, due to the “indefinite” nature of the punishment. Facebook’s own rules don’t allow for an indefinite suspension, so within six months the company must review the matter and either impose a new time-limited punishment that reflects its rules and the severity of the offence, or delete the accounts. “Facebook cannot make up the rules as it goes, and anyone concerned about its power should be concerned about allowing this,” the board said, posting the decision to Twitter.