Facebook CEO Mark Zuckerberg testifies before Congress in October 2019. (Getty Images)

Facebook's content oversight board ruled Wednesday that the world's largest social network was justified in suspending Donald Trump amid concerns the former president could incite violence, but took exception to the open-ended nature of the penalty.

The board, which is tasked with reviewing some of Facebook’s most difficult content decisions, found the social media giant should keep Trump’s suspension in place but reconsider the length of time he was barred from the social network. The board told Facebook to complete its review within six months. 

“Trump’s posts during the Capitol riot severely violated Facebook’s rules and encouraged and legitimized violence,” the board tweeted about its decision. “The Board also found Facebook violated its own rules by imposing a suspension that was ‘indefinite.’ This penalty is not described in Facebook’s content policies. It has no clear criteria and gives Facebook total discretion on when to impose or lift it.”


The former president was kicked off Facebook and its Instagram photo-sharing service in the wake of the deadly Capitol Hill riot on Jan. 6. Other social networks, including Twitter, also took action against Trump, who used their services to fan doubt over the legitimacy of the 2020 presidential election.

The case has been closely watched because it highlights the difficult balance private social media companies need to strike when handling political speech by public figures. Facebook CEO Mark Zuckerberg made the unprecedented decision to ban Trump, while he was still in office, after he whipped up supporters as Congress gathered to certify Joe Biden's election as president. The risks of allowing Trump to continue posting were "simply too great," the Facebook boss said at the time.


Other social media networks, including Snapchat and Google-owned YouTube, have taken action against Trump to varying degrees. Twitter has permanently banned Trump from its platform. 

Facebook, which requested the oversight board review, followed the board’s decisions on its first slate of cases, which involved hate speech, incitement of violence and other thorny topics. The board is funded by Facebook but is described as independent.





Critics of Facebook, which was used by Russia to influence the 2016 presidential election, say it isn’t taking its responsibility seriously enough and don’t think the oversight board moves fast enough or goes far enough. A group of vocal critics has set up a shadow organization, which it calls the Real Facebook Oversight Board.

The group had urged Facebook's oversight board to keep the Trump ban in place. In a post earlier this week, the group said Trump should be "banned forever" for violating the social network's rules on hate speech and spreading disinformation. The group characterized the Trump decision, which hadn't yet been released, as a public relations "stunt" meant to deflect attention from the harms the social network causes.

Here’s what you need to know about Facebook’s oversight board:

Sounds like this board will have a lot of responsibility. What can it do?

Let’s get something straight: The oversight board doesn’t do the same job as content moderators, who make decisions on whether individual posts to Facebook comply with the social network’s rules. The board exists to support the “right to free expression” of Facebook’s 2.85 billion users.  

The board functions a lot like a court, which isn’t surprising given that a Harvard law professor came up with the idea. Users who believe content moderators have removed their posts improperly can appeal to the board for a second opinion. If the board sides with the user, Facebook must restore the post. Facebook can also refer cases to the board. 

The oversight board can also make suggestions for changes to Facebook’s policies. Over time, those recommendations could affect what users are allowed to post, which could make content moderation easier. 

Why does Facebook need an oversight board in the first place? 

Facebook gets criticized by just about everybody for just about every decision it makes. Conservatives say the company, like the rest of Silicon Valley, is biased against their views. They point to bans of right-wing provocateurs Alex Jones and Milo Yiannopoulos to support their case.

The social network doesn’t get much love from progressives, either. They complain Facebook has become a toxic swamp of racist, sexist and misleading speech. Some progressive groups underlined their concerns last summer by calling on companies to avoid advertising on Facebook and publicizing the boycott with the hashtag #StopHateForProfit.


The oversight board can help Facebook deal with those complaints while lending credibility to the social network’s community standards, a code of conduct that prohibits hate speech, child nudity and a host of other offensive content. By letting an independent board guide decisions about this content, Facebook hopes it’ll develop a more consistent application of its rules, which in the past have generated complaints for appearing arbitrary. 

One example: Facebook’s 2016 removal of an iconic Vietnam War photo that shows a naked girl fleeing a napalm attack. The company defended the removal, saying the Pulitzer Prize winning image violated its rules on child nudity. It reversed its decision shortly afterward as global criticism mounted. 

Got it. But why does Facebook need an independent organization? 

It’s no secret that Facebook has a trust problem. Regulators, politicians and the public all question whether the decisions the company makes serve its users or itself. Making the board independent of Facebook should, the company reckons, give people confidence that its decisions are being made on the merits of the situation, not on the basis of the company’s interests. 

OK. So who has Facebook chosen to be on this board?

Last year, Facebook named the first 20 members of the board, a lineup that includes former judges and current lawyers, as well as professors and journalists. It also includes a former prime minister and a Nobel Peace Prize winner. The board could eventually be expanded to 40 people. The members have lived in nearly 30 countries and speak almost as many languages. About a quarter come from the US and Canada.

Serving on the board is a part-time job, with members paid through a multimillion-dollar trust. Board members will serve a three-year term. The board will have the power to select future members. It’ll hear cases in panels of five members chosen at random. 

Trump and conservatives were unhappy with the makeup of the board, which they saw as too liberal, according to The New Yorker. The former president even called Zuckerberg to express this sentiment, but Facebook didn’t change the board members.

Wait a minute. Facebook is paying the board? Is it really independent?

If you’re skeptical, we hear you. Facebook doesn’t have a great reputation for transparency.

That said, the charter establishing the board provides details of the efforts Facebook is taking to ensure the board’s independence. For example, the board isn’t a subsidiary of Facebook; it’s a separate entity with its own headquarters and staff. It maintains its own website (in 18 languages, if you count US and UK English separately) and its own Twitter account.

Still, when it comes to money, the board is indirectly funded by Facebook through a trust. Facebook is funding the trust to the tune of $130 million, which it estimates will cover years of expenses. 

Facebook says it’ll abide by the board’s decisions even in cases when it disagrees with a judgment. (The social network says the only exceptions would be decisions that would force it to violate the law, an unlikely occurrence given the legal background of many board members.)

The board will also try to keep Facebook accountable, publishing an annual report that’ll include a review of Facebook’s actions as a result of its decisions. 
