Facebook’s Oversight Board is often described as a “Supreme Court” for Facebook. On Wednesday, it acted like one: it issued a narrowly tailored decision that pushes the hardest question before it back to Mark Zuckerberg.
The question before the board, in case you hadn’t turned on the news or checked Twitter this week, was whether to uphold Facebook’s indefinite ban of Donald Trump’s account for his role in inciting the January 6 riot at the US Capitol. It was, by far, the most anticipated decision in the Oversight Board’s young existence. Since the company referred the case to the board on January 21, it has received more than 9,000 public comments on the matter. As of Wednesday, Trump’s ban remains in place – but the decision is still not final.
Specifically, Facebook asked the Oversight Board to decide:
Considering Facebook’s values, specifically its commitment to voice and safety, did it correctly decide on January 7, 2021, to prohibit Donald J. Trump’s access to posting content on Facebook and Instagram for an indefinite amount of time?
The board’s answer was yes – and no. Yes, Facebook was right to suspend Trump’s account; no, it was wrong to do so indefinitely. “In applying a vague, standardless penalty and then referring this case to the Board to resolve, Facebook seeks to avoid its responsibilities,” the board wrote in its decision. “The Board declines Facebook’s request and insists that Facebook apply and justify a defined penalty.” In other words, Facebook must decide whether to let Trump return immediately, set a clear end date for his suspension, or kick him off its platforms forever.
While the board criticized Facebook for refusing to take a clearer stance, it also endorsed the immediate logic of the takedown. The original decision to suspend Trump’s account was made under extraordinary circumstances. While the violent attack on the US Capitol was still underway, Trump published a series of posts, including a video in which he told his supporters to go home – but in which he also repeated the false claim that the election had been stolen, the very idea that had motivated his supporters to riot. “This was a fraudulent election, but we can’t play into the hands of these people,” he said in the video. “We have to have peace. So go home. We love you. You’re very special.” By the next day, Facebook had removed the posts and suspended Trump entirely from its main platform, as well as from Instagram and WhatsApp. (Twitter and YouTube did the same.)
It was clear from the start that the content of the offending posts was far from Trump’s most egregious – after all, he was at least telling rioters to go home – and it did not obviously violate any specific rule. Trump had been using Facebook to spread the myth of a stolen election for months. What had changed, therefore, was not Trump’s online behavior but its offline consequences. In a blog post explaining Facebook’s decision, Mark Zuckerberg acknowledged as much. “We removed these statements yesterday because we judged that their effect – and likely their intent – would be to provoke further violence,” he wrote. While the platform had previously tolerated Trump, “the current context is now fundamentally different, involving use of our platform to incite violent insurrection against a democratically elected government.” Trump would remain banned “indefinitely and for at least the next two weeks until the peaceful transition of power is complete.”
The move was a stark departure from Facebook’s normal approach to moderation in two ways. First, the company explicitly weighed not just the content of the posts but the real-world context around them. Second, it set aside its “newsworthiness” policy, which generally gives political leaders extra leeway to break the rules on the theory that the public deserves to hear what they have to say.