Donald Trump wants back on Facebook, and he’s willing to go to court over it. Except in this case, the court is a strange kind of judicial authority: the Facebook Oversight Board, a Facebook-funded body that promises to independently adjudicate cases of import to the platform’s rules and standards. (Lately, it has been debating the acceptability of nipples appearing in photos about breast cancer.) According to The New York Times, the Oversight Board—composed of a distinguished roster of scholars, activists, and politicians—will decide Trump’s Facebook fate in the coming months. Whatever decision it makes will be binding.
As the Times’ Ben Smith summarized it: “The decision has major consequences not just for American politics, but also for the way in which social media is regulated, and for the possible emergence of a new kind of transnational corporate power at a moment when almost no power seems legitimate.”
If Facebook is serious about applying its rules—some of which are published, some of which aren’t—to all of its users, then the company was right to suspend Trump’s account, which had become a purveyor of disinformation and violent incitement long before the January 6 Capitol riot. But Smith’s point also stands: In the absence of any definitive governmental authority, an organization that, its critics charge, offers only a fig leaf of independent judgment is going to decide far-reaching standards that could affect billions of people’s speech rights. It’s an awesome—and potentially authoritarian—responsibility for any entity, much less a globe-spanning tech company controlled by one person.
On the corporate side, Facebook has portrayed itself as improbably reluctant to wield so much power—a sort of heavy-is-the-head-that-wears-the-crown, woe-is-me posture. “Many are uncomfortable with the idea that tech companies have the power to ban elected leaders,” tweeted Nick Clegg, the British politician turned Facebook communications executive. “We agree. These decisions should be made according to frameworks agreed by democratically accountable lawmakers. But in the absence of such laws, there are decisions we cannot duck.”
It’s hard to shake a sense of disbelief about these comments. It’s possible that Facebook would welcome a light regulatory framework to help decide these cases, but the company has also spent millions currying influence in Washington to avoid antitrust and other regulatory actions. It may now hope that publicly calling for some higher power to step in will help it avoid a more severe fate, like a company breakup. A similar motivation underlies the establishment of the Oversight Board: It’s a way to launder Facebook’s responsibility for these issues, to pass the buck to someone else, whether a board of judges or the law itself.
The Real Facebook Oversight Board, an organization launched by Observer journalist Carole Cadwalladr that counts a number of prominent Facebook critics among its members, has denounced its counterpart as a tool of Facebook’s executive leadership. It has also criticized the timing of the suspension, complaining that Trump was banned only after helping to incite a deadly uprising.
In a statement, the Real Facebook Oversight Board said, “Whether or not Trump is banned for good, the real question needs to be: What is Facebook doing to keep hateful and violent content off their platforms to begin with?”
The Trump case may set an important precedent, if nothing else, in establishing the legitimacy of Facebook’s court system. But it also risks obscuring, as the Real Facebook Oversight Board suggests, the many other abuses of the Facebook platform, including by authoritarian leaders abroad. Writing in the tech publication Rest of World, Alaphia Zoyab pointed out that several Indian politicians have been found, in an official government investigation, to be responsible for inciting deadly anti-Muslim violence. Yet they retain their Facebook accounts, and with them the ability to use the platform as a megaphone for incitement and bigotry. “If these companies can silence a sitting United States President,” asked Zoyab, “why can’t they curb politicians with far less power?”
One objection to all this is that it constitutes a new form of censorship. Undoubtedly this is a tangled skein of issues, ranging from corporations’ right to regulate speech on their platforms to the importance of politicians being able to communicate with constituents. There are overlapping, sometimes competing interests at play. But one early study found that social media misinformation plummeted after Trump’s ban—a result of a disinformation “superspreader” being put out of business. What’s more, years of reporting have shown that Facebook in fact devotes too few resources to content moderation; that it lacks language specialists in countries like Burma and Sri Lanka that have been sites of political violence; and that its tendency to delete material at governments’ request most affects journalists and democracy activists in places like Turkey, the Philippines, and Palestine.
A reframing is in order. Should he desire it, Trump will have access to as much media coverage as he wants for as long as he lives. He merely has to pick up the phone. What he won’t have is the ability to use Facebook to commit harm at scale, to impinge on the speech rights of others by fostering a threatening climate of extremism and disinformation. As for Trump’s personal rights, there is as yet no legal right to a Facebook account. And while there will be missteps and overreach and Facebook’s usual bungling from crisis to crisis, the platform is a better place if a few of its most egregious offenders face consequences for their actions.
This progress could all be undone by the Oversight Board’s eventual ruling. The board’s Facebook funding alone makes its independence look like a facade, as critics claim. (Board members are paid six figures for about 15 hours of work per week, according to the Times.) Yet beyond the original suspension of Trump, Facebook has done little publicly to signal how it would like the board to rule.
In some respects, Nick Clegg was right. Facebook should never have had this responsibility. But the problem goes far beyond Facebook dumping some important decision-making on an outside body of dignitaries. For years, Facebook traded everything for scale as it focused on connecting vast populations, with little regard for what happens next (Here’s a militia group you may be interested in!). It created a monster, declared it the public sphere, and made an obscene amount of money from it. Forget containing Trump: How do you fix such a thing without killing the monster?