
Op-Ed: Banning Trump from Facebook may feel good. Here’s why it might be wrong

Photo: Facebook’s Mark Zuckerberg last year created an oversight board to review the company’s decisions to remove controversial content and accounts from its platform. (Trent Nelson / Salt Lake Tribune)

Some call it the Supreme Court of Facebook. Last year Facebook established its Oversight Board, a global group of 40 legal and policy experts, with the power to issue binding decisions on whether to reverse Facebook’s removal of controversial content.

Eager to fend off government regulation of his empire, Mark Zuckerberg, Facebook’s CEO, created the board to demonstrate that the platform could police itself. By delegating fraught decisions that balance free speech and public safety concerns to impartial experts, he hoped to show his commitment to the public good and not just to private profit.

On Thursday, the board took up its biggest case so far: reviewing Facebook’s decision to ban former President Trump from the platform indefinitely after the Capitol Hill insurrection. To many, the abrupt end to four years of all-caps baiting, berating and bluster by Trump came as a soothing relief. But the silencing of a president on his preferred platforms is also a testament to the staggering influence of Facebook and a few other media companies over public discourse.


The board’s review of the Trump case will test the promise of enlightened self-regulation. A transparent decision that is grounded in human rights principles and generalizable globally may help assuage fears of a corporate stranglehold over the marketplace of ideas. And that could reduce pressure for legislative intervention — like repealing the legal shield digital platforms currently have from liability for user-posted content.

No one disputes that Facebook is free to shut down accounts; the 1st Amendment protects individuals from restrictions on speech by the government, not by private companies. Facebook maintains an extensive, evolving set of “community standards” governing speech. It prohibits threats, scams, bullying and other types of offensive content.

Over time, though, the company has bent its rules, sometimes to accommodate Trump. It left up the president’s declaration that “when the looting starts, the shooting starts” during last summer’s racial justice protests, even though those comments probably violated Facebook’s rules. When criticized for allowing inflammatory posts, Zuckerberg has often justified his decisions on free speech grounds. But some critics have noted more self-serving motives: keeping Trump on the platform allowed Facebook to continue to sell ads to his 33 million followers.


After more than four years of resisting demands to boot Trump, Zuckerberg announced after the Capitol Hill rampage, “We believe the risks of allowing the President to continue to use our service during this period are simply too great.” He cited no specific rule-breaking posts. The absence of a clearly articulated justification for Trump’s deplatforming prompted debate over whether the risks cited were to public safety and democracy or simply to Facebook’s bottom line. The ouster helped deflect, perhaps temporarily, accusations that Facebook had played a role in fomenting the violent assault through its algorithmic propulsion of lies and conspiracy theories.

Trump’s removal has intensified fears that Silicon Valley exercises too much control over speech and public discourse. Russian opposition leader Alexei Navalny, now jailed in Moscow, expressed alarm that Trump’s silencing “will be exploited by the enemies of freedom of speech” around the world. “Every time when they need to silence someone, they will say: ‘This is just common practice, even Trump got blocked.’” German Chancellor Angela Merkel also voiced concern about the ban being a “problematic” infringement on the “fundamental right” of freedom of opinion.

The Facebook Oversight Board’s decision on the Trump case — expected in late April — will show whether that expulsion can be justified by something other than an impulse to appease angry users and butter up a new administration.


To instill confidence that social media can be self-policed, the board’s deliberations should satisfy three criteria: transparency, adherence to international human rights principles and generalizability. For transparency, the board will have to spell out in detail what arguments and evidence were considered and exactly how it arrived at its decision. It should also make clear whether its reasoning would have applied to calls to bar Trump before Jan. 6, including during the campaign when such a ban would surely have fed claims of political bias.

If the board affirms the Trump ban, it will have to explain how that action conforms with the International Covenant on Civil and Political Rights, which requires that limits on free speech be clearly designated and necessary to achieve a legitimate aim. To satisfy this standard, the board will need to identify Trump’s posts that violated specific community standards. It should also explain why other remedies — such as warnings, algorithmic demotions (making his posts less likely to rise to the top of users’ feeds) and fact-checking corrections — were insufficient. And if Facebook’s aim was to prevent the violent overturning of a democratic election, the board must determine whether the ban should end now that Trump has left office.

The generalizability of the rationale for the Trump ban may pose the biggest challenge. While Trump was unique, his manipulation of social media helped refine a playbook now used by authoritarians around the world. Dissidents have pointed out that a ban on Trump in the name of curbing “domestic terrorism” will play into the hands of autocrats who discredit their political opposition using precisely that argument. The board will need to consider questions of consistency and spell out how its reasoning would apply to other influencers who stoke violence. If Trump’s ban was justified by the Jan. 6 assault, how does that differ from the social media blackouts and other forms of censorship decreed by authoritarians under cover of a national emergency?

By creating the Oversight Board, Facebook is trying to quell alarm over the platform’s expanding power. The board will now have to prove it can answer thorny questions that have divided politicians and the public. And most importantly, it needs to show that it is unafraid of overruling Zuckerberg himself if that’s where the facts and arguments lead.

Suzanne Nossel is chief executive of PEN America and author of “Dare to Speak: Defending Free Speech for All.”
