Facebook’s Star Chamber
Silicon Valley’s cynical grasp on the levers of speech threatens democratic norms.
The Facebook Oversight Board, Facebook’s arbiter of speech, recently upheld the platform’s ban on former president Donald Trump. While the Board agreed with Facebook that Trump had violated the company’s rules, it called the indefinite nature of the suspension “vague,” “standardless,” and “arbitrary,” directing the company to revisit the decision within six months.
Facebook’s own Oversight Board appears to have been confounded by what the platform’s users have known for years: the company makes up its content moderation rules as it goes, to suit its politics and preferences, with little regard for objectivity or America’s robust traditions of free speech.
Ultimately, however, the Board’s decision is a distraction from the larger issue at hand: Facebook’s belief, reiterated by the Oversight Board, that its authority outranks the speech of democratically elected leaders. That belief is underscored by the company’s refusal to treat such political speech as distinct. Indeed, the Board’s opinion noted that “it is not always useful to draw a firm distinction between political leaders and other influential users, recognizing that other users with large audiences can also contribute to serious risks of harm.”
In other words, whenever Facebook—not elected legislatures, not voters, not leaders themselves—decides that a country’s politics is getting too spicy, it can simply strip that leader of the ability to speak to the 3 billion monthly users on its platform, including the millions who may be deciding how to vote.
Some do not see this as consequential. Robby Soave of the libertarian website Reason called the reaction of many to Facebook’s decision a “meltdown” from people who need “a reality check.” Facebook is largely harmless, he contends, because the company cannot drop bombs on you or put you in jail or make you wear a mask. “The only thing Facebook can do is stop people from posting on Facebook,” he notes.
If only. Facebook cannot jail you, though if you’re unlucky enough to end up in jail, Facebook may decide your guilt or innocence for you. And the platform will gladly ban speech that contradicts or questions the government’s official narrative on Covid-19, including banning users for questioning the efficacy of mask wearing.
Far from being “a website for sharing pictures,” Facebook is now one of the world’s largest digital ad agencies. With its 3 billion monthly users, it is a central platform for the flow of news around the world and a hub for conducting business. The centrality of Facebook to disseminating information was evident in February, when the company shut down news sharing in Australia in retaliation for a law it didn’t like. Posts from the government, Covid-19 information from medical facilities, and key news bulletins about the bushfires raging in the country all vanished.
Could citizens looking for information still go to government websites and individual news sites to find it? Of course. But for millions, the organization of information and community flows around Facebook. Small towns across America use the platform to livestream school board meetings and community functions. Churches use Facebook to organize and share information. So do millions of small businesses for which the platform is an access point to the marketplace.
Facebook’s platform and digital advertising also play a critical role in political speech and elections. As Facebook executive Andy Bosworth put it in a memo leaked in 2020: “Is Facebook responsible for Donald Trump getting elected? I think the answer is yes, but . . . he didn’t get elected because of Russia or misinformation or Cambridge Analytica. He got elected because he ran the single best digital ad campaign I’ve ever seen from any advertiser. Period.”
How Facebook manages speech, then, is about far more than being able to “post on Facebook.” Under our noses, the company has made itself essential to commerce, communication, organization, and elections. Facebook’s thumb on the scale, in any direction, tilts the advertising and reach advantages of political candidates, constrains or amplifies access to the marketplace, and shapes how individuals receive information vital to their democratic decision making.
But rather than viewing itself this way—as a critical component of how countries speak, organize, and transact—Facebook clearly sees itself less as a facilitator of speech than as an arbiter of it. It places itself in an unaccountable position between voters and the speech of the candidates they elect, and above the laws passed by democratically elected legislatures. A core problem the company poses is its size. If Facebook weren’t so large, its restrictive policies wouldn’t matter so much; but size and the accompanying “network effect” are key to Facebook’s market dominance. The question of size is thus one the public will have to address sooner rather than later, before Facebook becomes too big to master.
The dispute over Big Tech often confronts us in individual spats about moderation and access. But what it really represents is a question of control. In free societies, the people should have final say over access to the public square. But unelected Silicon Valley plutocrats now exercise undue leverage over our digital marketplace of ideas.