The decision last week by Meta, the parent company of Facebook, Instagram, and Threads, to immediately stop using “fact checkers” — groups hired by Meta to determine what information is true and what is false, and thus what should be removed — represents not just a return to common sense but also good news for both science and democracy.
Of course, I have an interest in Meta’s decision, having been one of the experts whose writings were regularly prohibited from being shared on Facebook based on its content moderation system. That system includes both fact-checkers and automated filtering, which is also being recalibrated to allow more speech.
I am a professor and researcher who studies and writes on contested topics, including climate change, transgender athletes, and the origins of COVID-19, at that messy place where science meets politics.
My work on climate change is what Facebook routinely deleted. I am a curious candidate for such aggressive moderation. I have studied climate change science and policy for more than 30 years and published extensively in peer-reviewed literature. My research is widely referenced by the Intergovernmental Panel on Climate Change, a highly respected United Nations body that periodically assesses science related to climate change.
At the same time, I have also taken heat from climate activists for calling out their frequently overheated claims about disasters that go well beyond what science can support. Perhaps my greatest sin is that I have consistently argued that while climate change is real and serious, other issues matter for climate policy as well, including energy costs, security, and access.
These views, and my support for including fossil fuel interests (representing more than 80 percent of global and US energy use) in climate policy discussions, may have put my writings on the moderated list. Of course, it is possible that my work was among the 20 percent of removed content that Meta now admits was mistakenly taken down. No one at Meta has ever told me why.
Regardless of the reasons, my writings were a tiny part of the millions of pieces of content that Meta removed every day, according to its announcement last week.
I can see the political appeal of removing unwelcome or uncomfortable views from the public sphere. And to be fair, there are some truly abhorrent and egregious false claims that make their way into public discourse, easily amplified and spread by today’s social media.
Meta learned, however, that even well-intentioned efforts to police extreme views were readily co-opted by its fact-checkers to silence political views that they did not welcome.
“Experts, like everyone else, have their own biases and perspectives,” Joel Kaplan, Meta’s chief global affairs officer, explained. “This showed up in the choices some made about what to fact check and how. Over time we ended up with too much content being fact checked that people would understand to be legitimate political speech and debate.”
“A program intended to inform too often became a tool to censor,” he acknowledged.
Meta appears to have come to appreciate that science and democracy share an important characteristic: Both are improved by self-correction over time. In both cases, broad participation is a feature, not a flaw.
More than 50 years ago, sociologist Robert K. Merton articulated central norms of science. Among them: “The substantive findings of science are a product of social collaboration.”
Such collaboration necessarily involves nonspecialists among the public, as well as credentialed experts. Science works because claims are subject to critique, verification, replication, and overturning.
Everyone is free to contest scientific claims, even as we rely on experts to develop authoritative and legitimate scientific understandings. Albert Einstein famously worked as a patent clerk when he first published his pathbreaking work on special relativity.
That scientists routinely disagree and contest one another’s work out in the open helps give us confidence that the collective knowledge that results is trustworthy and today’s best possible approximation to truth — while recognizing that further work and debate could overturn current understandings.
Similarly, democracy — however implemented in its particulars — in principle gives everyone an opportunity to participate in politics, and thereby a stake in the process. When voices have been excluded, or some have held disproportionate privilege, the long history of US democracy shows efforts to correct such situations by expanding participation.
Meta’s efforts to police discourse reflect a fundamental paradox of both science and democracy. Developing robust, legitimate understandings and outcomes requires that we open up space for broad participation, which opens the door to those who are manifestly incorrect, even malevolent.
Efforts by experts or other authorities to police discourse can undermine the effectiveness and the legitimacy of both science and democracy.
Science and democracy are robust enough to withstand misinformation and bad actors — if, and it is a big if, we are collectively committed to engaging in that difficult, often uncomfortable process of working together toward truths and shared interests.
Along the way, we will find lots to disagree with one another about. But if we are going to live together, we must begin with a shared commitment to try.
Roger Pielke Jr. is a senior fellow at the American Enterprise Institute and a professor emeritus in the College of Arts and Sciences at the University of Colorado Boulder. This column was adapted from a post on his Substack newsletter, “The Honest Broker.”