Yes, there is a cure for bullshit
Bullshit’s no laughing matter. Climate denialism bullshit, for example, is harmful. Misinformation about SARS-CoV-2 clearly cost lives. In fact, the biologist Carl Bergstrom, while watching the pandemic unfold, argued that “detecting bullshit” should be a top scientific priority. In 2020, Bergstrom coauthored a book called Calling Bullshit: The Art of Skepticism in a Data-Driven World. In their preface, he and his coauthor paid respect to the philosopher Harry Frankfurt, who died on Sunday at the age of 94. Frankfurt, they noted, “recognized that the ubiquity of bullshit is a defining characteristic of our time.”
Frankfurt, the author of the surprise 2005 bestseller On Bullshit, maintained that bullshit isn’t the same thing as a lie. The bullshitter is indifferent to the facts. They’re just “bullshitting,” as we say, often in order to persuade others to go along with something, like a plan. But the liar deceives knowing what’s true and obscures it, with language or charts and figures. The good news is that we don’t have to resign ourselves to observing the spread of bullshit—or lies.
In a new study published in Nature Human Behaviour, researchers came away optimistic about efforts to combat bullshit about COVID-19, which continues apace. They ran an experiment involving over 34,000 people from 16 countries on six continents to find out whether a set of interventions could help people across a variety of cultures avoid believing and sharing falsehoods. Turns out, the interventions do help. Reminders to think about accuracy, tips on digital literacy, and effective crowd-sourced accuracy ratings improve the information hygiene of people around the world. The researchers found “striking regularities in both the underlying psychology of misinformation and the effectiveness of interventions to combat it.”
The participants in the study read 20 headlines related to the pandemic, half of them true, the other half false. In one condition, they rated how accurate the headlines were, and in the other, they gauged how likely it would be for them to share the headline on social media. When the researchers prompted some of the subjects to gauge the accuracy of a headline unrelated to COVID-19 before reading a false COVID-19-related headline, those subjects were less likely to say that they would share it. “Together with a field experiment conducted with mostly users from the United States,” the researchers write, “our findings suggest that [social media] platforms could reduce the spread of certain forms of misinformation in many parts of the world by nudging users to attend to accuracy.”
What sorts of people worldwide tend to believe and spread COVID bullshit, and which sorts of people do not? The researchers found consistent cross-cultural evidence that discerning what’s true is associated with analytic thinking, a motivation for accuracy, and support for democracy. The senior authors on the paper—MIT psychologist David Rand and Cornell University psychologist Gordon Pennycook—had explored this territory before. In their 2019 paper on “bullshit receptivity,” they examined the psychological profiles of people who fall for false stories and found that they tend to be “overly accepting of weak claims” in general.
Of course, bullshit about COVID-19 isn’t the only major kind of misleading information with which we now have to contend. We’re learning to beware the bullshit that fancy chatbots hallucinate. The problem has kept Frankfurt’s On Bullshit—originally and obscurely published as a paper in 1986—“unnervingly relevant,” according to the Financial Times. The philosopher of AI Raphaël Millière saw this coming back in 2020, arguing that, with the advent of GPT-3, we are entering “the next level of bullshit,” where bullshit—as Frankfurt conceived it—is no longer a hallmark of just human speech.
I like to think that the wry philosopher would have been amused that he lived to witness the advent of a new form of consequential bullshit: not just from people indifferent to the truth—but from machines fundamentally incapable of caring about what’s true.
This article originally appeared on Nautilus, a science and culture magazine for curious readers.