Insights and Impact

3 Minutes on Conspiracy Theories 

Thomas Costello, psychology professor and cocreator of DebunkBot*, on the lure of misinformation—and how AI can help curb its spread


Conspiracy theories are just descriptive claims about the world. If someone contends that 9/11 wasn’t a terrorist attack perpetrated by al-Qaeda but was instead organized by the US government, they think there’s evidence supporting that claim.

Most people don’t test every hypothesis themselves; they rely on experts and trusted sources. So if they’re exposed to information that is wrong but seems plausible, like a conspiracy theory, they might believe it—especially if it resonates with their existing beliefs. People who don’t trust elites, government, and institutions are particularly prone to conspiratorial beliefs.

People can also get psychological value from believing in conspiracy theories. If someone is afraid and believes that the world is dangerous, random, and chaotic, it’s almost a comforting idea that there’s order in the world—even if that order is an evil secret government.

Another aspect of conspiratorial thinking is group allegiance. If members of the group to which a person belongs share a belief, they’re likely to subscribe to it.

Bad beliefs, misinformation, and polarization have always been problems. The democratization of information that came with the internet, where people no longer get their facts from the same sources, is a newer problem—or at least the scale is new. Many people were optimistic that the internet would enable users to do their own research and reach sound conclusions, but that sort of epistemic work may not be in our nature, and the internet isn’t designed to stop users from drowning in random people’s opinions.

Another issue is partial truths. During the pandemic, a lot of the fake news about COVID-19 vaccinations was unequivocally false. It was harmful, but it wasn’t terribly widespread.

More impactful were real news stories that were true but ultimately misleading, like one in the Chicago Tribune with the headline “A ‘Healthy’ Doctor Died Two Weeks After Getting a COVID-19 Vaccine; CDC Is Investigating Why.” A doctor did in fact die after getting the vaccine—but probably not because of it. The story went viral, and millions of people saw it and changed their beliefs a little.

Human beings are not always thoughtful, don’t always stress-test their beliefs against plausible counterarguments, and don’t necessarily try to see the other side’s perspective. Fracturing into polarized groups only amplifies this. AI—which can travel around the world just as quickly as a lie (or a half-truth)—can begin to act as an effective counter.

The “BS asymmetry principle” holds that it takes far more effort to debunk a lie than to spread it. AI is a tool for combating that asymmetry because it can say in real time, “Here’s why you’re wrong,” or, “This isn’t trustworthy information.” People don’t have to do the cognitive labor of combating misinformation themselves. This is an optimistic version of the future, one in which AI helps people think more clearly.

There are also pessimistic versions predicting that AI will spread misinformation more quickly than AI can stop it. It’s hard to know what will happen, but think of using AI as a form of epistemic hygiene. If people hear a wacky claim, they can check with ChatGPT and see what it thinks. People don’t have to trust it completely—they can just use it to get another perspective.

*DebunkBot is an artificial intelligence bot that engages in personalized, evidence-based dialogue with users to dismantle conspiratorial beliefs. It was featured on the September 13, 2024, cover of Science.

Conspiracy theories at a glance:

  • DebunkBot engaged 2,000+ conspiracy believers for an average of 8.4 minutes
  • DebunkBot users reduced their conspiratorial beliefs by 20%
  • 1 in 4 DebunkBot users became actively skeptical of their conspiratorial belief

  • 54% of Americans believe Lee Harvey Oswald didn’t act alone in assassinating JFK
  • 73% of Americans believe that conspiracy theories are “out of control”
  • Kids ages 13–17 are more likely than adults to believe online conspiracies