POV: Why Fake Conspiracies Persist, Regardless of the Evidence against Them
Psychology offers some clues to how they arise and thrive
In the 1980s, Satan was everywhere. Satan worshippers ran daycares and forced children to do heinous acts. They flew through the air and killed babies in ritual sacrifices. Most of these fantastical claims came from children, eager to please the adults asking leading questions. Investigations were conducted in multiple cities across the United States and Canada, and although no physical evidence of the crimes was found, people were convicted and sent to prison.
What came to be called the Satanic Panic offers a useful parallel to the conspiracies we see today. QAnon adherents claim that cannibalistic Satanists are abusing and murdering thousands of children a day. A presidential election was “stolen” by tens of millions of fraudulent votes. The entire medical community is lying and injecting “harmful” vaccines into children’s arms. And like the Satanic Panic, contemporary conspiracies persist regardless of the evidence against them. How is this possible? Aren’t humans generally reasonable?
Psychology offers some clues to how impossible conspiracies arise and thrive. But first, I’ll state the obvious. Conspiracies are social phenomena and occur in a historical context. The problem would be easier to address if the conspiracy-minded were limited to people with particular traits or tendencies. Some evidence suggests that people who perform better on standard reasoning tasks (reflective versus intuitive thinking) are better at identifying fake news headlines. However, individual differences cannot explain why 70 percent of Republicans believe that the 2020 election was “stolen.” A broader context is needed to understand the scale of these beliefs.
A key historical change from the 1980s is a broader distrust of institutions. The Satanic Panic died out because the courts found no evidence of widespread occult crimes and the media eventually stopped covering the rumors. People trusted the courts and the media enough to change their beliefs. But today, virtually all institutions are eyed with suspicion. And to be fair, some of this distrust is warranted, because real conspiracies have been uncovered. Government efforts to disenfranchise Black American voters have existed since the start of Reconstruction. The medical establishment lied to, and failed to treat, Black syphilis patients for 40 years in a notorious “study” that ended only in 1972. Doctors ignored the very real disease ravaging the gay community during the HIV/AIDS crisis in the 1980s. More recently, the Catholic Church covered up child sexual abuse on an incredible scale. It is not surprising that Americans are suspicious of institutions.
The current media ecology represents another major historical shift. Now anyone has the potential to start a viral rumor or pass on false information to a wide social network. And because conspiracies depend on social transmission, social media provides the oxygen that conspiracies need to thrive.
Although distrust of institutions and the ability to spread falsehoods widely provide the necessary conditions for modern conspiracies, they do not explain why people are willing to believe implausible or impossible scenarios. For that, we need to consider two psychological processes that make seemingly rational people believe irrational things.
The first is that we are surprisingly susceptible to misinformation. Beliefs and memories can be created or changed by leading questions and social interactions after events. In one classic demonstration, undergraduates who initially did not believe in demonic possession found it to be more plausible after reading three short articles describing possession as real. When interviewed by a researcher who suggested that they may have witnessed demonic possession in childhood, 18 percent of the undergrads agreed that they probably had. Repeated exposure to fake news about something impossible makes us more willing to believe it.
Even when misinformation is corrected, it can be difficult to remove completely. In studies designed to debunk misinformation about the flu vaccine, participants were initially able to differentiate facts from myths. But just a few hours later, they misremembered many of the myths as facts. In other words, simply reading about falsehoods, even while noting that they are false, can make misinformation seem more true.
The second process occurs once a group of people has adopted a set of beliefs based on misinformation. Frighteningly, implausible beliefs can persist even in the face of undeniable facts. A classic study in social psychology provides a relevant example. In 1954, a cult called the Seekers predicted that the world would end with a giant flood on December 21, and that the cult members would be rescued by a UFO. Three researchers, Leon Festinger, Henry Riecken, and Stanley Schachter, joined the group in order to observe how they responded when the prophecy did not come true. As hours passed after the predicted time, the group was stunned; they had sacrificed so much for this vision of the world. But rather than abandon their beliefs, they soon reinterpreted the failed prophecy: the aliens had rewarded the group’s devotion by saving the world, and the members now needed to share this news with everyone.
This case study, documented in the 1956 book When Prophecy Fails, written by Festinger, Riecken, and Schachter, became a famous example of cognitive dissonance, Festinger’s major theory. In rough form, the theory holds that psychological discomfort occurs when existing knowledge or beliefs are challenged by clear, contradictory evidence. Similar reactions to disconfirming evidence have been found when researchers create doubt about particular religious beliefs and views on animal testing. And we are currently witnessing a natural experiment with QAnon believers who were certain that Joe Biden would never be inaugurated. Now that he has been, some have abandoned this conspiracy, but a core of true believers will accept whatever reinterpretation “Q” proposes.
Given what we know about the psychology of beliefs and a social ecosystem of rampant misinformation, it seems unlikely that conspiracies will go away anytime soon. And for many of us, this issue is personal—family members or friends believe things that seem unbelievable. But there are signs of hope. The psychologist Adam Grant recently described his attempts to convince a dear friend to vaccinate his children. Appeals to logic and evidence did not work, but new research suggests a more patient approach: listen and genuinely try to understand what your friend believes and search for common ground.
Psychology may never find a way to change the minds of a group of believers. Instead, we may need to do the harder work of talking to one human at a time.
“POV” is an opinion page that provides timely commentaries from students, faculty, and staff on a variety of issues: on-campus, local, state, national, or international. Anyone interested in submitting a piece, which should be about 700 words long, should contact John O’Rourke at firstname.lastname@example.org. BU Today reserves the right to reject or edit submissions. The views expressed are solely those of the author and are not intended to represent the views of Boston University.