How to Change a Conspiracy Theorist's Mind

A roundup of the ways that work — and that don't work — to help a conspiracy theorist free himself from the rabbit hole.

by Brian Dunning

Filed under Conspiracy Theories, Logic & Persuasion

Skeptoid Podcast #891
July 4, 2023

Make no mistake about it: harboring conspiracy theories is harmful to the believer. The louder and prouder a conspiracy theorist is about their beliefs, the greater the harm to themselves. It can cost them friends; it can sow rifts within families; it can harm them professionally; it can cause anxieties and stresses that result in physical harm: hypertension, sleep loss, eating disorders, psychological problems, and a myriad of other conditions associated with chronic stress. And conspiracy theories spread like viruses: they nearly always pass to some of the people the original believer expresses them to. The cost to society of conspiracy ideation is a real one, and at a personal level, it is one that impacts most of us to some degree. We all know someone who is a conspiracy theorist, and we've all pondered how to help them move away from this form of self-harm.

This is, in fact, probably the single most common email question I've received over the many years I've been hosting Skeptoid: my mom, my friend, my coworker, my cousin has gone down the rabbit hole of conspiracy ideation; how can I help them climb out? Many of us have tried, at one time or another, to counter-argue specific points with a conspiracy theorist: to debunk, to show facts in a book, to counter the belief any way we could think of. And how successful have we been? Not very.

It turns out there is a good reason for this. Ever since the sharp worldwide rise in populism during the decade centered on about 2010 — a political garden which is nutritious soil indeed for conspiratorial tendencies — license has been given for people to proudly assert their conspiracy theories writ large. And consequently more and more academic attention has turned to finding cures for this intellectual and cultural cancer. We've studied ways that you might try to directly confront a belief. We've studied ways to prophylactically prevent the tendency from taking root. And, unfortunately, the overall outlook is gloomy. For every one method that sort of works, we find many methods that don't work at all or actually make the problem worse.

The task of countering conspiracy ideation is really one of learning to change our own minds about things we believe in our hearts to be true, because conspiracy ideation cuts nearly equally across all demographics — meaning it includes ourselves. There is no demographic group that is significantly more or less susceptible. We all believe our own group is the one immune to such flaws, but the data shows that we're wrong about that. The truly wise and insightful researcher knows that we all believe things that aren't true, and so should be willing to try anything to correct that.

So without further ado, let's dive into the literature. What follows is a synthesis of some dozen or so academic papers studying what has worked and what hasn't, and we'll take them in reverse order.

Methods that totally don't work

  • Labeling. Calling someone's idea a "conspiracy theory" instead of just an "idea" might seem like it would make people second-guess it, but it doesn't. They figure you're being a dismissive jerk, and they double down harder.

  • Rationality priming. If you open the conversation by priming the person to reflect on their own rationality, for example by asking "Do you consider yourself a rational person?" (a question to which everyone answers yes), they remain just as attached to their conspiracy theory.

  • Empathetic arguments. Calling attention to the dangers of scapegoating or stigmatizing those cast as the villain in the conspiracy — whether it's Jews, the government, the church, the Koch Brothers — tends to have no impact in moving people away from such beliefs.

  • Prevention focus. Training people to be on the lookout for harmful influences, such as conspiracy theories, actually has the reverse effect: it causes them to see more conspiracies.

  • Ridiculing. You might expect that ridiculing conspiracy believers would cause them to double down. In fact, when tested, the effect was positive in reducing belief, but only to such a small degree that it's barely significant.

  • Rational arguments. Similarly, making rational counterarguments to a conspiracy theory has a positive effect, but only a very small one. Probably not worth the effort.

Methods that don't really work

  • Debunking. When you address the conspiracy theory's points one by one and present the proof that each is false, even facts that leave no room for the conspiracy to be true have no meaningful effect. Your points are dismissed as disinformation, which to the believer just serves to prove the conspiracy theory is true. The effect in studies is still positive, but small.

  • Promotion focus. Training people to concentrate on getting ahead and making better life decisions tends to give them a greater sense of control, which in turn makes them less driven to seek a feeling of control in conspiracy theories. Positive effect on reducing conspiratorial tendencies, but small.

  • Analytical priming. This interesting technique gives people a simple task, such as solving a word problem designed to engage analytical thinking, right before exposing them to a conspiracy theory. It does confer some resistance to adopting the conspiracy theory, slightly more than in a control group given no such task. However, this isn't really something you can practically employ in daily life.

  • Priming resistance to persuasion. This involves having people fill out a questionnaire with statements like "I usually do not change what I think after a discussion" to prime them to think of themselves as hard to persuade, and then exposing them to a conspiracy theory. Individuals so primed do indeed show greater resistance to believing the conspiracy theory, but again, it's difficult to put into practical use.

Methods that work best

The methods for fighting conspiracy theory belief with the highest chances of success all require that they be done before the person learns about and adopts the conspiracy theory. Obviously, in many cases, it will be too late. But if we're looking for general practices to employ, teaching these things early is what has the best outcome.

  • Inoculate with facts and logic about a specific conspiracy theory, also known as "prebunking". People who have been taught in advance the true history behind a given conspiracy theory, like the Apollo landings or the Holocaust or the 9/11 investigation, are least likely to believe the conspiracy theory when they later hear it. In addition, teaching how and why the alternate hypothesis is illogical confers strong resistance. Doing either or both of these before the person is exposed to the conspiracy theory gives the best chance of success.

  • Give classes on general pseudoscience. These are the core lessons we teach here at Skeptoid. Carl Sagan's baloney detection kit. The fundamental flaws in the Big Pharma or Big Agriculture conspiracies. The telltale signs of a crackpot. People fluent in these basic techniques become far less likely to buy into conspiracy theories.

All of this leaves us with a rather sobering conclusion, which is — unfortunately — that if someone is already down the rabbit hole, research shows there's very little you can do. We have to teach these skills, and celebrate scientific skepticism and critical thinking, at a young age before the conspiracy poison spreads. However, none of this is absolute. The data only shows trends, and there are exceptions. For my whole career I've gotten emails saying something like "Thank you very much for Skeptoid; I used to believe in all the conspiracy theories, and now you've opened my eyes and I'm a skeptic." Of course that happens. The data tells us it happens very rarely, but it does happen. So it's worth asking: how did it happen?

Well, it could have happened via any of the methods we've discussed that don't work so well. All of them work some of the time; it's just that they have a better chance of getting that person to double down on their beliefs than of correcting them. It is with this knowledge in hand that I always answer the very frequent request I get, and which we talked about at the top of the show: "Hey Brian, my mom/friend/brother/dentist believes in Weird Thing X; do you have an episode I can play them to get them to change their mind?"

We now know that that would be just about the worst possible thing you could do. There's almost no potential for changing their mind; and there's very high potential for alienating them. So when someone asks me to talk to someone I tell them no, and when they ask me what they should say I tell them not to say anything; but when they ask for the Skeptoid episode they can play, I do have a better answer.

Longtime Skeptoid listeners — and I mean very longtime listeners — will recall that I gave the following advice way back in 2010, in episode #187, "Emergency Handbook: What to Do When a Friend Loves Woo". That advice is not to play them an episode that directly challenges the belief in question. Instead, play them an episode about something you already know you both agree on. Or an episode about something that's not an emotional hotspot in today's divided culture. Play them the episode where we evaluate the competing claims for who was first to climb Mt. Everest. Or where we assess the legend that the buried tomb of a Chinese emperor still contains a map of the entire world, its rivers and oceans flowing with liquid mercury. Or the one where we compare known sounds in nature to recordings claimed to be Bigfoot vocalizations. Every one of those episodes — and hundreds more like them — conveys (I hope) the joy of solving a mystery by rationally weighing the evidence. It's the process that's important, not the conclusion. And the process is exactly what the conspiracy theory interventions that actually work are intended to focus on.

If you can instill in your friend an enthusiasm for the process of learning what's real, then you've just helped to create a person more likely to begin questioning their own beliefs that obviously don't conform to the standard of evidence. You can't change their mind on something where it's already made up, but they can.

So that's always been my answer — and as we see it's essentially the same as the interventions most likely to succeed. The thing I'm adding is that if you teach the process after they've already gone all-in on a conspiracy theory, avoid that topic. Pick any other, and teach the same process.



Cite this article:
Dunning, B. "How to Change a Conspiracy Theorist's Mind." Skeptoid Podcast. Skeptoid Media, 4 Jul 2023. Web. 1 May 2024. <https://skeptoid.com/episodes/4891>

 

References & Further Reading

Bonetto, E., Troïan, J., Varet, F., Monaco, G.L., Girandola, F. "Priming Resistance to Persuasion decreases adherence to Conspiracy Theories." Social Influence. 9 May 2018, 10.1080/15534510.2018.1471415: 1-12.

Jolley, D., Douglas, K. "Prevention is better than cure: Addressing anti-vaccine conspiracy theories." Journal of Applied Social Psychology. 14 Mar. 2017, 2017;00: 1-11.

O'Mahony, C., Brassil, M., Murphy, G., Linehan, C. "The efficacy of interventions in reducing belief in conspiracy theories: A systematic review." PLOS One. 5 Apr. 2023, 10.1371/journal.pone.0280902: 1-18.

Orosz, G., Krekó, P., Paskuj, B., Tóth-Király, I., Böthe, B., Roland-Lévy, C. "Changing conspiracy beliefs through rationality and ridiculing." Frontiers in Psychology. 13 Oct. 2016, 10.3389/fpsyg.2016.01525: 1-9.

Stojanov, A., Bering, J., Halberstadt, J. "Does Perceived Lack of Control Lead to Conspiracy Theory Beliefs? Findings from an online MTurk sample." PLOS One. 17 Aug. 2020, 10.1371/journal.pone.0237771: 1-18.

Swami, V., Voracek, M., Stieger, S., Tran, U., Furnham, A. "Analytic thinking reduces belief in conspiracy theories." Cognition. 8 Aug. 2014, 10.1016/j.cognition.2014.08.006: 572-585.

Whitson, J., Kim, J., Wang, C., Menon, T., Webster, B. "Regulatory Focus and Conspiratorial Perceptions: The Importance of Personal Control." Personality and Social Psychology Bulletin. 7 Apr. 2018, 10.1177/0146167218775070: 1-13.

 
