How to Change a Conspiracy Theorist's Mind

Skeptoid Podcast #891
by Brian Dunning

Make no mistake about it: harboring conspiracy theories is harmful to the believer. The louder and prouder a conspiracy theorist is about their beliefs, the greater the harm to themselves. It can cost them friends; it can sow rifts within families; it can harm them professionally; it can cause anxieties and stresses that result in physical harm like hypertension, sleep loss, eating disorders, psychological problems, and a myriad of other disorders associated with chronic stress. And conspiracy theories spread like viruses: they nearly always pass along to some of the people the original believer expresses them to. The cost to society of conspiracy ideation is a real one, and at a personal level, it is one that impacts most of us to some degree.

We all know someone who is a conspiracy theorist, and we've all pondered the question of how to help them move away from this form of self-harm. This is, in fact, probably the single most common email question I've received over the many years I've been hosting Skeptoid: my mom, my friend, my coworker, my cousin has gone down the rabbit hole of conspiracy ideation; how can I help them move away from this? Many of us have tried, at one time or another, to counter-argue specific points with a conspiracy theorist: to debunk, to show facts in a book, to counter the belief any way we could think of. And how successful have we been? Not very. It turns out there is a good reason for this.

Ever since the sharp worldwide rise in populism during the decade centered on about 2010 (a political garden whose soil is nutritious indeed for conspiratorial tendencies), license has been given for people to proudly assert their conspiracy theories writ large. And consequently, more and more academic attention has turned to finding cures for this intellectual and cultural cancer. We've studied ways that you might try to directly confront a belief. We've studied ways to prophylactically prevent the tendency from taking root. And, unfortunately, the overall outlook is gloomy. For every one method that sort of works, we find many methods that don't work at all or actually make the problem worse.

The task of countering conspiracy ideation is one of learning to change our own minds about things we believe in our hearts to be true, because conspiracy ideation cuts nearly equally across all demographics, which means it includes ourselves. There is no demographic group that is significantly more or less susceptible. We all believe our own group is the one immune to such flaws, but the data show that we're wrong about that. The truly wise and insightful researcher knows that we all believe things that aren't true, so we should be willing to try anything to correct that.

So without further ado, let's dive into the literature. What follows is a synthesis of some dozen or so academic papers studying what has worked and what hasn't, and we'll take them in reverse order.

Methods that totally don't work
Methods that don't really work
Methods that work best

The methods for fighting conspiracy theory belief with the highest chances of success all require that they be done before the person learns about and adopts the conspiracy theory. Obviously, in many cases, it will be too late. But if we're looking for general practices to employ, teaching these things early is what has the best outcome.
All of this leaves us with a rather sobering conclusion, which, unfortunately, is that if someone is already down the rabbit hole, research shows there's very little you can do. We have to teach these skills, and celebrate scientific skepticism and critical thinking, at a young age, before the conspiracy poison spreads.

However, none of this is absolute. This data only shows the trends, and there are exceptions. For my whole career I've gotten emails saying something like "Thank you very much for Skeptoid, I used to believe in all the conspiracy theories, and now you've opened my eyes and I'm a skeptic." Of course that happens. The data tells us it happens very rarely, but it does happen. So it's worth asking: how did it happen? Well, it could have happened via any of the methods we've discussed that don't work so well. All of them work some of the time; it's just that they have a better chance of getting that person to double down on their beliefs than of correcting them.

It is with this knowledge in hand that I always answer the very frequent request I get, and which we talked about at the top of the show: "Hey Brian, my mom/friend/brother/dentist believes in Weird Thing X; do you have an episode I can play them to get them to change their mind?" We now know that that would be just about the worst possible thing you could do. There's almost no potential for changing their mind, and there's very high potential for alienating them. So when someone asks me to talk to someone, I tell them no; and when they ask me what they should say, I tell them not to say anything; but when they ask for the Skeptoid episode they can play, I do have a better answer.

Longtime Skeptoid listeners (and I mean very longtime listeners) will recall that I gave the following advice way back in 2010, in episode #187, "Emergency Handbook: What to Do When a Friend Loves Woo". That advice is to not play them an episode that directly challenges the belief in question. Instead, play them an episode about something you already know you both agree on, or an episode about something that's not an emotional hotspot in today's divided culture. Play them the episode where we evaluate the competing claims for who was the first to climb Mt. Everest. Or where we assess the legend that the buried tomb of a Chinese emperor still contains a map of the entire world with its rivers and oceans flowing with liquid mercury. Or the one where we listen to known sounds in nature compared to recordings claimed to be Bigfoot vocalizations.

Every one of those episodes, and hundreds more like them, conveys (I hope) the joy of solving a mystery by rationally weighing the evidence. It's the process that's important, not the conclusion. The process is what those conspiracy theory interventions that actually work are intended to focus on. If you can instill in your friend an enthusiasm for the process of learning what's real, then you've just helped to create a person more likely to begin questioning their own beliefs that obviously don't conform to the standard of evidence. You can't change their mind on something where it's already made up, but they can. So that's always been my answer, and as we see, it's essentially the same as the interventions most likely to succeed. The thing I'm adding is that if you teach the process after they've already gone all-in on a conspiracy theory, avoid that topic. Pick any other, and teach the same process.