Approaching a Subject Skeptically
One of the questions I get asked a lot is how I go about approaching a new subject. When you hear about something new, what's the best way to think about it? What's the best way to determine whether it's science or pseudoscience? Well, I'm not sure that there is a "best" way, and I don't think there's one methodology that everyone can follow that's going to work in every circumstance, but I'll try to give the best answer I can. It's probably not the same answer you'd hear from others, but this is what works for me.
First of all, and perhaps most important: there's a separation between my daily life and working on Skeptoid. I don't walk around demanding peer-reviewed scientific evidence for everything that I see. I don't have a crazed, obsessive drive to know the validity of every new product for sale at the mall. I'd never get through my day without a certain amount of tolerance for pseudoscience. Fad products, marketing campaigns, greenwashing, and even straight-up fraudulent claims surround us, all day, every day. I accept that. Trying to be a full-time challenger of pseudoscience would not only be hopelessly quixotic, it would also annoy everyone around me, and rob me of the freedom to enjoy my day.
So I let virtually everything slide. A coworker is wearing a magic bracelet? Great, good for him. Neighbor talks about her great visit to the reflexologist? Bully for her. Overhear some people discussing what Nostradamus said about the 2012 apocalypse? Whatever floats their boat.
But what if I'm out with friends and somebody asks me my thoughts on something? This happens all the time. You're on the spot, you don't have access to research materials, you don't have time to look into it. Now, oftentimes I've already done an episode about the subject in question, or something really similar, and that gives me a pretty good foundation. Sometimes I haven't, and, like most people, I have to rely on a journeyman's knowledge of a subject area that's outside of my core competence. This provides a pretty good overview of whether or not the new claim is in line with what's generally known about the subject. Usually it's not; otherwise it wouldn't be on the news or wherever it was that my friends heard about it.
So there you are. You're given something that sets off your skeptical radar, it's outside your core competence, and your friends just saw it on television or the Internet. Despite the fact that most people say they take TV or Internet reports with a grain of salt, few actually do. There's something deeply compelling about hearing a claim from an authoritative source; we all have a voice in the back of our heads that wants the new claim to be true, and this desire gets confirmed by the belief that the story wouldn't have made it all the way to the TV news without having been pretty well substantiated. What are you going to do?
The first thing I'd do is take out my phone and track down the original source of the story, using keywords from the report to search Google. I'd want to know if it was reported in any journals, or if it skipped this process and went straight to the mass media. This is the simplest and fastest way to see if a new claim or phenomenon has come from the world of legitimate research, or if it comes from a crank, charlatan, or manufacturer operating outside of science. You always have to remember that the mass media doesn't care; they're interested in the sensationalism of the story, not in its validity.
That's it. That's probably all I'm going to do when I'm out in the world and get a question that's worthy of looking into. It's not a perfect process, but nine times out of ten this will correctly tell you whether there's something there, or whether it's just more noise from media clamoring for eyeball share.
It's only when I take my seat in the Skeptoid office that I assume the mantle of proper separator of fact and fiction. This is when I take each week's topic and give it my honest best effort at a good skeptical treatment. The best topics are those that are popularly misunderstood, but with facts behind them that, when properly understood, are way cooler than the popular version. This isn't as hard as it might sound; nearly every popular myth has some history that puts its genesis into a fascinating new perspective.
Sometimes finding this perspective takes me back in time, to an out-of-print book, or to a newspaper article a century old. Tracking these down requires a lot of eBook purchases, Google Books downloads, newspaper archive searches, and occasionally even the coveted trip to a real library to find a real book. Of course, even the relevant pages from the real book end up as electronic files on my computer, photographed with the iPhone and then OCR-converted to searchable text. Getting brand new information, like current research, is almost exactly the same process; it's all available when you have the right accounts to access online research libraries. But none of that compares to the few chances to actually go in person to a place where something strange is said to have happened: to smell the dusty desert wind across Death Valley's Racetrack Playa, to touch the cold granite of the Georgia Guidestones, and to photograph a superior mirage such as the ones responsible for so many legendary ghost lights.
I've been doing this show every week for five years now, and on the one hand, you might assume that I've developed a certain aptitude for smelling rats, and have pretty good radar for science vs. pseudoscience. That's true to a degree; but at the same time, I've learned that I can easily be surprised. I often learn that something that sounded pretty hokey is actually true, and something I took for granted turns out to be false. So rather than having developed a supersense for fact and fiction, I've actually picked up a more acute awareness of my own ignorance. Kind of the opposite of what one might hope for; but as we see so often, magically easy solutions to complex problems are fool's gold.
The process is different every time, but it always starts with a quick survey of the most popular sources, followed by delving deeper into the roots. If it's homeopathy, I want to know what led Samuel Hahnemann to his original conclusions. If it's a conspiracy theory, I want to know who came up with it and what question they were trying to answer. If it's a ghost story, I want to know who first wrote about it and what their relationship was to the hauntee. It's critical to allow for the possibility that the story may or may not be as reported, and to follow up the leads in both directions. Frequently this requires some pretty detailed departures from the popularly known core of the story.
For example, say you find a reference to the mayor of an old town. First you find out if the town actually exists, where it was, and whether it's still there, and find it on Google Earth to see if it makes sense within the context of the story. Then find out if the person listed as the mayor actually was the mayor. Find out when he was born, and see if the timing is right. There are myriad details you can drill down through, to be as thorough as possible in validating the story. Sometimes there are an endless number of these leads, and with only a week between episodes, I often have to simply stop following them, thus making many episodes necessarily incomplete and open to error.
But when you have the time, how far do you go tracking these leads? I've found that there's never a point of diminishing returns. Every time I've made a discovery or connection that (to my knowledge) no other researcher has found, it's always in one of these fine tails of data. The unturned stones are rarely in the middle of the road most traveled. They're in the obscure newspaper article that never got syndicated; they're in the out-of-print interview with the expert who was misquoted in the popular version of the story; and more than anywhere else, they're in the actual published research that was omitted from the mass media reports because it did not support a sensational revisioning of the story.
I don't mean to sound cynical about the mass media. There are many, many excellent reporters and news bureaus who conscientiously produce exceptional material. But I think you'll find that the better they are, the more likely they are to give you an honest assessment of the industry's overall goal, which is to be profitable. The easiest way to do this, as practiced by a probable majority of editors, is to be sensational. I don't think it's a cynical assessment, and it has certainly proven itself to me time and time again through my work validating mass media reports.
So take the road most traveled, as presented in Wikipedia, to get the lay of the land. But to truly learn anything new, you must explore those obscure details that nobody else had time for, or that they overlooked.
Interestingly, I'd say that my process — though it's much more thorough — is probably no more accurate than the quick trick in the restaurant with a smartphone and Google. The more information I collect, the more possibility for error. The more obscure the threads I follow, the more likely they are to be unreliable. And the more time I spend trying to be thorough on one part of the story, the less time I have for the other parts: an unfortunate exigency of producing a weekly show. I'd say that errors of omission are my most common mistakes, followed by errors that I just didn't catch because of limited time. And like every fallible biological entity, I also make errors by misinterpreting, misreading, and failing to see beyond my own personal biases.
You'll make these same errors in your own research. The best defense against them is to acknowledge your blind spots, compensate for them, and honestly qualify remarks that you can't be sure of. First I try to be right more often than I'm wrong, but second I try to emphasize the process over the conclusions. Being right nine times doesn't guarantee that you'll be right the tenth time, but trying hard all ten times guarantees that you'll at least be as right as your process is capable of.
©2022 Skeptoid Media, Inc. All Rights Reserved.