A Magical Journey through the Land of Reasoning Errors

Four common types of analytical errors in reasoning that we all need to beware of.  

by Brian Dunning

Filed under Logic & Persuasion

Skeptoid Podcast #297
February 14, 2012



Today we're going to cover a bit of new ground in the basics of critical thinking and critical reasoning. There are several defined types of common analytical errors to which we're all prone; some, perhaps, more so than others. Reasoning errors can be made accidentally, and some can even be made deliberately as a way to influence the acceptance of ideas. We're going to take a close look at the Type I false positive error, the Type II false negative error, the Type III error of answering the wrong question, and finally the dreaded Type IV error of asking the wrong question.

By way of example we'll apply these errors to three hypothetical situations, all of which should be familiar to fans of scientific skepticism:

  1. From the realm of the paranormal, a house is reported to be haunted. The null hypothesis is that there is no ghost, until we find evidence that there is.
  2. From conspiracy theory, the claim that the government is building prison camps in which to dispose of millions of law-abiding citizens. The null hypothesis is that there are no such camps, until we find evidence of them.
  3. And from alternative medicine, the claim that vitamins can cure cancer. The null hypothesis is that they don't, until it can be proven otherwise through controlled testing.

So let's begin with:

Type I Error: False Positive

A false positive is believing something that isn't true, or more formally, the rejection of a true null hypothesis: it turns out there's nothing there, but you conclude that there is. In cases where the null hypothesis does turn out to be true, a Type I error incorrectly rejects it in favor of a conclusion that the new claim is true. A Type I error occurs when the conclusion that's made is faulty, whether based on bad evidence, misinterpreted evidence, an error in analysis, or any number of other factors.

In the haunted house, Type I errors are those that occur when the house is not, in fact, haunted; but the investigators erroneously find that it is. They may record an unexplained sound and wrongly consider that to be proof of a ghost, or they may collect eyewitness anecdotes and wrongly consider them to be evidence, or they may have a strange feeling and wrongly reject all other possible causes for it.

The conspiracy theorist commits a Type I error when the government is not, in fact, building prison camps to exterminate citizens, but he comes across something that makes him reject that null hypothesis and conclude that it's happening after all. Perhaps he sees unmarked cars parked outside a fenced lot that has no other apparent purpose, and wrongly considers that to be unambiguous proof, or perhaps he watches enough YouTube videos and decides that so many other conspiracy theorists can't be all wrong. Perhaps he simply hates the government, so he automatically accepts any suggestion of their evildoing.

Finally, the alternative medicine hopeful commits a Type I error when he concludes that vitamins successfully treat a cancer that they actually don't. Perhaps he hears enough anecdotes or testimonials, perhaps he is mistrustful of medical science and erroneously concludes that alternative medicine must therefore work, or whatever his thought process is; but an honest conclusion that the null hypothesis has been proven false is a classic Type I error.

Type II Error: False Negative

Cynics are those who are most often guilty of the Type II error, the acceptance of the null hypothesis when it turns out to actually be false: it turns out that something is there, but you conclude that there isn't. If you actually do have psychic powers but I am satisfied that you do not, I commit a Type II error. The villagers in the tale of the boy who cried "Wolf!" commit a Type II error when they ignore his warning, thinking it false, and lose their sheep to the wolf. The protohuman who hears a rustling in the grass and assumes it's just the wind commits a Type II error when the panther springs out and eats him.

Perhaps somewhere there is a house that actually is haunted, and maybe the TV ghost hunters find it. If I laugh at their silly program and dismiss the ghost, I commit a Type II error. If it were to transpire that the government actually is implementing plans to exterminate millions of citizens in prison camps, then everyone who has not been particularly concerned about this (myself included) has made a Type II error. The invalid dismissal of vitamin megadosing would also be a Type II error if it turned out to indeed cure cancer, or whatever the hypothesis was.

Type I and II errors are not limited to whether we believe in some pseudoscience; they're even more applicable in daily life, in business decisions and research. If I have a bunch of Skeptoid T-shirts printed to sell at a conference, I make a Type I error by assuming that people are going to buy, and it turns out that nobody does. The salesman makes a Type II error when he decides that no customers are likely to buy today, so he goes home early, when in fact it turns out that one guy had his checkbook in hand.

Both Type I and II errors can be subtle and complex, but in practice, the Type I error can be thought of as excess idealism, accepting too many new ideas; and the Type II error as excess cynicism, rejecting too many new ideas.
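The tradeoff between the two error types can be made concrete with a small simulation. This is only an illustrative sketch, not anything from the episode: a hypothetical "ghost detector" reads a noisy signal and reports a haunting whenever the reading crosses a threshold. Lowering the threshold makes the detector more credulous, trading Type II errors for Type I errors.

```python
import random

random.seed(42)  # fixed seed so the simulated rates are repeatable

def detector(signal_present, noise=1.0, threshold=1.5):
    """A noisy detector: reads the true signal (2.0 if present, else 0.0)
    plus Gaussian noise, and reports 'positive' above the threshold."""
    reading = (2.0 if signal_present else 0.0) + random.gauss(0, noise)
    return reading > threshold

trials = 10_000

# Type I rate: how often we report a ghost in a house with no ghost.
false_positives = sum(detector(False) for _ in range(trials)) / trials

# Type II rate: how often we miss a ghost that is genuinely there.
false_negatives = sum(not detector(True) for _ in range(trials)) / trials

print(f"Type I  (false positive) rate: {false_positives:.3f}")
print(f"Type II (false negative) rate: {false_negatives:.3f}")

# A laxer threshold (excess idealism) raises the Type I rate:
eager = sum(detector(False, threshold=0.5) for _ in range(trials)) / trials
print(f"Type I rate with a lax threshold: {eager:.3f}")
```

The same structure underlies any real detection problem: you cannot drive both error rates to zero with a noisy instrument, you can only choose which kind of mistake you'd rather make more often.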

Before talking about Type III and IV errors, it should be noted that these are not universally accepted. Types I and II have been standard for nearly a century, but various people have extended the series in various directions since then; so there is no real convention for what Types III and IV are. However the definitions I'm going to give are probably the most common, and they work very well for the purpose of skeptical analysis.

Type III Error: Answering the Wrong Question

Types III and IV are a little more complicated, but they're just as common and just as important to understand. A Type III error is when you answer the wrong question; and this usually comes about when you base some assumption upon a faulty or unproven premise, and so you jump one step ahead and solve a problem that isn't yet the question at hand.

The ghost hunters in the haunted house make a Type III error when they start with the assumption that a ghost makes a cold spot in the room, and so they walk around the haunted house with all sorts of fancy thermometers and collect detailed temperature readings throughout the building. This is fine work: they've documented everything very nicely and reported the temperatures correctly. However, it is a Type III error, because the question of temperature has not yet been shown to be relevant, since it has never been established that ghosts affect temperature.

The conspiracy theorists commit a Type III error when they publish a detailed list of all the locations they've identified as government prison camps. The question is not yet "Where are these camps?" because they skipped over convincingly answering the precedent question of "Do such camps exist at all?" You can produce lists all day long, but until you first prove that each item on the list is actually what you claim it is, the list is of no value.

The vitamin salesman commits a Type III error every time he answers a customer's question about what vitamin is best to take to treat or prevent cancer. He'll no doubt recommend a particular supplement, and perhaps a dosage. This is a Type III error because he's ignoring and skipping the precedent question, which is whether the vitamin in question will treat or prevent the particular cancer in question at all.

Type IV Error: Asking the Wrong Question

While the Type III error is usually committed innocently and with good intentions, the Type IV error — asking the wrong question — often suggests a deliberate deception. By selecting the wrong question to investigate, it's possible to have greater control over the results. Selecting the wrong question is a great way of diverting attention away from the right question.

The producers of ghost hunting TV shows know that they need to produce a program that yields positive results. They also know that they're not going to happen to run into any ghosts or catch anything unexpected on camera. So instead, they frame their program around asking the wrong questions: Can we get interesting readings on our electrical and temperature meters? By structuring their show around the wrong questions, they commit a deliberate Type IV error in order to produce the desired answers.

Conspiracy theorists of all flavors love the Type IV error, as it is one of the most effective tools to build arguments in support of nonexistent phenomena. If the conspiracy theorist wants to convince us that the government is building prison camps to enslave American citizens, it's not necessary to actually ask that question. Instead, ask a whole assortment of related questions that are guaranteed to have positive answers. Are there examples of government corruption? Has the government imprisoned people in the past? Are there laws that permit the government broader powers during times of emergencies? Are there plots of land for which there is no obvious purpose? These questions are all great Type IV errors for the conspiracy theorist.

Similarly, alternative medicine proponents can ask Type IV error questions to suggest that their central claims, which are unevidenced, are actually true. Are there examples of corruption in Big Pharma? Do any natural compounds have therapeutic value? Do scientists rely on grant money? Is medical science big business? Again, these questions are easily answered positively and appear to justify the use of vitamins to treat cancer; when in fact, none of them have any direct relevance to that.

And so there we have it. Four types of reasoning errors, four cases you've heard a thousand times, and will hear a thousand more times. Listen to a few sales pitches, watch a few documentaries on the pseudoscience TV channels, and see if you can spot them. Chances are you will. And if you can develop enough familiarity with them to spot them when you hear them, you have a leg up on avoiding making these same errors yourself. We all do it, and the better we understand the errors, the better prepared we are to minimize our own such failings.









