A Magical Journey through the Land of Logical Fallacies - Part 2
The second part of our exploration of logical fallacies.
by Brian Dunning
November 13, 2007
Today we're going to continue with the second part of our exploration through the jungle of logical fallacies, so sharpen up your machete and follow me as we hack and chop our way through all of this mess. If you haven't heard last week's episode, we covered many of the most common logical fallacies and other argumentative devices that are commonly used by proponents of something that can't otherwise be supported by evidence, like many pseudosciences and conspiracy theories. Hopefully, familiarity with these devices will help you to identify them in conversation. And, when you point them out, you often strip your opponent of the tools on which he depends the most. If you're going to have a debate, stick with valid arguments. Don't get caught out by fallacies.
We finished up last week with post hoc rationalizations and slippery slope arguments. And we will now continue with:
The Excluded Middle
The excluded middle assumes that only one of two ridiculous extremes is possible, when in fact a much more moderate middle-of-the-road result is more likely and desirable. An example of an excluded middle would be an argument that either every possible creation story should be taught in schools, or none of them. These two possibilities sound frightening, and may persuade people to choose the lesser of two evils and allow religious creation stories to be taught alongside science. In fact, the much more reasonable excluded middle, which is to teach science in science classes and religion in religion classes, is not offered.
The excluded middle is formally known as a false dilemma or false dichotomy. A closely related device is reductio ad absurdum, reduction to the absurd, in which a premise is pushed to an extreme conclusion. Bertrand Russell famously illustrated how an absurd premise can be fallaciously used to support an argument:
Starling says: "Given that 1 = 0, prove that you are the Pope."
Bombo replies: "Add 1 to both sides of the equation: then we have 2 = 1. The set containing just me and the Pope has 2 members. But 2 = 1, so it has only 1 member; therefore, I am the Pope."
Just keep in mind that if your opponent is presuming extremes that are absurd, he is excluding the less absurd middle. Don't fall for it.
Statistics of Small Numbers
You really have to take a statistics class to understand statistics, and I think the part that would surprise most people is the stuff about sample sizes. Given a population of a certain size, how many people do you have to survey before your results are meaningful? I took half of a statistics class once and learned just enough to realize that practically every online poll you see on the web, or survey you hear on the news or read about in the newspaper, is mathematically worthless.
But it extends much deeper than surveys. Drawing conclusions from data sets that are too small to be meaningful is common in pseudoscience. Listen to Bombo make a couple of bad conclusions from invalid sample sizes:
"I just threw double sixes. These dice are hot."
"My neighbor's a Mormon and he drinks wine, so I guess most Mormons don't really follow the no-alcohol tradition."
"I went to a chiropractor and I feel better, so chiropractic does work after all."
Weasel Words
Giving a controversial concept like creationism a new, more palatable name like Intelligent Design is what's called the use of weasel words. Calling 9/11 conspiracies "9/11 Truth" is weasel wording: clearly the movement has nothing to do with truth, yet they give it a name that claims that's what it's all about.
Weasel words are a favorite of politicians. Witness the names of government programs that mean essentially the opposite of what they're named: the Patriot Act, No Child Left Behind, Affirmative Action. By the way certain programs are named, it sounds like it would virtually be criminal to disagree with them.
Weasel words can also refer to sneaky wording in a sentence, like "It has been determined", or "It is obvious that", suggesting that some claim has support without actually indicating anything about the nature of such support.
Fallacy of the Consequent
Drawing invalid subset relationships in the wrong direction is called the fallacy of the consequent. Cancers are all considered diseases, but not all diseases are cancers. Stating that if you have a disease it must be cancer is a fallacy of the consequent.
Listen to how Bombo blames Starling's failure to heal upon his failure to take one particular treatment, without regard for whether that treatment is a valid one for Starling's particular condition:
Starling: "I am dying of bubonic plague."
Bombo: "You did not drink enough wheatgrass juice."
Even assuming that wheatgrass juice were a suitable treatment for anything, it would still not be a suitable treatment for everything, so Bombo's claim that Starling's illness is a consequence of his failure to drink wheatgrass juice is a fallacy of the consequent.
Loaded Question
A loaded question is also known as the fallacy of multiple questions rolled into one, or plurium interrogationum. If I want to force you to answer one question in a certain way, I can roll that question up with another that offers you two choices, both of which require my desired answer to the first question. For example:
"Is this the first time you've killed anyone?"
"Have you always doubted the truth of the Bible?"
"Is it nice to never have to hassle with taking a shower?"
Any answer given forces you to give me the answer I was looking for: That you have killed someone, that you doubt the truth of the Bible, or that you don't shower or bathe. Loaded questions should not be tolerated and certainly should never be answered.
Red Herring
A red herring is a diversion inserted into an argument to distract attention away from the real point. Supposedly, dragging a smelly herring across the track of a hunted fox would save him from the dogs by diverting their attention away from the real quarry. Red herrings are a favorite device of those who argue conspiracy theories:
Starling: "Man landed on the moon in 1969."
Bombo: "But don't you think it's strange that Werner von Braun went rock hunting in Antarctica only a few years before?"
Starling: "9/11 was perpetrated by Islamic terrorists."
Bombo: "But don't you think it's strange that Dick Cheney had business contacts in the middle east?"
Red herrings are fallacious because they do not address the point under discussion, they merely distract from it; but in doing so, they give the impression that the true cause lies elsewhere. The wrongful use of red herrings as a substitute for evidence is rampant, absolutely rampant, in conspiracy theory arguments.
Proof by Verbosity
The practice of burying you with so much information and misinformation that you cannot possibly respond to it all is called proof by verbosity, or argumentum verbosium. To win a debate, I need not have any support for my position if I can simply throw so many things at you that you can't respond to all of them.
This is the favorite device of conspiracy theorists. The sheer volume of random tidbits they throw out gives the impression that their position has been thoroughly researched and is supported by many pillars of evidence. Any given tidbit is probably a red herring, but since there are so many of them, it would be hopeless (and fruitless) to respond intelligently to each and every one. Thus the argument appears impregnable. It may not be possible to construct a cogent argument using proof by verbosity, but it is very easy to construct an irrefutable one.
Poisoning the Well
When you preface your comments by casually slipping in a derogatory adjective about your opponent or his position, you're doing what's called poisoning the well. A familiar example is the way Intelligent Design advocates poison the well by referring to evolution as Darwinism, as if it's about devotion to one particular researcher. Or:
"And now, let's hear the same old arguments about why we should believe UFOs come from outer space."
"Celebrity television psychic Sylvia Browne tells us in her new book."
If you listen to this podcast, you know that I poison the well all the time. It's one of my favorite devices. But I do it obviously, for the entertainment value, and not as a serious attempt at argument.
Bandwagon Fallacy
Also known as argumentum ad populum (appeal to the masses) or argument by consensus, the bandwagon fallacy states that if everyone else is doing it, so should you: if most people believe something or act a certain way, it must be correct.
"Everyone knows that O.J. Simpson was guilty; so he should be in jail."
"Over 700 scientists have signed Dissent from Darwin, so you should reconsider your belief in evolution."
The bandwagon fallacy can also be used in reverse: If very few people believe something, then it can't be true.
Starling: "Firefly was a really cool show."
Bombo: "Are you kidding? Almost nobody watched it."
Consider how many supernatural beliefs are firmly held by a majority of the world's population, and the lameness of the bandwagon fallacy comes into pretty sharp focus. The majority might sometimes be right, but they're hardly reliable.
That concludes our look at logical fallacies. There are certainly many others, but these are the big ones and then some, and most of the others are just subcategories of some of these. Learn these fallacies, and become handy with them. You'll find that you can easily recognize them in almost every argument someone makes, and then you're well equipped to stop them in their tracks, and require them to instead make a non-fallacious argument. Doing so strips away the bulk of the meat from the arguments of most people who advocate things that aren't evidence-based, and places you handily in a commanding position.
Cite this article:
Dunning, B. "A Magical Journey through the Land of Logical Fallacies - Part 2." Skeptoid Podcast. Skeptoid Media, 13 Nov 2007. Web. 26 Aug 2016. <http://skeptoid.com/episodes/4074>
©2016 Skeptoid Media, Inc. All Rights Reserved.