You've probably encountered a fallacy whenever you've heard a bad argument and thought, "Uhm, what does one thing have to do with the other?"
When debaters lack supporting evidence but want to win their argument anyway, they'll resort to diversions and deceits. These are fallacies.
Specifically, deceitful fallacies trick people into accepting an argument despite a lack of evidence. And diversionary fallacies distract people from the argument altogether.
Fallacies are also used accidentally by debaters with a weak grasp of rational thought.
Let's begin by learning how to refute the most egregious diversionary fallacies.
Introducing a point that has little relevance to the argument.
Context: It's okay to introduce a new point if it sheds light on the argument. But it's a fallacy if you introduce a point purely to reduce focus on the argument.
How to refute: "Walk me through every step connecting the significance of your new point to our actual topic. If you can't prove the connection, you're diverting attention from the debate you committed to having — likely because you lack evidence to win."
Example #1: "He talks at a fifth-grade level. We can't take him seriously." Or, "Listen to how shrill and robotic Hillary Clinton is when she talks. It's too hard to listen to her."
Refutation: "How I express my argument has nothing to do with the strength of the argument. So you're purposefully diverting attention and criticizing me because you can't win by criticizing the real argument we’re having. Prove me wrong by presenting all your evidence right now. Go ahead and convince everyone."
We need to pause for a moment to state the obvious: Not everyone is actually interested in arguing. Many people would rather shout their opinion and move on.
So by categorizing the seriousness of an opponent's response, you can identify whether your time is in fact being wasted. If so, either end the debate or help your opponent understand why their response is not constructive.
Here are the classifications for the seriousness of an opponent's response:
(These were identified by writer Paul Graham in this essay.)
You don't have to memorize these. Just recognize that you always want to be debating someone who's presenting a counter-argument to your primary conclusion.
If they're failing to do so, you can tell them: "You're only [describe their type of response]. If you want to have a constructive conversation, you'll need to fully form a counter-argument for my central argument. Again, my central argument is [argument]."
"Small minds discuss people. Average minds discuss events. Great minds discuss ideas."
Let's continue with the deceitful ones used to conceal a lack of evidence.
These are used constantly, so you'll want to develop the skill of identifying them in the heat of a conversation so you can avoid being fooled by others.
Note that each fallacy has a name, and that's for a good reason: Giving names to tricks makes it easier to call them out. It provides critical thinkers with a common language.
Criticizing your opponent despite your opponent's character having no relevance to the argument.
Context: Bad debaters use this fallacy to discredit their opponent and hopefully discredit the opposing argument too. This, of course, is illogical: The soundness of an argument rests on its evidence and logic, not on the character of the person making it.
Example: To shift focus off evidence pertinent to the debate at hand, Hillary Clinton attacks the integrity of Barack Obama's voting record. If the current debate topic isn't Obama's integrity, then her remarks are an intentional deceit.
Note: Not all personal attacks are necessarily fallacies. If the opponent's character is the actual topic of the debate (e.g. Does Obama have the integrity to be president?), then it's logical to cite Obama's poor voting record as evidence of his weak character.
Refutation: "How does your criticism of me have any bearing on the strength of my evidence? If you can't step-by-step draw the connection between the two, you're resorting to bullying because you know you can't win against my real argument."
Misstating your opponent's argument into something easier to attack. Then attacking that misrepresentation instead of the real argument.
Example: If you argue against increasing your country's military spending because you first want to address the cronyism and inefficiencies in the military, your opponent might purposely misstate your argument as "He doesn't want to defend the country!"
If that were your real argument, they would easily win against it. (Of course we need some defense.) So they'll focus on attacking it to look like they're winning. In the process, they hope onlookers don't realize they've skewed the opposing argument.
Refutation: "Show me a direct quote where I say what you’re claiming I did. If you can't, you're putting your words in my mouth to trick everyone else. You’re doing that because you know what you're claiming I said is way easier to attack than my real argument. Let's see if you can attack my real argument. I’ll repeat it: [argument]."
In the video below, news anchor Tucker Carlson uses diversions and deceits to debate magazine writer Lauren Duca. Try your hand at spotting all his fallacies.
(It takes a minute before they start really getting into it.)
What makes Duca's handling of the situation exemplary is how she doesn't self-deprecate when her credibility as a writer is attacked. She doesn't say, "Yeah, I wish those fluff articles about [celebrity's fashion choices] weren't on my resume."
Instead, she reaffirms how you don't have to fit a political pundit stereotype to have opinions worthy of consideration — and that opinions stand for themselves.
If you're ever bombarded with fallacies like this, follow these steps:
Keep doing this every time they use a fallacy until they realize they're ineffective.
Or, just tell them to go to julian.com/argue. That's the shortlink to this guide.
"Avoiding stupidity is easier and more effective than seeking brilliance."
You do not have to read this entire page right now. Skim the fallacies that seem interesting, then continue to the next page. All of these are in the cheatsheet anyway!
Claiming the middle ground between opposing views is inherently a good conclusion.
Example: Slave owners debating anti-slavery activists might have suggested, "A good solution lies somewhere between: We'll only enslave men since they're better at labor."
Context: Your kindergarten teacher was wrong: not everyone's opinion is equally valid. Middle grounds are not good compromises when one view is exceptionally bad.
Refutation: "Here's why your logic doesn't work: If you were pro-slavery and I was anti-slavery, your logic would have us compromise by doing something like only enslaving men. In other words, your position is so bad that any middle ground with it is still bad. We're going to continue this debate until we reach a conclusion supported by logic and sound evidence, not arbitrary compromise."
Stating there are only two reasonable conclusions when in fact more exist.
Context: An opponent will claim there's either the path that is obviously good (e.g. defending ourselves from terrorism) or the path that is bad (e.g. letting terrorists win). Then they'll associate their agenda with the obviously good path, as if the binary form of the argument necessitates a binary yes/no action being taken on their agenda.
Example: To push his tax agenda, Barack Obama paints a false dilemma between either "gutting education" or "stopping corporations from exploiting tax loopholes." Of course, there's another option altogether: the government could just spend far less.
Refutation: "You want us to believe there are only two reasonable paths to consider. But, how about [alternative #1]? Or [alternative #2]? If I can easily keep generating reasonable alternatives to your two paths, you can't keep trying to deceive people into thinking there are only two. You'll need to re-examine the evidence and choose the most logical path, not simply the one you're hoping we choose."
Claiming an opponent's conclusion will lead to a negative chain reaction, and that the conclusion must therefore be disqualified to avoid that speculative risk.
Example #1: "If we allow gay people to marry, next we might legalize polygamy! Then soon we'll allow people to marry their pets!"
Context: Note that your opponent can avoid committing this fallacy if they provide sound evidence — not off-hand predictions — for the likelihood of each consequence.
Refutation: "Your slippery slope logic can be applied to every argument ever made, and is therefore not a good enough response. Here's an example: You support gun control, right? Well why wouldn't allowing people to own assault rifles today eventually lead to allowing them to own rocket launchers in the future? Then maybe missile-equipped drones one day? Then portable nuclear weapons?"
"You might think this is silly, but why would it be? Americans have the constitutional right to bear arms to protect against government tyranny. But the government no longer relies on guns to enforce its laws. They have extremely advanced weaponry that our guns are futile against. We're truly helpless if we don't escalate in proportion."
"Point being, we can't be afraid to take a sound first step today because of the unprovable chain of consequences that might result in the future. If you don't disagree that this first step is sound when isolated from future consequences, then it's the right step to take now. Otherwise, we're living in fear and will never improve our way of life."
If you like this, you're going to love the second recording on the next page too:
[SoundCloud embed: a recording of a debate featuring the fallacies above.]
P.S. Liking this handbook so far? Then you'll love my upcoming ones, which teach you to speak Chinese, play piano, and write fiction. Go here to get emailed when they're out.
Mistaking a shared trend between two things as a relationship between them.
Example: "Over the last decade, Internet Explorer usage has plummeted. During the same period, murders per capita have decreased by a similar amount. Therefore, Internet Explorer has been causing people to kill each other."
Refutation: "Walk me through every step taking you from what you claim is the cause to what you claim is the result. If you research fully, you'll realize there is no proof of a full connection. So when you try to walk me through it, the moment you reach that link in the chain that's bogus, you have to throw out all subsequent links and start over."
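The Internet Explorer example can even be simulated. This is a minimal sketch (in Python, using made-up numbers purely for illustration) showing that any two series sharing a downward trend will correlate strongly, despite having no causal connection between them:

```python
# Two unrelated series that merely share a downward trend.
# The numbers are hypothetical, chosen only to illustrate the point.
ie_usage = [65, 58, 50, 43, 35, 28, 22, 17, 12, 9]                 # % browser share per year
murder_rate = [5.6, 5.4, 5.0, 4.8, 4.6, 4.5, 4.4, 4.3, 4.2, 4.1]  # murders per 100k per year

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    std_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    std_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (std_x * std_y)

r = pearson(ie_usage, murder_rate)
print(f"correlation: {r:.2f}")  # close to 1.0, even though the series are unrelated
```

The refutation above applies exactly here: a high correlation number is not the step-by-step causal chain the claim requires, it's just two trends moving in the same direction.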
Citing popular opinion as evidence.
Examples: If so many people read their horoscopes, how can you say they're bogus? If so many people claim to see ghosts, how can you say ghosts don't exist?
Refutation: "You know that for centuries people thought the world was flat and that slavery was justified, right? The masses' opinion has never reliably reflected the truth. It reflects cultural trends, hysteria, and the state of scientific progress."
"Even when the truth is found (like it has with climate change being man-made and the lack of a link between autism and vaccines), it still takes decades for people to accept it. So 'what everyone is saying' is not sound evidence for your argument. Find proof."
Dismissing evidence on the grounds that you're skeptical of it.
Example: "It's too hard to believe the complexity of human life could be the result of biological evolution."
The above graph was inspired by the work of authors Krogerus & Tschäppeler.
Refutation: "So your argument boils down to, 'Because I have a hard time believing it, it should automatically be assumed false.' Do you realize how illogical that is? You're stating your personal ability to understand something supersedes expert research."
"Here's the problem with that: Many people have a hard time believing many random things: If you explained quantum mechanics to a child, do you think they'd wrap their head around it? No, and their inability to do so has zero relevance to whether it's true."
"So either come up with evidence to disprove the argument or genuinely spend the effort to learn all the work that went into making this argument sound."
"Consider how humans once thought lightning was a punishment from the Gods. The scientific reality — that lightning is the product of intricate atmospheric processes — would sound just as mystical to them then as evolution does to you now. They would have been 100% to-the-grave confident that atmospheric processes as a cause for lightning were simply too wacky and unbelievable to be true."
"So to think scientific understanding won't progress further and spawn major new revelations for modern humanity is to imply you have supernatural foresight into the limits of science. Remember, scientific progress is often exponential, which makes it impossible to predict the volume of new insights we'll discover. So avoid repeating the mistakes of ancient humans, and don't be so certain of yourself when you have nothing but a hunch to go on."
Implying your conclusion must be considered true unless your opponent can find evidence to disprove it.
Example #1: "Unless you show me concrete evidence that ghosts don't exist, we should assume there's some truth to it."
Refutation: "A reminder of how debates work: Each side argues its position with undeniable proof. What you're doing instead is claiming that whatever you're arguing for is true by default unless I can prove you wrong."
"That's an illogical argument structure that can be used to successfully argue anything your mind can make up. For example, 'Unless you show me evidence disproving the prediction that the world ends in 2020, we should assume there's truth to it.' Why? Because an ancient tribe said so? What's the difference between the strength of the evidence you're sharing now and the evidence the tribe had? There is no difference. Neither of you has proof. Both of your claims are just superstitions. And superstitions aren't arguments — they're hysteria. Come up with proof so you can be taken seriously."
Claiming the validity of evidence can be judged purely by its source.
Example #1: "I won't seriously consider that TV report claiming vaccinations don't cause autism — because the mainstream media is generally untrustworthy."
Example #2: "All this 'GMO foods are safe' nonsense has to be taken with a grain of salt. The scientists doing the research are bought by corporate interests."
Refutation: Point out that how or where evidence is sourced is irrelevant to its soundness if it passes the scientific standard. In truth, though, this response won't satisfy most people committing the genetic fallacy, because the fallacy is inherently a dismissal of the merit of the scientific consensus.
So I've written an optional section below demonstrating how to communicate the value of the scientific consensus to skeptics. Unfortunately, even that won't be enough to convince most of them. To reach them on a deeper level — to change how they think — you'll need the bias refutation techniques on the next page.
If you distrust scientists at times, know that scientists also distrust each other. This is a really good thing. It's a community built on skepticism.
In fact, when a scientist makes a misleading claim, they're quickly denounced by their peers and suffer an irrecoverable hit to their reputation. If their misconduct amounts to outright fraud, they can even face prison. Furthermore, counter-evidence setting the record straight is quickly published.
In other words, the consensus (what most scientists say is true) is almost always the best conclusion to adopt given our current knowledge. Because, unlike the public, scientists didn't emotionally groupthink their way to a conclusion.
Instead, thousands of scientists independently worked diligently to confirm that a claim successfully passed the scientific standard. So, even if a few of them are bought by corporations, there are tens of thousands spread across the world at different universities and companies. This is the power of having a large sample size.
And guess what happens when the scientific community is unsure about something?
They'll tell you. They're quick to admit that a consensus has not yet been reached. Because they don't want to be wrong then publicly humiliated. And they don't want to mislead the public or set science back.
But when they do tell you a consensus has been reached, and then you dismiss it, you're dismissing tens of thousands of smart, skeptical, and independent people who carefully assessed whether something was true. They spent years on this.
So your "gut feeling" about something doesn't mean squat when all these experts have been researching diligently. That said, you may very well have a unique insight into a problem. Or you may have data about a little-known, alternative method. But what you don't have is evidence disproving a widely held scientific consensus. Have some humility.
Remember, "consensus" is the key word here. You can have evidence against a fringe or lesser-validated theory. But you're not going to have it against a consensus. Talk show hosts and your favorite bloggers don't have evidence disproving a consensus either. They're a sample size of one, and everyone wants to shout their own opinions.
So if the scientific consensus is overwhelmingly in favor of global warming or the safety of vaccinations (which it is), the few dissenting scientists you hear from aren't being brave champions of truth. They're the outcasts who were proven false but lacked the humility to accept it. And now they're fearmongering in public because of their biases.
We'll further examine the wacky phenomenon of biases on the next page.
If you'd like to learn when it's justifiable to be skeptical, pick up this great book.
"You can lead someone to university, but you can't make them think."
Don't reflexively dismiss someone's argument when they've committed a fallacy.
A fallacy doesn't necessarily render their conclusion unsound. It could still be sound if the fallacy didn't corrupt their central argument or misconstrue the counter-argument.
Recognize that the purpose of a debate is to move both parties closer to the truth. That requires dialogue — not outright dismissal of opposing views.
When you start learning about models (two pages from now), you'll also learn to slow down your thinking process so you can identify when you're relying on fallacies.
First, it's time to learn why people resort to fallacies in the first place: their biases. If we understand biases, we can change people's minds instead of just out-debating them.