Argumentative Magic Tricks: Critically Evaluating Fallacies
1. There are a lot of people out there who try to manipulate you. These "hidden persuaders" want to "engineer" your consent (Packard, 1957/2007, p. 200). As one of the early inventors of modern propaganda, Edward L. Bernays (1928/2005) explained, "We are governed, our minds molded, our tastes formed, our ideas suggested, largely by men we have never heard of" (p. 37). These hidden persuaders want to trick you into doing something that you probably don't want to do. These hucksters know that "our irrational minds, flooded with cultural biases rooted in tradition, upbringing, and a whole lot of other subconscious factors, assert a powerful but hidden influence over the choices we make" (Lindstrom, 2010, p. 18).
2. Thus, we need to be on guard not only against liars and manipulators, but also against the weaknesses of our own minds. Sure, everyone lies at some point in their lives (Ariely, 2012), but many people are paid to lie for a living: politicians, advertisers, public relations spokespeople, and some news reporters. These people are paid to "phish for phools," as Nobel Prize-winning economists George A. Akerlof and Robert J. Shiller (2015) have explained (p. xi). We are susceptible to their manipulation for many reasons. We are ignorant of basic facts. We are ignorant of our brain's flaws. We are enveloped by our subjectivity and culture. And we don't understand the common tricks of liars and propagandists. Thus, we often "allow" ourselves "to be manipulated" (Sachs, 2011, p. 133; Packard, 1957/2007, p. 240). This chapter will give you the basic tools to unmask the tactics of hucksters and con artists who "try to invade the privacy of our minds" by preying upon our ignorance and our psychological weaknesses (Packard, 1957/2007, p. 240).
3. The oldest and most notorious manipulators are politicians (Packard, 1957/2007, p. 171). Relying on ancient traditions of power and authority, politicians use "magical words," leading people to believe anything that can be spun into a convincing story ("Deeds," 2012, p. 33). The political scientist John J. Mearsheimer (2011) identified seven types of common lies political leaders tell to manipulate their own people. He explained that lying is "a useful instrument of statecraft in a dangerous world," and sometimes leaders believe in "noble lies," which are falsehoods used in the name of a good cause (p. 12). News reporters can be another type of propagandist. Partisan media, like Fox News, routinely lie and spin the news in order to manipulate viewers to think and vote in certain ways. Modern news organizations have developed "technologies of mass persuasion" (Sachs, 2011, p. 137), which most often utilize fallacies to alter the truth in order to "distort our most important decision making processes" (p. 142).
4. Advertisers are also in the business of lying. They peddle all sorts of tricks and falsehoods to manipulate consumers into buying expensive products they don't need (Lindstrom, 2011). In his popular handbook on copywriting, Robert W. Bly (2005) explained how copywriters use "false logic" to effectively "manipulate" consumers (pp. 71-74). Advertisers manufacture consumer wants and needs just as effectively as businesses manufacture products. Marketers also pay scientists to not only research the flaws of the human brain, but also to develop specialized techniques to take advantage of these flaws in order to manipulate us more effectively (Lindstrom, 2011; Packard, 1957/2007, p. 31). These corporate scientists know that the human brain is malleable and can be easily influenced by tantalizing stimuli that often affect us unconsciously. They also know that we are very vulnerable to addictive substances and habits. Did you know that many popular foods, like Doritos chips, were specifically designed by scientists to make them irresistibly delicious and addictive (Moss, 2013)? Corporate scientists have spent decades studying "the whys of our behavior, so that they can more effectively manipulate our habits and choices" (Packard, 1957/2007, p. 32). Even experts like myself can fall victim to lies and manipulation, as the letter below illustrates.
5. Public relations is another profession built on lies (Ewen, 1996; Tye, 1998). PR representatives use language to spin the truth by either questioning the validity of facts to manufacture doubt, or by inventing fiction and masquerading it as fact. The fictional protagonist in the movie Thank You for Smoking is an exceptional PR man who explains to his son, "If you're paid to be right, then you're never wrong" (Reitman, 2005). The first professional PR man was Edward L. Bernays who wrote several influential books, including Crystallizing Public Opinion (1923) and Propaganda (1928), which explained how PR is the "conscious and intelligent manipulation of the organized habits and opinions of the masses" (Bernays, 1928/2005, p. 37).
6. Bernays (1923/2011) used PR to "sell" the image or brand of a company, not just its products (p. 71). But Bernays wasn't interested in just selling a company, he actually wanted to re-engineer public opinions and behavior. His biographer explained, "Hired to sell a product or service, he instead sold whole new ways of behaving, which appeared obscure but over time reaped huge rewards for his clients and redefined the very texture of American life" (Tye, 1998, p. 52). Bathing at least once a day with soap, the healthy "toasted" delight of cigarettes, and even America's favorite breakfast of bacon and eggs all originated from the mind of Edward L. Bernays (Tye, 1998). One Supreme Court justice called Bernays one of the most insidious "professional poisoners of the public mind" (Tye, 1998, p. 63).
7. You probably don't know that politicians and businessmen have been applying scientific research on human thinking and behavior for over a hundred years. Why? To better manipulate you. In 1895 the French social psychologist Gustave Le Bon published La Psychologie des Foules (The Psychology of the Crowd). This book instructed conservative politicians on how to manipulate and control their citizens, who were then pushing for more political and social democracy. Le Bon's work influenced many American writers, as seen in Walter Lippmann's Public Opinion (1922) and Edward Bernays' Crystallizing Public Opinion (1923) and Propaganda (1928). Both Lippmann and Bernays used their knowledge of social psychology to help the U.S. government and American corporations develop various forms of propaganda, which was used to manipulate public opinion and control the behavior of American citizens.
8. Most people were unaware of these propaganda efforts until journalist Vance Packard (1957/2007) published his exposé in 1957 on the public opinion industry, The Hidden Persuaders. Now there are many writers trying to uncover the lies and manipulations of politicians and businesses, including the muckraking journalism of Thomas Frank (2000) and the exposés of neuro-marketer Martin Lindstrom (2010; 2011). Interestingly, Lindstrom turned against his neuro-marketing profession and now warns consumers about the tactics he once used to help businesses manipulate consumers, in such works as Buyology and Brandwashed. Lindstrom (2011) has even demonstrated how companies have gone so far as to use human psychology to literally addict consumers to certain brands and products (p. 61).
9. Lies are often easy to detect if you are well informed about public issues and human psychology. But professional liars know that the majority of Americans are largely ignorant about such matters. Yet, even if you are highly educated about the facts, there are still tricks that professional liars can use to manipulate us. These tricks often affect us unconsciously, so they are much harder to guard against than simple lies. These tricks are called fallacies, which some scientists call "psychological weapons" (van der Linden, 2018).
10. A fallacy is a logical sleight-of-hand. It is an argumentative magic trick, which presents a claim as true without any logical reasoning or evidence. In fact, many fallacies are designed to bypass our critical thinking skills in order to exploit our automatic thinking biases: fast System 1 thinking overpowering our slow System 2 thinking (Kahneman, 2011, p. 28; see Ch 10, para. 5-9). Fallacies manipulate us. They lead us to draw conclusions that are usually false. Fallacies look like part of an argument, but instead of evidence and reasons to back up a claim, there is a trick. Drink our brand of soda because it is refreshing! But what does "refreshing" even mean? How do you know it is refreshing? And what about all the other refreshing drinks? Fallacies are often implicit arguments designed to subliminally affect our judgment without any argument at all. Why do you think images of naked or half-naked women are used so much in advertising (Lindstrom, 2010, p. 177; Packard, 1957/2007, p. 95)? You don't even need an ad campaign or a slogan. Just put a naked woman next to your product and many men will buy it because their hormones override their brains.
11. In order to help you guard against fallacies, this chapter will explain many of the most common. I have broken up these fallacies into four categories. The first is Errors of Reasoning. This type of fallacy has two causes. The first is accidental. As humans, we have many cognitive biases that distort both our perception and our reasoning, some of which were discussed in chapter 7 (See picture below). People think illogically much of the time, and they accidentally reach erroneous conclusions based on bad data and/or bad thinking. But with this knowledge, an unethical person can use these biases against the unwary. Tricksters can manipulate the brain of the ignorant and engineer a false conclusion.
12. The other categories are all intentional tricks. The second kind of fallacy is Evading the Issue. This is an attempt to ignore your argument (because you know you cannot prove it), and instead change the subject to a topic you can win. The third category is Attacking the Opponent. Like the second category, this category seeks to avoid addressing an argument, but it redirects the audience's attention with a specific trick: Attacking the personal character of the opponent. Finally, the last category is Appealing to the Audience. This category of fallacy seeks to affect the audience psychologically so as to manipulate the audience's ignorance or emotions.
13. But beware, these are only a few of the most common fallacies. There are many, many fallacies out there and they are often used in novel combinations that make them very hard to detect. Hopefully after understanding some of the most commonly used fallacies you will be able to better defend yourself against the snares of professional liars and cheats.
A. Errors of Reasoning
14. A Hasty Generalization usually follows most of the rules of good argumentation. This fallacious argument has a claim, evidence, and a conclusion; the problem is that it does not have enough evidence, or the right kind of evidence, to prove a claim true or false. Instead of doing the hard work of finding the best evidence, a person using this fallacy just jumps to a premature conclusion, which, of course, is not justified. Sometimes a hasty generalization can be made with no evidence whatsoever, but that is rare.
15. One of the most common hasty generalizations is the stereotype, an idealized category that supposedly captures the essential characteristics of a whole group of people or things. But the stereotype is usually based on only a few examples, often from personal experience and cultural common sense; in other words, it is a category based on a highly limited amount of evidence and is, therefore, not valid. Often stereotypes have a grain of truth about them; however, it is important to remember that even partially true stereotypes "will cover only some of the truth for part of the time" (Lippmann, 1922/1997, p. 97).
16. Stereotypes usually lead to two other types of fallacies, hasty generalizations that are mirror images of each other: the Ecological Fallacy and the Fallacy of Composition. The ecological fallacy claims a stereotypical essence for a group, and then claims that all members of this group must possess these essential characteristics. To illustrate, all Asians are bad drivers; therefore, because you are an Asian, you must be a bad driver. Soldiers are patriots; therefore, if you are a soldier, you must be a patriot. People who smoke are more likely to die of cancer; therefore, because you smoke, you will die of cancer.
17. The fallacy of composition works in reverse. An individual with particular characteristics is used to make a stereotype of a whole group to which that individual supposedly belongs. The terrorists of 9-11 were Muslim; therefore, all Muslims are terrorists. Those protesters attacked the police; therefore, all protesters are violent. That atheist behaved immorally; therefore, all atheists are immoral. In each case, a whole group of people is labeled based on the actions or characteristics of only a single person.
18. False Cause is another error of reasoning that follows most of the rules of argumentation. This error is usually caused by the ignorance of the arguer, but it can be a deliberate attempt to exploit the ignorance of the audience. A false cause fallacy usually mistakes a correlation or temporal sequence as evidence for a cause. Just because one variable is often linked with another variable, it does not logically follow that one causes the other. Likewise, just because one event often follows another event, it does not logically follow that the first event causes the second. For example, rain and wind often come together, but it would be false to claim that rain causes wind, or vice versa. Losing at least one game comes before winning a championship, but it would be false to claim that losing a game causes winning a championship. In some cases, a false cause fallacy is simply a false cause. Adolf Hitler claimed that the cause of Germany's decline in the 1920s was due to a conspiracy of the Jews. It was not.
19. One specific and often used type of false cause is the Appeal to History. History is not an exact science. Due to the scarcity and subjective nature of most historical documents, it is incredibly difficult, if not impossible, to prove exactly what causes important historical events. Many times an arguer will claim that "history shows" or "history proves" and then use this fallacious appeal to history to supposedly "prove" another claim true. Unless there are copious amounts of historical detail and references to reputable historical studies, be highly skeptical of any historical claim. Most people are completely ignorant of history and invent their own imagined version of the past. Another variant of this type of false cause is the Appeal to Nostalgia. Often arguers will present an exaggerated, idealized, or simply false version of the past and claim it as a factual representation of what really happened. In the United States, one of the most common appeals to nostalgia is the idealized "small town America," which is a heavenly place of good people united together by a single set of traditions, values, and practices. Of course, such a place has never existed and never will.
20. A False Analogy is an error of interpretation caused by the ignorance of the arguer or a deliberate attempt to manipulate the audience. (An analogy is a metaphor in which two people, objects, or events are described as similar: if we understand the meaning of event A, then the analogy helps us understand the meaning of event B.) An analogy is an attempt to make a fact meaningful, usually based on a system of values, so that an audience knows what to do about a situation or problem. For example, the U.S. invasion of Iraq was going badly in 2004 and 2005, so many media commentators made the analogy that Iraq was either like World War II (a just war of liberation that needed more time to succeed) or like Vietnam (an unjust war that failed and got worse the longer the armed forces remained). In both cases, the analogy was supposed to help the American public understand the war and act accordingly. However, the war in Iraq was not really like World War II or Vietnam, so these analogies misled the American public.
21. Begging the Question, also called Circular Reasoning, attempts to prove a main claim by simply restating the main claim or by stating another claim that is related, hence the description of circular. This type of fallacy is usually a result of ignorance rather than an attempt to manipulate an audience because the arguer believes the claim being made is common sense; therefore, they believe there is no need to prove it. For example, after the events of 9-11 in the United States, it was common to hear media reports condemn "terrorists" as evil because terrorism is evil. Such reporting demonstrates circular reasoning since these claims all rest on a faulty stereotype that is not defined or proven. This series of claims does not answer anything; it just raises questions, hence the phrase "begging the question," which refers to an ambiguous or argumentative statement that raises a lot of questions needed to clarify or prove it. What are (and are not) terrorists? Why are their actions horrific? Is everything a terrorist does terrorism, and is all terrorism always horrific? What makes terrorists evil? What is evil? Another example is that free trade is good because it allows an unrestricted flow of buying and selling. This simply restates the claim as "proof," but does not actually explain why free trade is good. This trick works because our brain is naturally programmed to believe claims it hears over and over again.
22. As with the previous fallacy, Appeals to Authority are based on ignorance and common sense. One of the oldest traditional forms of reasoning is the "do what you're told" command by an authority figure. Most parents practice this type of reasoning with their children, and it is highly popular with businessmen, politicians, and religious leaders. Instead of explaining why something is true and providing evidence, one simply states that a claim is true because an authority figure said so. The Nobel Prize winning physicist Albert Einstein warned that "foolish faith in authority is the worst enemy of truth" (Isaacson, 2007, p. 22).
23. The only element that has changed in the traditional appeal to authority is the source of authority that people accept as common sense. Political and religious leaders were displaced over the 20th century by scientific and business leaders, and later in the 20th century, there was a decisive swing toward the authority of youth. Celebrities are also seen as authority figures, even though many celebrities have no expertise or special skill. Such celebrities are famous merely for being famous, as political activist Jerry Rubin pointed out: "People respect famous people – they are automatically interested in what I have to say. Nobody knows exactly what I have done, but they know I'm famous" (as cited in Jacoby, 2009, p. 173).
24. One of the earliest and most effective advertising campaigns used an authority figure to endorse a product. Edward L. Bernays was one of the first PR men to employ this tactic. For example, Bernays advised cigarette companies on how to ignore the health risks of smoking by having doctors endorse their products and claim that certain brands were actually healthy (Tye, 1998, ch 2). Bernays also invented the deceptive spin of "toasted" cigarettes, which supposedly were "kind to your throat" because they were "free from harsh irritants" (Tye, 1998, p. 45). The use of the authority of doctors and the red herring of "toasted" were deliberate lies that deceived audiences by appealing to their ignorance. Once lawmakers established that cigarettes did in fact pose health risks, they enacted the Federal Cigarette Labeling and Advertising Act in 1965, which banned advertisements claiming health advantages for cigarettes. Clever advertisers kept the authority figure of doctors and, in a deliberate red herring ploy (a fallacy explained later in this chapter), changed the claim from a cigarette being healthy to it being "less irritating" and "It's toasted."
25. Besides the traditional common sense appeal of authority, we also need to understand that authority figures have psychological power over us. Recent research on the human brain has shown that "people will actually stop thinking for themselves when a person they perceive as an expert offers them advice or direction" (Lindstrom, 2011, p. 177). When we get advice from someone we trust as an expert, our brain basically stops thinking and intuitively accepts as true everything that expert or authority figure tells us. We intuitively want to believe them. Thus, we have to learn how to distance ourselves from professional advice so that we can critically analyze what these experts say and make sure it is actually true and good advice. This is not an easy task; we have to fight against our brain's natural inclination to defer. Even when authority is justified, as with a doctor, lawyer, or professor, the authority figure must still prove his or her arguments with valid evidence instead of trying to manipulate people on the basis of authority alone. Remember, even experts make mistakes. As finance professor Burton G. Malkiel (2012) stated, "We should not take for granted the reliability and accuracy of any judge, no matter how expert" (p. 168).
26. As a side note, it’s important to recognize that experts are not the only source of authority in our world. Celebrity works in the exact same way. Most people intuitively trust and believe celebrities, as if they were experts or authority figures, which they usually aren’t (Lindstrom, 2011, p. 165). What qualifies a celebrity to know about clothing design, cars, perfume, or any other product? Nothing. Celebrities are just being paid a lot of money to make false claims about a product so you will go out and buy, buy, buy.
27. An Appeal to Probability is an error of reasoning which falsely claims that just because something could possibly happen, it must or will happen. This type of reasoning simplifies and exaggerates a cause and effect sequence. Usually an appeal to probability leads to a Slippery Slope fallacy. Like taking one step over the edge of a mountain and quickly falling down the slope, this fallacy simplifies a series of events by stating that if a person takes one step in that direction, then these future events must happen, leading to a horrible end.
28. For example, many anti-drug and anti-smoking campaigns use this fallacy. Smoking can lead to various forms of cancer, and cancer can lead to an early death, so these campaigns often just jump to the conclusion and say smoking will kill you, usually implying it will do so right away. As you know, not all smokers get cancer, and not all people with cancer die an early death. Further, many people who never smoke get cancer, and some people who never smoke and never get cancer still die an early death. Jumping to a conclusion because it fits a theory, stereotype, or agenda is always a fallacy. Such tactics are meant to manipulate people into believing a false claim.
29. The appeal to probability often leads to two other fallacies: the Fallacy of the Single Cause and the False Dilemma. Cause and effect sequences are always highly complex with multiple variables as causes producing a wide range of effects. Often an ignorant or unscrupulous arguer will oversimplify the equation and claim that one cause will necessarily lead to one effect. Drugs = Death. War = Patriotism. Capitalism = Freedom. Atheism = Immorality. Wealth = Happiness. Whenever you see a highly simple equation like this, it is always a fallacy. Life is never so simple. This type of simple equation also creates a false dilemma fallacy. To take one of the examples above: If you are an atheist, then you must be immoral, so if you want to be moral, you must believe in God. Of course, this simple line of reasoning begs more questions: Are all believers in God moral? What makes God moral, and how do we define morality? Whose God? Are all gods moral? Are any gods or their followers immoral? Are any atheists moral?
30. Overly simplified arguments that involve false dilemmas usually employ the fallacy of Reification, which I like to call the fallacy of bullshit. Often people use generalized words without fully understanding what these words mean. All general ideas have very limited meaning because objective reality is defined by details. No two trees, people, or places are exactly the same. People use general words in order to reduce the complex diversity of the world into universal categories. Bullshit categories include: religion, society, capitalism, terrorism, or faith. It can also include any general noun, such as dog, tree, bird, or person. These words are too general and vague; therefore, they are almost meaningless.
31. Take the word dog, for example. While we all instinctively think of a furry creature with four legs and a tail, does this general idea fit any specific dog? If I said, "I have a dog," would you know what my dog looks like or how it behaves? You wouldn't have a clue. You would only have a vague, generalized idea in your head, which doesn't correspond to any real dog. Or take the word society. Society is a large collection of diverse individuals, organizations, and institutions, many of which are in open conflict with each other. So what could statements invoking society actually mean? "Our society believes." "Our society must act." "Our society prohibits." When we use such language, we are often referring to general categories, which are an important type of knowledge, but when you are talking about categories, you need to tie them down to specific examples; otherwise, it's nonsensical bullshit that is utterly meaningless.
32. The Naturalist Fallacy is a common error of thinking that is usually the result of ignorance. This fallacy confuses what realistically does exist with what ethically ought to exist – it confuses an is for an ought. Violence is part of nature, and humans are just animals; therefore, it is right to behave violently. War has been a major tool to solve international conflict; therefore, war is just. Everyone cheats and cheating leads to winning; therefore, cheating is the right thing to do. Just because people do something does not automatically make it the right thing to do. This form of sloppy reasoning is routinely used to justify what is called the status quo, what people are already doing, because it would be too hard to imagine something different.
33. The last error of reasoning is the Non Sequitur Fallacy, which is also a common error due to ignorance. This Latin phrase means "it does not follow" – i.e. the conclusion is not logically connected to the original claim of the argument. This fallacy entails any claim or evidence that is not logically relevant to the original claim of the argument. Ignorant people often don’t understand true causes so they explain events either haphazardly or according to cultural common sense. Ask an unintelligent person why the president did or did not do something and they will often come up with some wild conspiracy theory. Why did an earthquake happen? Some might say aliens caused it, or negative energy forces, or the gods, or bad luck. Often, but not always, a non sequitur represents sincere confusion in the mind of an ignorant person who just doesn’t know any better, but sometimes this fallacy can be used deliberately to evade the argument at hand.
B. Evading the Issue
34. Another category is composed of tricks used to evade an argument by changing the subject. This highly effective tool relies on your opponent's and/or audience's ignorance. These fallacies generally cannot work on intelligent people because they are not errors of reasoning; they are tricks designed solely to deceive the foolish. You need to make sure that every supporting claim is logically connected to the thesis and that all evidence (if there is any) is logically connected to each supporting claim. Sometimes you will run into a dishonest arguer who will throw curve-balls that have nothing to do with the claims at hand. In such a case, bring attention to these diversions, dismiss them as irrelevant, and move on.
35. The first trick of evasion is the Red Herring fallacy. It is like the non sequitur, in that both illogically connect unrelated claims or evidence, but the Red Herring is used deliberately to manipulate an audience. The Red Herring leads away from the main claim (which can't be proven) towards another claim or claims (which often can be proven). Supposedly, this fallacy is named after a cultural practice of using smelly fish to divert scent hounds from the pursuit of a criminal suspect. One reason to use a red herring is to avoid talking about a relevant claim that you cannot actually prove true or false. By mentioning, while avoiding, the claim with a red herring, it makes it seem as if you have talked about the issue, when, in fact, you have not.
36. Another reason to use this fallacy is to divert attention away from a relevant claim you cannot prove true or false. Instead of facing a claim you can't prove, you quickly turn toward an unrelated claim you can prove, making it appear as if you won the argument – except it's not the same argument anymore. For example, in the debate over health care reform in the U.S., a common tactic of opponents of this reform is to re-frame the debate away from the topic of health care and towards the different, more general topic of personal freedom. In arguing for personal freedom and winning that argument, it may seem as if the argument for health care had been lost, when, in fact, the shift was just a red herring diverting attention away from the original argument.
37. Another type of evasion is Shifting the Burden of Proof onto your opponent. The rules of open argument and scientific research are clear: If you make a claim, then you need to prove your claim with evidence and reasoning. The burden of proof is on the person who makes the claim to prove the claim. The burden is not on critics to prove that the claim is not true. So, if your audience is ignorant of these basic rules of argumentation, an unscrupulous debater may declare a claim true without any evidence and challenge an opponent to prove it wrong. If the opponent cannot supply the evidence in question, then it seems as if the claim is true – but it is not, because no evidence has actually proven it.
38. For example, in the classic debate about the existence of God, exposing this fallacy is one of the strongest moves atheists can make. Besides the stories found in holy books, nobody has ever provided direct empirical evidence of the existence of any god, gods, angels, demons, or fairies. Some individuals have claimed to have seen or experienced these beings, but the problem is that no two subjective accounts are alike, which raises questions about the validity of these claims. The same goes for claims about Bigfoot, aliens, elves, trolls, and other mythical creatures. People who make a claim must definitively prove it with sufficient evidence from the objective world. If the arguer cannot, then the claim should not be taken seriously, and the doubter of such a claim does not have to prove anything.
39. A final trick of evasion is the Argument Ad Nauseam fallacy. Basically, this trick claims that the argument has been repeated too much to argue anymore; thus, the person using this tactic simply declares the matter settled, true or false, when, in fact, no such consensus has been reached. For example, a religious believer might claim that for thousands of years people have been trying to disprove the existence of a supreme deity, what many call God, and they haven't succeeded yet because most people still believe in God. This person might claim that we should just move on and accept that God actually exists. Well, the first part of this statement begs some questions: Has anyone actually disproved the existence of God? Can't people still believe even if the belief has in fact been proven wrong? Also, this claim shifts the burden of proof onto the doubter rather than the believer. Shouldn't the believer have to prove that God exists, rather than making the unbeliever prove that God does not exist?
C. Attacking the Opponent
40. The next category is a different set of tricks designed to evade an argument by changing the subject. But instead of an issue-based red herring, the arguer shifts attention to the personal character or words of the opponent. If you can personally discredit your opponent, then you don't actually have to deal with the argument being discussed. Predictably, this fallacy works best on the ignorant because they are easily distracted. But because all people hold certain ethical values dear, even intelligent people can find it difficult to avoid being sidetracked by a juicy attack on someone's character, especially an attack alleging dishonesty or a lack of integrity.
41. One of the most common attacks is Poisoning the Well, although it must be noted that this popular fallacy can also be used in reverse. Poisoning the well involves disparaging remarks designed to frame a topic, claim, piece of evidence, or even the opponent in a negative light. Our brains have a natural framing bias that operates without our conscious control. When we hear positive or negative words, they unconsciously shape how we think about the topic being discussed (Kahneman, 2011, p. 88). So if you can smear your opponent's argument or character from the start, then the audience will be predisposed to view anything your opponent says within a negative, distrusting frame of reference. Likewise, this tactic can work in reverse with praise, giving the audience a more favorable disposition. Even if your negative or positive claims are false, they still have an unconscious psychological effect on the audience.
42. When someone attacks the personal character of an opponent, it is called an Ad Hominem Attack. This is one of the oldest tricks in the book. It does not matter whether the attack is true or not because the character of a person has no logical bearing on the truth or falsity of a claim. Only empirical evidence can prove or disprove a claim. Even if a person is a habitual liar, it does not logically follow that everything the individual says is automatically a lie. Even a liar can sometimes tell the truth. Thus, an ad hominem attack, true or not, is always a deceptive attempt to draw attention away from the argument at hand in order to confuse and manipulate the audience. Unscrupulous arguers use this tactic because it works. People frequently get swayed by these types of attacks, even intelligent audiences. We can't help ourselves. Hence, political campaigns in democratic countries tend to get dirty with ad hominem attacks (often called "slinging mud") the closer it gets to election time because this form of appeal really works with undecided audiences. Ad hominem attacks can also be used as a type of red herring, which The Economist calls "whataboutism" ("Muddying the Waters," 2017, p. 30). When a speaker is personally attacked, rather than address the claim, the speaker deflects attention away from his or her own failings (real or imagined) and redirects blame to others. Whataboutism often goes: Ignore my guilt..."what about" her guilt?
43. Another common attack is called the Straw Man fallacy. Unlike the last fallacy, this attack is focused not on the character of an opponent but on his or her argument. In an open argument, there is a procedure for criticizing someone's claims. First, summarize and explain those claims to show the audience that you have examined the claims, evidence, and conclusions with an open mind and fully understand them. Only then do you begin to criticize the argument. A straw man fallacy takes a shortcut. Instead of honestly summarizing the whole argument, the arguer cherry-picks certain points, usually takes them out of context, and then alters their meaning.
44. Instead of fairly criticizing the whole argument exactly as it was stated, the arguer, intentionally or not, uses the straw man to misrepresent an opponent's argument so that it looks weaker than it is and can be attacked and refuted more easily. The unscrupulous arguer might take short quotes out of a larger context to misrepresent the claim. Or the arguer will overgeneralize a specific argument and then attack the misrepresentation for being too general. Sometimes a critic is simply ignorant and does not understand the argument at hand, so the critic summarizes and criticizes his or her own misunderstanding rather than the original argument. In rare cases, an arguer will simply lie and attribute to an opponent a claim that was never actually made.
45. The Argument from Silence is our last popular and dishonest evasion tactic. Basically, the arguer takes the silence of an opponent as "evidence" of being right, which, of course, is nonsense. Silence is not evidence of anything. Usually, this tactic will not work in a face-to-face situation because an opponent can always say something to refute a claim. But in print or in a lecture, your opponent appears only in the way you present him or her, so you can always have the last word. It is common for a sneaky arguer to start with a straw man, criticize that misrepresentation, and then rhetorically claim that the opponent could not possibly refute this criticism, thereby leading to a strong (but illusory) conclusion.
D. Appealing to the Audience
46. The last type of fallacy includes tricks designed to appeal to the psychology of the audience in order to exploit its mental weaknesses. On the one hand, this type of fallacy is an evasion tactic. Instead of trying to prove a claim, the arguer manipulates the minds of the audience to make it seem as if the claim has been proven. But this type of fallacy is also an error of reasoning because it relies on a traditional form of common sense logic. Basically, these fallacies presuppose that if a group of people all agree that something is true, then it must be true. But, of course, this thinking is illogical. A group of people could all agree that the world is flat (as many Europeans did for hundreds of years), but this group consensus is in no way proof of anything about the objective world. It is simply an agreement based on subjective opinions.
47. The most basic form of this fallacy is the Appeal to the People or Bandwagon Appeal. This tactic can occur at any point in an argument: at the beginning, in the middle, or at the end. Often a smooth arguer will use it in all three places. Now, all audiences like to be spoken to directly, they like to know that their experiences and opinions matter, and they like to feel they play a part in the argument itself. All arguers need to speak to and include an audience, but there are honest and dishonest ways to do so, and the line between the two is not always clear. Crossing the line into dishonesty involves trying to manipulate the audience into believing a claim for which no evidence has been offered. For example, one might say, "All of us [insert stereotype: Americans, good people, guys, soccer fans] know this issue is wrong, except my opponent." This statement creates an "us" vs. "them" frame, putting the arguer and the audience on the "right" side and poisoning the well against the isolated opponent on the "wrong" side. The bandwagon appeal relies on this oversimplified either/or framing to hide the fact that no evidence has been presented to prove any claim right or wrong. Clearly, this tactic just manipulates the audience.
48. Politicians and marketers know that people are social animals. We form flocks or herds, just like cows, sheep, or birds. Everyone likes to be part of a group, and it is very uncomfortable for most people to stand out or stand apart from a group (Lindstrom, 2011, p. 107; Thaler & Sunstein, 2008, ch 3). Because of this social and psychological instinct, people are very susceptible to peer pressure. As marketer Martin Lindstrom (2011) has explained, "We instinctively look to the behavior of others to inform the decisions we make" (p. 108). Peer pressure is one of the easiest ways to manipulate people into believing a claim. If you can convince your audience that everyone else already believes, then peer pressure and the psychological need to conform will do all the work, eliminating the need to prove the claim with evidence.
49. Most appeals to the people involve an Appeal to Ignorance because an intelligent person who knows the rules of argument generally is not susceptible to such tricks. Unscrupulous arguers know that the majority of people in most countries have never been to college, don't know the rules of argumentation, have a limited amount of information about the objective world, and live their lives based on common sense and tradition. To a certain extent, we are all "confident idiots" (Dunning, 2014). An audience like this is easy to manipulate if you know the right buttons to push. Hence, arguers can really say anything at all, no matter how outrageously false or ridiculous, as long as they don't get caught! Of course, getting caught is the major limitation of this highly effective strategy. No audience likes to feel duped. So, the arguer has to be careful to keep bullshit or lies plausible; otherwise, this tactical advantage can turn into a liability. An audience can turn against you if they figure out they are being manipulated.
50. A stronger and safer tactic is the Appeal to Emotions, to which all people are susceptible, smart and ignorant alike. As with the bandwagon appeal, this tactic is not necessarily cheating until it replaces evidence and crosses the line into manipulation, and that barrier between good argument and manipulation can be hard to discern. In the first systematic treatise on rhetoric, Aristotle explained how important it was for an arguer to address the emotions of the audience because emotions are an important part of being human. Modern cognitive scientists now understand that emotion plays a significant role in our "intuitive judgments and choices" (Kahneman, 2011, p. 12). Because emotions are hard-wired into our brains, we are naturally susceptible to "emotional contagion," which simply means we are predisposed to be sympathetic to other people and their emotional states of mind (Bloom, 2004, p. 116). We feel the same way as those around us feel. If they are sad, we feel sad. If they are happy, we become happy. Psychologist Daniel Kahneman (2011) has shown how our "emotional attitude...drives [our] beliefs" and our behavior (p. 103). Marketers believe that consumers base around 80 percent of their buying decisions on emotion (Lindstrom, 2011, p. 100). One of the most powerful marketing tools is the "brand," an image engineered with personality and feeling, which appeals to us solely on an emotional level (Packard, 1957/2007, p. 65; Lindstrom, 2010, p. 27).
51. Emotional thinking is part of the automatic, intuitive part of our brain, the part Kahneman called system 1 thinking. We are not fully conscious of how emotions affect our reasoning, which means we cannot fully control our emotions, and so we are vulnerable to manipulation. Strong emotions, like fear or sadness, can easily take over our thinking processes and lead to dangerous decisions. Fear is one of the most effective ways to trick people into accepting a claim, especially if you are selling a product (Lindstrom, 2011, ch 2). Generally, in an open, academic argument focused on claims, evidence, and the reasonableness of conclusions, emotions should play little to no part. Emotions can distract us from system 2 critical thinking, which an audience needs in order to evaluate the truth or falsity of claims. Always be on the lookout for emotional appeals in arguments because, more often than not, they are ploys to manipulate an audience rather than honest appeals to our humanity.
52. In conclusion, you always need to be on guard against fallacies and lies in arguments. Many unscrupulous arguers just want to win an argument and gain some measure of power over an audience. Most politicians and media personalities actively engage in these underhanded tactics. It is important to remember that when evaluating an argument, you need to focus on the claims, evidence, reasoning, and conclusions of the arguer. If all of these parts are not present or clearly explained, then you should become very skeptical. The absence of one or more of these parts could be a clue that the arguer is using fallacies. Remember, the burden of proof is always on the arguer who makes a claim. Be on the lookout for arguers who do not fully or clearly make an argument, or who engage in fallacies to manipulate an audience. If you encounter such people, and there are many out there, do not take them seriously. In general, such people cannot be argued with because they have no commitment to the truth. An appropriate response is to simply walk away.
53. In the 21st century, "truth" or "facts" have to be actively constructed by critical thinkers through meticulous and rigorous scientific methods. Further, truth alone doesn't do anything. One has to argue for the truth in open debate in order to convince a skeptical public. Debating with others about truth means both arguing for the truth and demonstrating it with valid logic and evidence. It also means arguing against false opinions, manipulations, and lies. 21st century literacy entails not only being able to construct knowledge with scientific methods, but also openly arguing with diverse publics to explain and prove the truth. We will discuss the process of how to make an effective argument in the next chapter.
References & Further Reading
Abrams, M. H. (1953). The mirror and the lamp: Romantic theory and the critical tradition. Oxford, UK: Oxford University Press.
Akerlof, G. A., & Shiller, R. J. (2015). Phishing for phools: The economics of manipulation and deception. Princeton: Princeton University Press.
And man made life. (2010, May 22). The Economist. Retrieved Dec. 3, 2012, from www.economist.com
Ariely, D. (2008). Predictably irrational: The hidden forces that shape our decisions. New York: Harper Perennial.
Aristotle. (1995). Rhetoric. In Jonathan Barnes (Ed.), The complete works of Aristotle: The revised Oxford translation, Vol 2. (pp. 2152-2269). Princeton, NJ: Princeton University Press.
Baskin, P. (2012, Oct 1). Misconduct, not error, found behind most journal retractions. The Chronicle of Higher Education. Retrieved Dec. 3, 2012, from www.chronicle.com
Beach, J. M. (2012). Kenneth Burke: A sociology of knowledge: Dramatism, ideology and rhetoric. Austin, TX: West by Southwest Press.
Berlin, I. (2000). Historical inevitability. In The proper study of mankind. New York: Farrar, Straus and Giroux.
Bernays, E. L. (2011). Crystallizing public opinion. Brooklyn, NY: Ig Publishing. (Original work published 1923)
Bernays, E. L. (2005). Propaganda. Brooklyn, NY: Ig Publishing. (Original work published 1928)
Bloom, P. (2004). Descartes' baby: How the science of child development explains what makes us human. New York: Basic Books.
Bloor, D. (1991). Knowledge and social imagery. 2nd ed. Chicago: University of Chicago Press.
Bly, R. W. (2005). The copywriter’s handbook: A step-by-step guide to writing copy that sells. 3rd ed. New York: Owl Books.
Burke, K. (1969). A rhetoric of motives. Berkeley, CA: University of California Press. (Original work published 1950)
Burke, K. (1973). The philosophy of literary form. Berkeley, CA: University of California Press. (Original work published 1941)
Chang, K. (2012, Sept 24). Bias persists for women of science, a study finds. The New York Times. Retrieved Dec. 3, 2012, from www.nytimes.com
Cole, J. R. (2009). The great American university: Its rise to preeminence, its indispensable national role, and why it must be protected. New York: Public Affairs.
Crawford, M. B. (2009). Shop class as soulcraft: An inquiry into the value of work. New York: Penguin.
D'Andrade, R. (2002). Cultural Darwinism and language. American Anthropologist 104(1): 223-232.
Deeds, not words. (2012, Sept 15). The Economist. Retrieved Dec. 3, 2012, from www.economist.com
Demos, J. (2008). The enemy within: 2,000 years of witch-hunting in the western world. New York: Viking.
Dennett, D. C. (2003). Freedom evolves. New York: Viking.
Deutsch, D. (1997). The fabric of reality: The science of parallel universes - and its implications. New York: Allen Lane.
Diamond, J., & Robinson, J. A. (Eds.). (2010). Natural experiments of history. Cambridge, MA: Harvard University Press.
Dunning, D. (2014, Oct 27). We are all confident idiots. Pacific Standard: The Science of Society. Retrieved Nov. 2, 2014, from www.psmag.com
Eagleton, T. (1991). Ideology: An introduction. London: Verso.
Emerson, R. W. (1957). Experience. In Selections from Ralph Waldo Emerson. Boston: Houghton Mifflin. (Original work published 1844)
Ewen, S. (1996). PR! A social history of spin. New York: Basic Books.
Experimental psychology: The roar of the crowd. (2012, May 26). The Economist. Retrieved Dec. 3, 2012, from www.economist.com
Feyerabend, P. (2010). Against method. 4th ed. London: Verso. (Original work published 1975)
Finkbeiner, A. (2006). The Jasons: The secret history of science's postwar elite. New York: Viking.
Flanagan, O. (2007). The really hard problem: Meaning in a material world. Cambridge, MA: MIT Press.
Flanagan, O. (2011). The Bodhisattva's brain: Buddhism naturalized. Cambridge, MA: MIT Press.
Frank, T. (2000). One market under God: Extreme capitalism, market populism, and the end of economic democracy. New York: Anchor Books.
Frankfurt, H. G. (2005). On bullshit. Princeton: Princeton University Press.
Freedman, D. H. (2010, Nov). Lies, damned lies, and medical science. The Atlantic. Retrieved Dec. 3, 2012, from www.theatlantic.com
Galbraith, J. K. (2001). The concept of the conventional wisdom. In The essential Galbraith. New York: Mariner Books.
Galileo. (1957). The assayer. In Discoveries and Opinions of Galileo (pp. 217-280). New York: Anchor Books. (Original work published 1623)
Gaukroger, S. (2001). Francis Bacon and the transformation of early-modern philosophy. Cambridge, UK: Cambridge University Press.
Gay, P. (1995). The enlightenment: The rise of modern paganism. New York: W. W. Norton.
Geertz, C. (2000). Ideology as a cultural system. In The interpretation of cultures. New York: Basic Books. (Original work published 1973)
Geertz, C. (2000). Common sense as a cultural system. In Local Knowledge. New York: Basic Books. (Original work published 1983)
Gray, J. (1995). Enlightenment's Wake. London: Routledge.
Greene, J. D. (2002). The terrible, horrible, no good, very bad truth about morality and what to do about it. Unpublished doctoral dissertation, Princeton University, Princeton.
Gross, C. (2012, Jan 9/16). Disgrace. The Nation. Retrieved Dec. 3, 2012, from www.thenation.com
Hammersley, M., & Atkinson, P. (2003). Ethnography: Principles in practice. 2nd ed. London: Routledge.
Huff, D. (1993). How to lie with statistics. New York: WW Norton & Company.
Hume, D. (1888). Treatise of human nature. Oxford, UK: Clarendon Press. (Original work published 1739)
Igo, S. E. (2007). The averaged American: Surveys, citizens, and the making of a mass public. Cambridge, MA: Harvard University Press.
Isaacson, W. (2007). Einstein: His life and universe. New York: Simon & Schuster.
Jacoby, S. (2009). The age of American unreason. Revised Edition. New York: Vintage.
Journalistic deficit disorder. (2012, Sept 22). The Economist. Retrieved Dec. 3, 2012, from www.economist.com
Judson, H. F. (2004). The great betrayal: Fraud in science. New York: Harcourt.
Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux.
Kant, I. (1994). Critique of pure reason. London: Everyman's Library. (Original work published 1781)
Kihlstrom, J. F. (2013, Spring). Threats to reason in moral judgment. The Hedgehog Review, 15(1), 8-18.
Kirsch, I. (2010). The emperor's new drugs: Exploding the antidepressant myth. New York: Basic Books.
Klee, R. (1999). Introduction. In R. Klee (Ed.), Scientific inquiry: Readings in the philosophy of science (pp. 1-4). Oxford: Oxford University Press.
Klein, J. (2003). Francis Bacon. Stanford Encyclopedia of Philosophy. Retrieved Dec. 3, 2012, from www.plato.stanford.edu/ entries/francis-bacon/
Kuhn, T. (1996). The structure of scientific revolutions. 3rd ed. Chicago: University of Chicago Press.
Levitt, S. D., & Dubner, S. J. (2009). Freakonomics: A rogue economist explores the hidden side of everything. New York: Harper Perennial.
Lindblom, C. E. (1990). Inquiry and change: The troubled attempt to understand and shape society. New Haven: Yale University Press.
Lindblom, C. E., & Cohen, D. K. (1979). Usable knowledge: Social science and social problem solving. New Haven: Yale University Press.
Lindley, D. (2008). Uncertainty: Einstein, Heisenberg, Bohr, and the struggle for the soul of science. New York: Anchor Books.
Lindstrom, M. (2011). Brandwashed: Tricks companies use to manipulate our minds and persuade us to buy. New York: Crown.
Lindstrom, M. (2010). Buyology: Truth and lies about why we buy. New York: Crown.
Lippmann, W. (1997). Public opinion. New York: Free Press. (Original work published 1922)
Malkiel, B. G. (2012). A random walk down wall street: The time-tested strategy for successful investing. New York: W. W. Norton & Company.
Mayr, E. (1997). This is biology: The science of the living world. Cambridge, MA: Harvard University Press.
Mearsheimer, J. J. (2011). Why leaders lie: The truth about lying in international politics. Oxford, UK: Oxford University Press.
Morgan, E. S. (1988). Inventing the people: The rise of popular sovereignty in England and America. New York: W. W. Norton.
Moss, M. (2013, Feb 20). The extraordinary science of addictive junk food. The New York Times. Retrieved Feb. 21 from www.nytimes.com
Muddying the waters. (2017, Oct 28). The Economist, p. 30.
Norman Borlaug. (2009, Sept 19). The Economist. Retrieved Dec. 3, 2012, from www.economist.com
Nussbaum, M. C. (1997). Cultivating humanity: A classical defense of reform in liberal education. Cambridge, MA: Harvard University Press.
Oreskes, N., & Conway, E. M. (2010). Merchants of doubt: How a handful of scientists obscured the truth on issues from tobacco smoke to global warming. New York: Bloomsbury Press.
Packard, V. (2007). The hidden persuaders. Brooklyn, NY: Ig Publishing. (Original work published 1957)
Pinker, S. (1997). How the mind works. New York: W. W. Norton.
Pinker, S. (2002). The blank slate: The modern denial of human nature. New York: Penguin.
Plato. (1997). Republic. In J. M. Cooper (Ed.), Plato: Complete works (pp. 971-1223). Indianapolis, IN: Hackett Publishing.
Polanyi, M. (1962). Personal knowledge: Towards a post-critical philosophy. Chicago, IL: University of Chicago Press.
Polanyi, M. (1964). Science, faith and society: A searching examination of the meaning and nature of scientific inquiry. Chicago, IL: University of Chicago Press.
Popkin, S. L. (1994). The reasoning voter: Communication and persuasion in presidential campaigns. 2nd ed. Chicago, IL: University of Chicago Press.
Popper, K. (2002). The logic of scientific discovery. London: Routledge. (Original work published 1959)
Popper, K. (1979). Objective knowledge: An evolutionary approach. Revised edition. Oxford: Oxford University Press.
Reitman, J. (Director & Writer). (2005). Thank you for smoking. United States: Fox Searchlight Pictures.
Rorty, R. (1979). Philosophy and the mirror of nature. Princeton, NJ: Princeton University Press.
Sachs, J. D. (2011). The price of civilization: Reawakening American virtue and prosperity. New York: Random House.
Sagan, C. (1996). The demon-haunted world: Science as a candle in the dark. New York: Random House.
Sen, A. (2009). The idea of justice. Cambridge, MA: Harvard University Press.
Shapin, S. (2010). Never pure: Historical studies of science as if it was produced by people with bodies, situated in time, space, culture, and society, and struggling for credibility and authority. Baltimore, MD: Johns Hopkins University Press.
Shenkman, R. (2008). Just how stupid are we? Facing the truth about the American voter. New York: Basic Books.
Solow, R. M. (1997). How did economics get that way and what way did it get? In T. Bender & C. E. Schorske (Eds.), American academic culture in transformation (pp. 57-76). Princeton, NJ: Princeton University Press.
Taubes, G. (2007). Good calories, bad calories. New York: Anchor.
Thaler, R. H. (2015). Misbehaving: The making of behavioral economics. New York, W. W. Norton.
Thaler, R. H., & Sunstein, C. R. (2008). Nudge: Improving decisions about health, wealth, and happiness. New Haven: Yale University Press.
The death of facts in an age of truthiness. (2012, April 29). National Public Radio. Retrieved Dec. 3, 2012, from www.npr.org
Toulmin, S. (1958). The uses of argument. Cambridge, UK: Cambridge University Press.
Toulmin, S. (1961). Foresight and understanding: An inquiry into the aims of science. New York: Harper.
Toulmin, S. (2001). Return to reason. Cambridge, MA: Harvard University Press.
Toulmin, S., Rieke, R., & Janik, A. (1979). An introduction to reasoning. New York: Macmillan.
Tye, L. (1998). The father of spin: Edward L. Bernays & the birth of public relations. New York: Crown.
van der Linden, S. (2018, April 10). Psychological weapons of mass persuasion. Scientific American. Retrieved April 12, 2018, from www.scientificamerican.com
Watters, E. (2013, March/April). We aren’t the world. Pacific Standard, 46-53.
Wheelan, C. (2013). Naked statistics: Stripping the dread from the data. New York: W. W. Norton.
Zimmer, C. (2012, April 16). A sharp rise in retractions prompts calls for reform. The New York Times. Retrieved Dec. 3, 2012, from www.nytimes.com
To cite this chapter in a reference page using APA:
Beach, J. M. (2013). Title of chapter. In 21st century literacy: Constructing & debating knowledge. Retrieved date from www.21centurylit.org
To cite this chapter in an in-text citation using APA:
(Beach, 2013, ch 7, para. #).
© J. M. Beach 2013, Revised 2016