Chapter 11

Two Types of Argument: Common Sense vs. Science


1. Although the search for knowledge and the creation of technology are thousands of years old, science as we know it is a relatively recent innovation in human history.  Modern science was developed in 19th-century German research universities and spread from there across the globe.  But the various scientific disciplines and their research methods did not become clearly defined and professionally practiced until the middle of the 20th century.  While the invention of modern science is one of the most important events in human history, few people understand what science is, let alone use its methodology to make daily decisions.  Science delivers the most reliable form of knowledge, but it is difficult to practice and impractical for everyday use.  Even trained scientists would find it difficult, cumbersome, and costly to apply scientific practices to daily life. 

2. The journalist Vance Packard (1957/2007) explained, "It would be a dreary world if we all had to be rational, right-thinking, non-neurotic people all the time" (p. 240).  Even if it were possible to use science for every daily decision, this complex and arduous method would lead to paralysis or “total absurdity” (Berlin, 2000, p. 168).  The Nobel Prize winning psychologist Daniel Kahneman (2011) warned, "Continuous [scientific] vigilance is not necessarily good, and it is certainly impractical" (p. 28).  Social scientist Charles E. Lindblom went so far as to say that "examining everything is a path to madness" that "goes far beyond human capacity" (Lindblom, 1990, p. 43).  But examine we must if we are to find the best knowledge to improve our lives, our societies, and the world at large. 

3. Thus, we need to know not only when to use science to critically analyze the claims of others, but also how to critically analyze claims using scientific methods.  But how do most people actually think and make decisions?  Most people rely on personal experience combined with the common sense of cultural traditions to make daily decisions.  While this method can work fairly well some of the time, it often leads to false conclusions and is, therefore, dangerous, especially when making moral decisions (Kahneman, 2011; Greene, 2002).  Many people are "grossly ignorant," "incompetent," and do not gather enough information to make "rational" decisions (Shenkman, 2008, pp. 3, 50; Kahneman, 2011; Popkin, 1994, p. 42).  In some ways, we are all "confident idiots" (Dunning, 2014).  Rather than thinking critically about the world to investigate the truth, most people rely on what one social scientist calls "gut rationality," that is, listening to your "gut," or what some people call instincts or intuition (Popkin, 1994, p. 44).  Gut rationality includes the opinions of established authorities, tradition, cultural myths, and common sense. 


Handout:  Common Sense vs Science in Arguments


4. Take, for example, the field of medicine.  Most cultures at some point in their history have believed in fictitious entities as the root cause of disease, such as evil spirits, demons, ill humors, black bile, yellow bile, and phlegm.  For almost 2,000 years in Europe, it was standard "common sense" practice to cut patients with a knife, or place leeches on them, in order to bleed the mystical "ill humors" out of the sick body.  This so-called medical treatment probably killed more patients than it cured.  When scientists developed the germ theory of disease in the 19th century, the proposition that micro-organisms like bacteria cause illness was dismissed as an absurd belief in ghosts.  The medical establishment lambasted it as quackery.  The modern theories of evolution, relativity, and global climate change have met with similar disbelief and outrage.  But this type of resistance to science is nothing new.  For centuries, many people believed the Earth was flat and sat at the center of a divinely ordered universe.  When scientists and explorers postulated a round Earth that revolved around the sun, these theories were derided as crazy nonsense.


Science vs. Common Sense: The Daily Show Explains


5. Every culture has its official "common sense" beliefs, which the ancient Greeks called orthodoxia, and what social scientists nowadays call "conventional wisdom" (Galbraith, 2001, p. 18).  Philosophers and social scientists have known for over a hundred years that the conventional wisdom is almost always wrong, often dangerously so.  It is important to understand the common sense method of thinking because it leads to conventional wisdom.  We need to guard against this type of thinking in our own lives, and we need to be able to understand and criticize this type of thinking in others, especially when it is used to influence the law or public policy.  As we already discussed, all people subjectively classify their world through the bias of personal experience, cultural traditions, and common sense beliefs.  Sometimes our conventional wisdom is benign, but often it can lead to dangerous situations (Greene, 2002).  For example, some people hold their common sense religious beliefs so dear that they refuse to see a doctor when they are sick, believing their god will automatically cure them.  Others use their common sense beliefs to act violently against a hated "other." 

6. To illustrate, prejudiced people do not see homosexuals as fully human.  Instead, these intolerant people classify homosexuals as "evil," "perverted," "demonic," or "unnatural" creatures.  Such pejorative labels often lead to prejudicial treatment and sometimes violent abuse.  Likewise, for centuries, white Europeans classified Black Africans as animals or some other subhuman entity, a supposition which led to the brutal killing and enslavement of millions.  In all of these cases, a "common sense" belief led to biased claims, poor reasoning, and false conclusions – which, of course, also led to hateful actions and violence.   

7. Common sense arguments use a form of deductive reasoning.  This form of thinking is based on the authority of an assumed truth, which is usually based on an institution or cultural tradition (Greene, 2002).  This assumed truth is then used to arrive at other truths.  For example, witches are evil.  If witches are evil, then it is necessary and right to kill them.  That girl is a witch.  Therefore, it is necessary and right to kill her.  These claims are all based on the assumed common sense "truths" that witches exist, that witches are evil, and that we all know what witches look like.  In Europe, these truths were based on larger common sense truths embedded in Christianity and various cultural traditions of witch hunting.  For hundreds of years, this relatively simple belief in witches was used to murder thousands of innocent women in Europe and America (Demos, 2008).  An example such as this should make it clear how dangerous common sense can be, especially if you are a despised or distrusted minority. 

8. Common sense is also dangerous because you cannot argue against it.  Most people don't know why their self-evident truth is "true" – they simply believe it.  As psychologist Daniel Kahneman (2011) explains, we all have "answers to questions that [we] do not completely understand, relying on evidence that [we] can neither explain nor defend" (p. 97).  If you pressed people to explain why they believe their common sense is true, you would get "bullshit" answers.  Bullshit is the use of generalized language to hide the fact that we don't know what we're talking about (Frankfurt, 2005, pp. 46-47). 

9. But be careful when pointing out bullshit.  Generally, people don't like criticism or being called a liar.  Throughout human history, many have been put to death for criticizing the common sense or lies of their community.  Perhaps the most famous example was the philosopher Socrates.  He was executed nearly 2,500 years ago because he dared to question the common sense held by the citizens of ancient Athens.  For hundreds of years in Europe, the Roman Catholic Church tortured and burned heretics at the stake for the crime of questioning the common sense truth of official Church doctrines.  But the Catholic Church’s suppression of dissent was not unique.  The practice of killing or banishing critics has been widespread around the world.

10. The scientific method of inductive argument is a much better method than common sense for understanding the world and for finding reliable knowledge.  Science uses inductive reasoning.  This way of thinking links empirical evidence from the objective world to an argumentative claim in order to support that claim.  Empirical evidence consists of facts that can be verified with the senses (tasting, seeing, touching, hearing, and smelling).  Empirical evidence can also consist of facts verified with an instrument whose results can, in turn, be checked with our senses (instruments such as a telescope, stethoscope, compass, thermometer, or video recorder).  Inductive arguments make claims about the objective world, and their truth is determined by evidence.  The evidence corresponds with reality, a world that objectively exists beyond our own subjective experience.

11. For example, I might claim that it is raining outside today.  In order to prove this claim true, I would need to supply evidence of rain.  Some types of evidence are better than others.  I could point to the weather forecast, but these reports are often unreliable.  I could interview a person who said he or she saw rain, but the individual could be lying.  I could go outside and take a picture, but how do you know I took the picture today?  I could show you a published news report of rain, but is it accurate for your particular location?  Of course, the best evidence would be for me to take you outside and let you see and feel the rain yourself.  Short of that, I could interview five or ten people, and if they all agreed that it’s raining, then their collective response would also be good evidence.  I could also video record the rain at a local landmark with date verification. 

12. The strength of my argument rests on the validity of my evidence.  Validity is a scientific concept that describes the quality of evidence: how well a researcher logically connects evidence to a claim.  Relying on an internet weather forecast alone would most likely be considered invalid evidence because forecasts are often unreliable.  Relying on multiple interview subjects, a local newspaper story, and a video recording combined together would be considered highly valid evidence.  Using multiple types of evidence like this to confirm a single phenomenon is called triangulation.  The method of triangulation is considered one of the most valid research strategies because it does not rely on only a single method.

13. But inductive arguments based on evidence are not perfect forms of reasoning because they are still grounded in some form of bias, which is connected to our human need to make life meaningful – and meaning is not an objective quality of the physical world (Flanagan, 2007).  No one sees the world directly, not even scientists.  We all perceive through our values or world views.  For scientists, all observation is grounded in a scientific theory.  As one scientist explained, "all observation in science is 'theory-laden,'" which means that scientists use their chosen theory to make meaning of the data they observe (Goodstein, 2010, p. 9).


Handout: How Do You Know?  The Seven Stages of Knowledge


14. While the problem of meaning and its resulting bias is the root problem of all human knowledge, it can be controlled to a certain extent.  It is also important to remember that not all biases are bad.  As discussed earlier, everyone has biased beliefs based on common sense, principles, rational arguments, and/or personal experience.  Scientists have biases based on scientific theories.  These biases help us.  Bias allows us to understand our sensory observations and gives that information meaning.  Bias also helps us make decisions quickly because it creates an automatic reflex.  But we need to be continually aware of our biases to make sure they are reasonable and justified; often they are not.  

15. Burning witches because your priest said they are evil is not reasonable.  Burning wood to stay warm in the snow is reasonable.  The bias in the first example is blind devotion to tradition and authority, which leads you to kill another person.  The bias in the second example is a rational belief that living is better than dying and that heat keeps a body warm in the snow.  Scientists and academic researchers try to understand their bias in order to judge its rationality.  They will also try to control it when they conduct research so as to be objective and fair-minded. 

16. If our bias is reasonable, then we need to fully disclose it as the foundation of our knowledge so that we can explain why it is warranted in our argument.  Take for example my claim about rain.  This knowledge claim was based on my principled beliefs in the value of truth, empirical evidence, and open argument.  I would not go through the time and trouble of assembling valid evidence and constructing a reasonable argument about the rain unless I believed that open, empirical arguments were good, and further, that knowing the truth about the objective world is also good. 

17. But these are cultural values with which some may not agree.  Some people may believe that winning an argument is good, regardless of whether it is true or not.  That would be a different principled bias.  Such people would, therefore, not care about empirical evidence or open argument, and they might instead rely on manipulation and lies to trick you.  Others might believe that God can make rain or withhold rain and that we should just believe whatever a holy person says about the presence or absence of rain.  This belief would be another principled bias.  People with such a belief would not care about empirical evidence or arguments of any kind, and they would instead appeal to the authority of their religion. 

18. An honest speaker who is interested in the truth about the objective world needs to make an open argument, which should disclose and explain the speaker's bias so that the audience can decide if it is reasonable.  The disclosure of bias, often in the form of a foundational principle, is called a "warrant" (Toulmin, 1958, p. 98; Toulmin, Rieke, & Janik, 1979, p. 26).  Most warrants are implicit in an argument because most arguers are not fully aware of their bias and the common sense foundation of their arguments (Toulmin, 1958, p. 98).  And sometimes a warrant is deliberately kept secret so as to better manipulate an audience. Scientists, academic researchers, and honest public debaters strive to make open arguments; thus, they will always state an explicit warrant at the start of an argument.  But it is important to understand that "warrants are not self-validating" (Toulmin, Rieke, & Janik, 1979, p. 58).  They need to be fully explained and supported by a reasonable argument, which is sometimes called "backing."


Handout:  Types of Warrants & Claims


19. Explicit warrants can take three basic forms: (1) the philosophical principle, (2) the legal principle or precedent, and (3) the scientific theory.  The first two are value principles, which seek to argue that a certain value is good or bad based on the qualities of the principle and/or on how that principle has been used in the past.  Often value principles are explained as universal principles that are good or bad at all times in all places.  These principles are derived from the beliefs of a culture or from a specific legal tradition, or sometimes both.  A legal warrant will cite a law or legal principle, which often contains a sacred cultural value.  Thus, there is often a blending of philosophical and legal warrants.  One example is the American Constitution, which enshrined the principles of liberty and justice (among other values), so when Americans discuss freedom, they often point to the Constitution as both the source of that value and as a legal guarantee of it.  Below is a government sign that is doing something similar.  Drivers are reminded that littering is not only against the law, but also an "awful" practice, which is a common sense expression of badness.  The sign assumes that you already know the common sense values that make littering awful, such as damaging the environment (the principle of environmentalism) or ruining a beautiful landscape (the principle of beauty).


A philosophical and legal warrant

20. The third type of warrant is different.  A scientific theory is a provisional model of how the objective world works, which has to be tested with experiments and validated with sufficient evidence.  Different scientific theories lead to different methods for collecting data, which, in turn, lead to different types of conclusions about the objective world.  Take, for example, the general difference between a "fundamental" scientist and an applied "forensic" scientist.  The fundamental scientist will use the best theory that explains the phenomenon being studied, but a forensic scientist must always focus on specific laws and court procedures, and will therefore choose only theories and methods that will hold up in a court of law.  And unlike users of the other two types of warrants, most scientists will readily admit that their theory might be wrong, and they will actively look for disconfirming evidence that could prove it false. 

21. Let us examine three scenarios to explore how these three warrants could be used in an open argument.  Say that a chemical factory next to a small town is leaking a toxic pollutant into the local water supply.  Someone might offer the warrant that the principle of health is both a public and a personal good (we all want to live healthy lives); thus, a toxic pollutant affecting the water supply would be bad, and if it is bad, then it should be stopped and cleaned up.  In an American court of law, precedent is important.  If a court has ruled in principle one way, then courts in the future have to rule the same way, unless the Supreme Court changes the law.  So, if law code X was used in the past to convict companies guilty of environmental pollution, then a lawyer would argue to a judge that this same law code X should be used in the current case.  And if the previous penalty under law code X was 100% of cleanup fees plus damages to victims, then a judge should use that same established principle to penalize the current company if it is found guilty. 

22. Finally, a scientist might be brought into the case to ascertain whether the contaminating chemicals in the water are actually toxic.  This scientist will use a particular theory about toxicity to study the particular chemicals.  The scientist might explain in court that theory Z predicted certain outcomes, that evidence was found confirming these outcomes, and that four separate experiments were conducted under laboratory settings to make sure the results could be replicated.  Therefore, the empirical evidence confirms the theory, and the theory verifies that the current pollution is toxic.  The judge could then use this argument to declare the company guilty under law code X and assess fines according to past precedent.

23. The first part of an open argument is making a thesis claim, which is the main claim of an argument.  Then the foundational warrant for this thesis claim needs to be analyzed and justified with good reasons.  Next, the supporting claims must be stated and organized.  Each supporting claim will form a small argument with its own array of evidence leading to logical conclusions.  All of these supporting claims will be connected back to the thesis so as to prove the main claim of the argument true or false (or somewhere in between).  The more supporting claims and evidence, the stronger the argument.  Supporting claims are often called the "grounds" of an argument because they establish the truth or falsity of the main claim, much like stakes pounded into the ground stabilize a tent, keeping it up even in strong wind.    

24. When constructing grounds, you need to know that there are three types of claims, each of which needs different kinds of evidence and reasoning to prove it: (1) claims of fact, (2) claims of meaning, and (3) claims of value.  The easiest type to prove is the claim of fact.  This claim purports to describe or explain part of the objective world.  In order to prove such a claim, there needs to be empirical evidence corresponding to objective reality, which makes the claim a fact.  The second type of claim is more difficult to prove because it does not directly deal with facts.  The claim of meaning seeks to prove a certain interpretation of the facts because facts do not inherently mean anything.  A rock is a rock, but when does a rock become an obstacle, or valuable, or toxic, or ornamental, or a weapon?  When does fighting between nations become a war?  When does a relationship end?  When does a brain-damaged human die?  These are questions of meaning.  Often everyone can agree on the facts (although not always), but because humans need meaning, these facts take on symbolic significance, and that meaning added to the facts must be argued for and made reasonable.  But the meaning that people place on objects or events derives from the third type of claim, the claim of value. 

25. Claims of value are the most difficult claims to make.  Why?  For many reasons.  They are contentious because every culture and sub-culture has its own set of values.  They also cannot be proven with any type of evidence.  Values do not exist in the objective world.  They are purely subjective phenomena and exist only in the brains of human beings.  One cannot empirically point to justice or truth or beauty or goodness.  Dogs, ants, and monkeys have no concept of efficiency, honesty, or evil.  Humans have invented values to make our lives more meaningful and to increase social cooperation. 

26. But different individuals and different cultures have diverse sets of values, sometimes strikingly different.  And this is another reason value arguments are so difficult to make: It is extremely difficult to make rival values even sensible to an opponent.  Take, for example, the practice of cannibalism.  Some people are cannibals; they eat other people.  This is a fact.  Now, what does this practice mean?  Well, it means different things in different cultures, and these different meanings are grounded in different values of human life.  Some cultures regard human life as the highest good; thus, they condemn cannibalism as one of the greatest evils because it requires one not only to kill, but also to disrespect the body of the dead person by eating it, rather than praying over it and burying it.  How would you even try arguing with cannibals in order to convince them that their common sense value is "bad," and that your rival values are "good"?  It’s almost impossible to conceive how such an argument could work.  Such a confrontation would almost certainly end with aggression or violence – possibly followed by dinner, with you as the main dish!

27. Values come in two basic types.  First, there is the claim for "what is good," which would be a universal principle of goodness (which also entails its opposite, a universal principle of badness).  There are myriad examples of cultural “goods,” such as life, property, sexual virility, strength, beauty, efficiency, kindness, love, individualism, competitiveness, even death.  But principles are meant to be useful.  Humans create principles to be rules that guide our actions.  Thus, there is a second type of value claim: "What is the right thing to do?" 

28. A value argument for right action seeks to apply a claim for goodness to a particular situation in order to argue that a particular act should or should not be done.  If I am a cannibal and hold the value that life and death are equally good, especially the goodness of preserving my own life and killing my enemy, then this value would lead me to look for enemies to fight, kill, and eat.  If I am a typical American and hold the value of life to be good for all people, which also entails the opposite, that death is bad for all people, then this value would lead me to preserve my own life and not bring death to others.  Further, my principles would lead me to condemn, lock up, and perhaps execute a cannibal as a criminal, although the practice of execution would violate the principle of life I supposedly hold dear, because I believe that death should not be inflicted on anyone, ostensibly even on cannibals who try to eat me. 

29. The above conflict between the values of life and justice is a perfect example of moral ambiguity.  This is where a real-life situation puts our moral principles to the test, and we have to think critically about the right action to take.  Sometimes, the right action for a particular situation violates our principles.  Sometimes there is no right action, and we have to choose the lesser evil among several bad options.

30. While values are one of the most important aspects of human nature, they cause us many problems, especially in a globalized world filled with diverse cultures, each of which has different beliefs about what is good and right.  Values are an integral part of our common sense beliefs, but most people don't know why they hold the values they do.  Further, most people have no real understanding of why the values they hold are actually good or bad.  When we are young, we are told by parents, priests, or politicians to accept certain values, and they become part of the cultural air that we breathe. 

31. Thus, different cultures with different values often misunderstand each other, which leads to negative judgments and disrespect.  This situation, in turn, often results in disagreements, conflicts, and violence.  Because humans take their beliefs so seriously, disagreements over values have been one of the most common causes of murder and war.  A verbal argument over facts and the meaning of facts can often turn violent once it becomes clear that the participants hold different sets of underlying values that cannot be reconciled.  To illustrate, proponents of abortion rights believe in the supreme values of the physical health of the mother and freedom of choice, while their opponents believe in the supreme values of the life of the fetus and the divine origin of souls.  These values cannot be reconciled; thus, these two groups often resort to screaming at each other, or worse, killing each other.  

32. But if in an age of globalization we hold the values of peace, tolerance, freedom, and cooperation, then arguing is the only constructive tool we have to resolve conflicts and come to some agreement about building a better world.  So, it is important to understand how human beings think and how arguments are constructed.  You need to be able to evaluate the thinking and arguments of others to decide if you agree with their claims and will agree to their proposals for action.  You also need to be able to evaluate your own thinking and the arguments you might make to move an audience. 

33. Different types of audiences will require different types of arguments and evidence, so you must always understand the rhetorical context of any argument to fully understand how and why someone is arguing a certain way.  Understanding the different rhetorical contexts will also help you understand how to judge the quality of the speaker's claims, evidence, and reasoning.  But there is no silver bullet when it comes to arguments.  Everyone makes errors, takes thinking shortcuts, and displays poor judgment (Kahneman, 2011; Popkin, 1994, p. 218).  In addition, we can never find the appropriate words or tone to reach everyone.  We will always be misunderstood by someone in our audience.  Arguments are always fragile and imperfect constructions.  However, if we hold dear the values of openness, freedom, and truth, then arguments are the only tool at our disposal to move audiences into collective action.  Without arguments to convince an audience, we would have to resort to the older political tools of deception, coercion, and violence.   


References & Further Reading


Abrams, M. H.  (1953).  The mirror and the lamp: Romantic theory and the critical tradition.  Oxford, UK: Oxford University Press.

And man made life.  (2010, May 22).  The Economist.  Retrieved Dec. 3, 2012.

Akerlof, G. A., & Shiller, R. J.  (2015).  Phishing for phools: The economics of manipulation and deception.  Princeton: Princeton University Press.

Ariely, D.  (2008).  Predictably irrational: The hidden forces that shape our decisions.  New York: Harper Perennial.

Aristotle.  (1995).  Rhetoric.  In Jonathan Barnes (Ed.), The complete works of Aristotle: The revised Oxford translation, Vol 2.  (pp. 2152-2269).  Princeton, NJ: Princeton University Press.

Basken, P.  (2012, Oct 1).  Misconduct, not error, found behind most journal retractions.  The Chronicle of Higher Education.  Retrieved Dec. 3, 2012.

Beach, J. M.  (2012).  Kenneth Burke: A sociology of knowledge: Dramatism, ideology and rhetoric.  Austin, TX: West by Southwest Press.  

Berlin, I.  (2000).  Historical inevitability.  In The proper study of mankind.  New York: Farrar, Straus and Giroux.

Bernays, E. L.  (2011).  Crystallizing public opinion.  Brooklyn, NY: Ig Publishing.  (Original work published 1923)

Bernays, E. L.  (2005).  Propaganda.  Brooklyn, NY: Ig Publishing.  (Original work published 1928)

Bloor, D.  (1991).  Knowledge and social imagery.  2nd ed.  Chicago: University of Chicago Press.

Bly, R. W.  (2005).  The copywriter’s handbook: A step-by-step guide to writing copy that sells.  3rd ed.  New York: Owl Books.

Burke, K.  (1969).  A rhetoric of motives.  Berkeley, CA: University of California Press. (Original work published 1950)

Burke, K.  (1973).  The philosophy of literary form.  Berkeley, CA: University of California Press. (Original work published 1941)

Chang, K.  (2012, Sept 24).  Bias persists for women of science, a study finds.  The New York Times.  Retrieved Dec. 3, 2012.

Cole, J. R.  (2009).  The great American university: Its rise to preeminence, its indispensable national role, and why it must be protected.  New York: Public Affairs.

Crawford, M. B.  (2009).  Shop class as soulcraft: An inquiry into the value of work.  New York: Penguin.

D'Andrade, R.  (2002).  Cultural Darwinism and language.  American Anthropologist 104(1): 223-232.

Deeds, not words.  (2012, Sept 15).  The Economist.  Retrieved Dec. 3, 2012.

Demos, J.  (2008).  The enemy within: 2,000 years of witch-hunting in the western world.  New York: Viking.

Dennett, D. C.  (2003).  Freedom evolves.  New York: Viking.

Deutsch, D.  (1997). The fabric of reality: The science of parallel universes - and its implications.  New York: Allen Lane.

Diamond, J., & Robinson, J. A. (Eds.).  (2010).  Natural experiments of history.  Cambridge, MA: Harvard University Press.

Dunning, D.  (2014, Oct 27).  We are all confident idiots.  Pacific Standard: The Science of Society.  Retrieved Nov. 2, 2014.

Eagleton, T.  (1991).  Ideology: An introduction.  London: Verso.

Emerson, R. W. (1957).  Experience.  In Selections from Ralph Waldo Emerson.  Boston: Houghton Mifflin.  (Original work published 1844)

Ewen, S.  (1996).  PR! A social history of spin.  New York: Basic Books.

Experimental psychology: The roar of the crowd.  (2012, May 26).  The Economist.  Retrieved Dec. 3, 2012.

Feyerabend, P.  (2010).  Against method.  4th ed.  London: Verso.  (Original work published 1975)

Finkbeiner, A.  (2006).  The Jasons: The secret history of science's postwar elite.  New York: Viking.

Flanagan, O.  (2007).  The really hard problem: Meaning in a material world.  Cambridge, MA: MIT Press.

Flanagan, O.  (2011).  The Bodhisattva's brain: Buddhism naturalized.  Cambridge, MA: MIT Press.

Frank, T.  (2000).  One market under God: Extreme capitalism, market populism, and the end of economic democracy.  New York: Anchor Books.

Frankfurt, H. G. (2005).  On bullshit.  Princeton: Princeton University Press.

Freedman, D. H.  (2010, Nov).  Lies, damned lies, and medical science.  The Atlantic.  Retrieved Dec. 3, 2012, from

Galbraith, J. K.  (2001).  The concept of the conventional wisdom.  In The essential Galbraith.  New York: Mariner Books.

Galileo.  (1957).  The assayer.  In Discoveries and Opinions of Galileo (pp. 217-280).  New York: Anchor Books.  (Original work published 1623)

Gaukroger, S.  (2001).  Francis Bacon and the transformation of early-modern philosophy.  Cambridge, UK: Cambridge University Press.

Gay, P.  (1995).  The enlightenment: The rise of modern paganism. New York: W. W. Norton.

Geertz, C.  (2000).  Ideology as a cultural system.  In The interpretation of cultures.  New York: Basic Books.  (Original work published 1973) 

Geertz, C.  (2000).  Common sense as a cultural system.  In Local Knowledge.  New York: Basic Books.  (Original work published 1983)

Gray, J.  (1995).  Enlightenment's Wake.  London: Routledge.

Greene, J. D.  (2002).  The terrible, horrible, no good, very bad truth about morality and what to do about it.  Unpublished doctoral dissertation, Princeton University, Princeton.

Gross, C. (2012, Jan 9/16).  Disgrace.  The Nation.  Retrieved Dec. 3, 2012, from

Hammersley, M., & Atkinson, P.  (2003).  Ethnography: Principles in practice.  2nd ed.  London: Routledge.  

Huff, D. (1993).  How to lie with statistics.  New York: W. W. Norton & Company.

Hume, D. (1888).  A treatise of human nature.  Oxford, UK: Clarendon Press.  (Original work published 1739)

Igo, S. E.  (2007).  The averaged American: Surveys, citizens, and the making of a mass public.  Cambridge, MA: Harvard University Press.

Isaacson, W.  (2007).  Einstein: His life and universe.  New York: Simon & Schuster.

Jacoby, S.  (2009).  The age of American unreason.  Revised Edition.  New York: Vintage.

Journalistic deficit disorder.  (2012, Sept 22).  The Economist.  Retrieved Dec. 3, 2012, from

Judson, H. F.  (2004).  The great betrayal: Fraud in science.  New York: Harcourt.

Kahneman, D.  (2011).  Thinking, fast and slow.  New York: Farrar, Straus and Giroux.

Kant, I.  (1994).  Critique of pure reason.  London: Everyman's Library. (Original work published 1781)

Kihlstrom, J. F.  (2013, Spring).  Threats to reason in moral judgment.  The Hedgehog Review, 15(1), 8-18.

Kirsch, I.  (2010).  The emperor's new drugs: Exploding the antidepressant myth. New York: Basic Books.

Klee, R. (1999).  Introduction.  In R. Klee (Ed.), Scientific inquiry: Readings in the philosophy of science (pp. 1-4).  Oxford: Oxford University Press.

Klein, J.  (2003).  Francis Bacon.  Stanford Encyclopedia of Philosophy.  Retrieved Dec. 3, 2012, from entries/francis-bacon/

Kuhn, T.  (1996).  The structure of scientific revolutions. 3rd ed.  Chicago: University of Chicago Press.

Levitt, S. D., & Dubner, S. J.  (2009).  Freakonomics: A rogue economist explores the hidden side of everything.  New York: Harper Perennial.

Lindblom, C. E.  (1990).  Inquiry and change: The troubled attempt to understand and shape society.  New Haven: Yale University Press.

Lindblom, C. E., & Cohen, D. K.  (1979). Usable knowledge: Social science and social problem solving.  New Haven: Yale University Press.

Lindley, D.  (2008).  Uncertainty: Einstein, Heisenberg, Bohr, and the struggle for the soul of science.  New York: Anchor Books.

Lindstrom, M.  (2011).  Brandwashed: Tricks companies use to manipulate our minds and persuade us to buy.  New York: Crown.

Lindstrom, M.  (2010).  Buyology: Truth and lies about why we buy.  New York: Crown.

Lippmann, W.  (1997).  Public opinion.  New York: Free Press.  (Original work published 1922)

Malkiel, B. G.  (2012).  A random walk down Wall Street: The time-tested strategy for successful investing.  New York: W. W. Norton & Company.

Mayr, E. (1997).  This is biology: The science of the living world.  Cambridge, MA: Harvard University Press.

Mearsheimer, J. J. (2011).  Why leaders lie: The truth about lying in international politics.  Oxford, UK: Oxford University Press.

Morgan, E. S.  (1988).  Inventing the people: The rise of popular sovereignty in England and America.  New York: W. W. Norton.

Moss, M.  (2013, Feb 20).  The extraordinary science of addictive junk food.  The New York Times.  Retrieved Feb. 21 from

Norman Borlaug.  (2009, Sept 19).  The Economist.  Retrieved Dec. 3, 2012, from

Nussbaum, M. C. (1997).  Cultivating humanity: A classical defense of reform in liberal education.  Cambridge, MA: Harvard University Press.

Oreskes, N., & Conway, E. M.  (2010).  Merchants of doubt: How a handful of scientists obscured the truth on issues from tobacco smoke to global warming.  New York: Bloomsbury Press.

Packard, V.  (2007).  The hidden persuaders.  Brooklyn, NY: Ig Publishing.  (Original work published 1957)

Pinker, S.  (1997).  How the mind works.  New York: W. W. Norton.

Pinker, S.  (2002).  The blank slate: The modern denial of human nature. New York: Penguin.

Plato.  (1997).  Republic.  In J. M. Cooper (Ed.), Plato: Complete works (pp. 971-1223).  Indianapolis, IN: Hackett Publishing.

Polanyi, M.  (1962).  Personal knowledge: Towards a post-critical philosophy.  Chicago, IL: University of Chicago Press.

Polanyi, M.  (1964).  Science, faith and society: A searching examination of the meaning and nature of scientific inquiry.  Chicago, IL: University of Chicago Press.

Popkin, S. L.  (1994).  The reasoning voter: Communication and persuasion in presidential campaigns.  2nd ed.  Chicago, IL: University of Chicago Press.

Popper, K.  (2002). The logic of scientific discovery.  London: Routledge.  (Original work published 1959)

Popper, K.  (1979).  Objective knowledge: An evolutionary approach.  Revised edition. Oxford: Oxford University Press.

Rorty, R.  (1979).  Philosophy and the mirror of nature.  Princeton, NJ: Princeton University Press. 

Sachs, J. D.  (2011).  The price of civilization: Reawakening American virtue and prosperity.  New York: Random House.

Sagan, C.  (1996).  The demon-haunted world: Science as a candle in the dark.  New York: Random House.

Sen, A. (2009).  The idea of justice.  Cambridge, MA: Harvard University Press.

Shapin, S.  (2010). Never pure: Historical studies of science as if it was produced by people with bodies, situated in time, space, culture, and society, and struggling for credibility and authority.  Baltimore, MD: Johns Hopkins University Press.

Shenkman, R.  (2008).  Just how stupid are we? Facing the truth about the American voter.  New York: Basic Books.

Solow, R. M.  (1997).  How did economics get that way and what way did it get?  In T. Bender & C. E. Schorske (Eds.), American academic culture in transformation (pp. 57-76). Princeton, NJ: Princeton University Press.

Taubes, G.  (2007).  Good calories, bad calories.  New York: Anchor. 

Thaler, R. H.  (2015).  Misbehaving: The making of behavioral economics.  New York: W. W. Norton.

Thaler, R. H., & Sunstein, C. R. (2008).  Nudge: Improving decisions about health, wealth, and happiness.  New Haven: Yale University Press.  

The death of facts in an age of truthiness.  (2012, April 29).  National Public Radio.  Retrieved Dec. 3, 2012, from 

Toulmin, S.  (1958).  The uses of argument.  Cambridge, UK: Cambridge University Press.

Toulmin, S. (1961).  Foresight and understanding: An inquiry into the aims of science.  New York: Harper.

Toulmin, S.  (2001).  Return to reason.  Cambridge, MA: Harvard University Press.

Toulmin, S., Rieke, R., & Janik, A.  (1979).  An introduction to reasoning.  New York: Macmillan.

Tye, L.  (1998).  The father of spin: Edward L. Bernays & the birth of public relations.  New York: Crown.

Watters, E.  (2013, March/April).  We aren’t the world.  Pacific Standard, 46-53.

Wheelan, C.  (2013).  Naked statistics: Stripping the dread from the data.  New York: W. W. Norton.

Zimmer, C.  (2012, April 16).  A sharp rise in retractions prompts calls for reform.  The New York Times.  Retrieved Dec. 3, 2012, from



To cite this chapter in a reference page using APA:

Beach, J. M.  (2013).  Title of chapter.  In 21st century literacy: Constructing & debating knowledge.  Retrieved date from


To cite this chapter in an in-text citation using APA:

(Beach, 2013, ch 11, para. #).



© J. M. Beach 2013, Revised 2016