Chapter 5

The Search for Objective Knowledge: Critically Evaluating Science


1. We currently live in an age of "Truthiness" where there seem to be no empirical facts and no objective world – only endless, conflicting opinions (“The Death of Facts,” 2012).  When facts are invoked in debates over public policy, many political groups try to fight facts with what Oreskes and Conway (2010) call the "Tobacco Strategy" (pp. 5-6).  Unscrupulous political groups manipulate ignorant audiences through systematic, relentless, and well-funded "doubt-mongering" (pp. 5-6, 16, 18, 34).  Partisan media and think tanks reduce reporting to an "echo chamber" of politicized talking points and manufactured lies to manipulate the public (Oreskes & Conway, 2010, pp. 7, 236).  Success in politics and law seems to require cynicism, as the indistinct line between subjective and objective reality is mystified behind obfuscated doublespeak, manipulated spin, and the "cacophony of conflicting claims" (Oreskes & Conway, 2010, p. 241).  Most of us already know that social, political, and business leaders engage in various acts of official deception, which often take the form of outright lying (Mearsheimer, 2011; Sachs, 2011, p. 24).

2. The philosopher Isaiah Berlin (2000) noted, “Because there is no hard and fast line between ‘subjective’ and ‘objective’, it does not follow that there is no line at all” (p. 170).  The unscrupulous might try to erase this line out of ignorance, deceit, or a desire for power or profit, but the line remains.  The truth exists because the objective world exists.  However, knowing and understanding that objective world has proven much harder than anyone expected.  The world is big and complex, but we can directly experience only a small part of it.  Even when honest, academically trained professionals try to understand the objective world, philosopher Amartya Sen (2009) has pointed out, there is no "guarantee of reaching truth" (p. 40).  Even "the most rigorous of searches," Sen explains, "could still fail" (p. 40).  Objective truth is out there, but it is very hard – and sometimes impossible – to reach.

3. We all see the objective world every day, but very few of us actually look past our subjectivity or culture to know the objective world.  Stuck in a highly local context, most people dwell within their subjectivity and culture and rarely venture outside of it.  People believe what they subjectively see, which is often called common sense.  Most people believe they know the objective world, but what they really know is their own unique subjective experience of that larger objective world.  And they only really know about what they directly experience in their local context, which is an infinitesimally small part of the complex universe.  But while we are separated from the objective world by our consciousness, we are still connected to and a part of that objective world.  Our brain and our bodies are physical parts of the objective world.  Our subjectivity is naturally attuned to an objective world that is knowable, largely because our mind is part of and has evolved within this objective world.  Thus, we can understand the objective world fairly well.  But do we really see and know the objective world as it actually exists? 


Handout:  The Ecology of Knowledge


4. For most of us, the answer is a qualified “yes.”  We see the surface appearance of the immediate objective world, what the ancient Greeks called doxa.  This limited form of knowledge creates a sense of practical realism that helps us make day-to-day decisions. 

5. Our culture also produces general beliefs about the world called common sense.  The ancient Greeks called this orthodoxia, or the body of official and unquestioned beliefs shared by a culture.  But our common-sense knowledge is always highly limited because we can empirically validate only what we directly experience on a daily basis in our local environment; all the rest we have to trust based on the authority of cultural institutions or powerful people.  However, even our senses can fail us.  When we directly perceive our local world, we usually see only the superficial surface of objective reality.  Rarely do we fully understand the depths of the objective world that surrounds us.  Thus, while subjectivity (doxa) and culture (orthodoxia) are forms of knowledge, they are very limited forms of knowledge that often tell us more about ourselves and our culture than about the objective world in which we live.


Handout:  Common Sense vs. Science

Handout: How Do You Know? The Seven Types of Knowledge


5.1      Science Is a "Technology of Truth"

6. The most reliable information comes from scientists or professionals trained in scientific methods, but even science is not a perfect form of knowledge.  And what's worse, science often becomes a form of magic for the average person because most people don't understand why scientific conclusions are better than common sense.  Thus, many people either trust science as a cultural authority, or they reject science as just another biased cultural belief.  Many, if not most, people do not understand why science produces the most valid forms of truth (Sagan, 1996; Jacoby, 2009, p. 211).  Even practicing scientists can't always agree on what they do or how it works.  The practice of science is an important human endeavor that can get us very close to objective reality, but it is not a perfect tool leading to absolute certainties.  The promise of science is not in the conclusions it reaches, nor in the technology it produces.  Instead, the promise of science lies in its unique process of knowing the objective world.  In order to know more about the objective world so that we can make more informed decisions, we first need to understand how science works and what its limitations are.  

7. What is called science is actually a bunch of different research disciplines (see chapter 4) that are unified by one basic method, the scientific method.  However, this basic scientific method is adapted by each discipline in a particular way.  It has been broken up into a diverse set of procedures that can be used to explore the objective world in different ways.  Some believe that scientific activity is too diverse to talk about as a single, distinct phenomenon.  To make matters worse, the endeavor of science is a "continually evolving" set of ideas and techniques, so it is never stable enough to pin down exactly (Toulmin, 1961, p. 109).  But it is possible to generalize a basic process and purpose of science.  These similarities account for the significance of science as a general knowledge-creating tool, and they constitute the basic foundation common to all scientific practice.  As the biologist Ernst Mayr (1997) pointed out, "One would not be able to speak of science in the singular if not all sciences, in spite of their unique features and a certain amount of autonomy, did not share common features" (p. 33).

8. Physical and social scientists carefully create theories, methods, and technologies.  These tools enable greater description, explanation, and sometimes prediction of the objective world. The knowledge scientists create allows for some measure of control over ourselves, our society, and our environment.  Science is the practice of disciplined reflection, theory, experiment, and critical debate.   Science is “a technology of truth,” as Daniel Dennett (2003, p. 6) described it.  Scientists filter out their subjectivity by using precise research methods.  They use theories to experiment on the objective world, gather evidence, draw conclusions from the evidence, and then present their conclusions to a critical community so that others can analyze these conclusions and try to prove them wrong.  The end result of this process is the creation of provisional truths.  The scientific process is based on a belief that empirical observations and laboratory experiments produce a knowledge that "corresponds" with reality (Bloor, 1991, pp. 40-45).


Handout:  The Scientific Method


9. Often scientists are trying to investigate the deeper layers of reality that we cannot directly see without specialized methods and equipment (Deutsch, 1997, pp. 3, 7).  Many scientists believe that they can use the scientific method to not only describe the objective world, but also to explain "the fabric of reality itself" (p. 3).  Once scientists gain an insight into the objective world, they use their knowledge to create better technology, a term which refers to new tools that help improve our society and our lives.  Scientists also use their knowledge to try to predict the future.  By anticipating possible causes and effects, researchers try to solve the important social, economic, political, and ecological problems that threaten the stability and sustainability of our societies and our planet.

10. The first step of the scientific process is the literature review.  Scientists explore not only a specific topic, but they enter into an academic conversation about that topic, a debate that has been happening for years, decades, and sometimes centuries.  Scientists need to understand what is known and unknown about a topic.  They also need to understand how the current knowledge was created: what tools did previous scientists use and how well did the various tools work?  Once a scientist figures out the best tools to use, then they plan out a research project in order to re-test what is known or to discover some new knowledge.

11. A scientist starts with a theory, which is a model explaining how some part of the objective world works.  A theory will not only explain all the parts, but it will explain how all the parts work and fit together into a structured whole.  Philosopher and motorcycle mechanic Matthew B. Crawford (2009) explained how he once could not understand the workings of a motorcycle engine because he lacked a theoretical model to explain how an engine was supposed to work: “A more experienced mechanic has pointed out to me something that was right in front of my face, but which I lacked the knowledge to see.  It is an uncanny experience; the raw sensual data reaching my eye before and after are the same, but without the pertinent framework of meaning, the features in question are invisible” (p. 91).  A theory is a model of an object, interaction, or process, which provides a “framework of meaning,” as Crawford (2009) pointed out.  The theory makes the data or evidence meaningful by explaining how the evidence fits together into a larger whole.

12. Once a scientist has chosen a specific theory, she will use this model to create several hypotheses, which are clear statements that can be proven true or false.  Then the scientist will use specific tools called research methods to discover and collect data, which will be evidence used to prove the hypothesis true or false.  There are thousands of specific research methods to collect different kinds of data, depending on the academic discipline a scientist works within and the topic being studied.  We will discuss some basic types of research methods and evidence later in this chapter (ch 6.5).  Once enough data has been collected, a scientist will use another kind of tool, analytical methods, which are used to make the data meaningful by organizing the data into theoretical categories and connecting the data according to the theoretical model. 

13. While the individual research of each scientist is important, the combined effect of the community of scientists and all their research is much more important.  This scientific community is developed and fostered by scientific institutions, like university academic departments, conferences, and peer reviewed journals.  These institutions form the social foundation of science, which is a group activity.  After an individual scientist has finished his or her research and proven a hypothesis true or false, then that scientist shares these findings with the community of scientists.  This is the last step of the scientific process, and the most important part of the scientific method.  This last step is a social process called "peer review."  The purpose of this general process is to critically analyze the theories, methods, data, and conclusions of scientific research in order to look for errors and false conclusions. 


Scientific Peer Review: Meet the "Data Thugs"


14. Each scientist proposes theories about reality based on research and then submits a finished report to a peer-reviewed journal.  There are over 50,000 such scientific journals around the world (Judson, 2004, p. 276).  Reviewers for these journals criticize research papers and decide if these papers should be published.  Once a paper is published in a journal, its findings are then evaluated, judged, and later refined through the disciplined debate of the larger scientific community (Popper, 1959/2002; Judson, 2004).  Oreskes and Conway (2010) argue that peer review is what makes science scientific: no "claim can be considered legitimate until it has undergone critical scrutiny by other experts" (p. 154). 

15. But critical debate is not enough.  Scientific truths are also evaluated with further tests designed deliberately to falsify claims.  If the scientific claim cannot be falsified by such tests, then it still remains provisionally “true,” although now accepted as more true than before (Popper, 1959/2002; Mayr, 1997, pp. 47, 51).  Most scientific claims are falsified in small ways and modified over the years so as to become more true, but rarely, if ever, completely true.  A fact is a theory that has been "repeatedly confirmed and never refuted" (Mayr, 1997, p. 61). But even theories that don't reach this final stage are still useful tools that help explain how reality might work once the evidence is found to back them up (Mayr, 1997).  As David Deutsch (1997) explains, "In science we take it for granted that even our best theories are bound to be imperfect and problematic in some ways, and we expect them to be superseded in due course by deeper, more accurate theories" (p. 17).  

16. Over the past two centuries, the practice of science has improved the lives of billions of people through increased knowledge and technology, which have increased the health of individuals and the wealth of societies.  To take but one notable example, the agricultural scientist Norman Borlaug won the Nobel Peace Prize in 1970 for developing disease-resistant and high-yield food crops in the 1950s and 1960s, which led to the Green Revolution.  This scientific improvement was a major breakthrough for the human species.  It kept hundreds of millions in the developing world from starving, and it stabilized and sustained the economic market for food (Norman, 2009).  Nobody can honestly deny the possibility of objective knowledge and the benefits that science and technology can provide. 


VIDEO: "Me & Isaac Newton"


5.2      Provisional Truths: On the Limits of Science

17. The practice of science is perhaps the greatest human innovation of all time.  However, science and technology are not unqualified goods.  Science is not perfect.  It is only the best method we have to create truth and the technology upon which we increasingly depend.   But as the practice of science advances and pushes back more and more boundaries, we must collectively acknowledge that "science does have the potential to do great harm, as well as good" (“And Man”, 2010, para. 5).  Nuclear and biological weapons, as well as environmental pollution, are all scientifically produced evils that threaten life on earth.  As Dr. John Ioannidis has stated, "The scientific enterprise is probably the most fantastic achievement in human history, but that doesn't mean we have a right to overstate what we're accomplishing" (as cited in Freedman, 2010, p. 86).  Thus, as the philosopher of science Robert Klee (1999) has argued, while science's "track record of achievements is neither errorless nor continuous," the strength of the scientific process has been continuously proven by the ability of scientists to "learn more from mistakes than from successes" (pp. 2-3). With this in mind, it is instructive to demonstrate some of the limitations of science as it is currently practiced so that we can evaluate the validity of scientific research more carefully and accurately.  There are many serious flaws in the practice of science, including limits to what scientists can and cannot accomplish, and the unintended consequences of scientific discoveries. 

18. While science helps us better understand the objective world, scientists themselves are not immune to subjectivity and culture.  Scientists can be just as irrational and biased as the rest of us (Judson, 2004, p. 148).  Physicist Freeman Dyson once humbly acknowledged, "We are scientists second and human beings first" (as cited in Finkbeiner, 2006, p. xxx).  We must never trust scientists merely because of their social status as scientists because these knowledgeable people are still human and make mistakes.  For example, many scientists, both men and women, have a strong bias against female students, believing women are less intelligent than men when it comes to practicing science (Chang, 2012).  Chang (2012) explained that this bias "probably reflected subconscious cultural influences rather than overt or deliberate discrimination" (para. 3).  Scientists also routinely make mistakes (Kahneman, 2011, p. 8), just like the rest of us, and they don't like to admit when they are wrong.  Some scientists don't question their assumptions before investigations.  Many cling to traditional scientific theories like religious "dogma" (Gray, 1995, p. 231; Kahneman, 2011, p. 9).  The widespread practice of holding on to discredited theories led philosopher of science Thomas Kuhn (1996) to assert that powerful scientific paradigms resist change, even when new evidence has proven them wrong.  One scientist acknowledged, "Even when the evidence shows that a particular research idea is wrong, if you have thousands of scientists who have invested their careers in it, they'll continue to publish papers on it" (as cited in Freedman, 2010, p. 84). 

19. Some scientists even lie and cheat, which is not science at all.  Such deceptive practices have been called pseudo-science, junk science (Jacoby, 2009, p. 210), voodoo science (Kirsch, 2010, p. 53), or just fraud (Judson, 2004).  These pseudo-scientists may willfully distort data or plagiarize the ideas of others, which of course, is a "violation" of the principles of science (Goodstein, 2010, p. 1).  Even respected scientists, doctors, and academics can cross the line into pseudoscience, especially if they are speaking through a news media outlet like a television show.  Dr. Oz is a respected Columbia University heart surgeon, but he has been accused of peddling false and misleading information. Charles Gross (2012) recounts several studies of scientific misconduct where roughly 7 to 27 percent of scientists (depending on the study) reported firsthand knowledge of "fabricated, falsified or plagiarized research over the previous ten years" (p. 26).  Another study claims that 67 percent of journal article retractions are due to ethical "misconduct" rather than scientific "errors" (Basken, 2012, para. 11).  And there is evidence that scientific misconduct has been rising in recent years, due to increased pressure to publish more research so that scientists can get better jobs, earn more research money, and gain more recognition (Zimmer, 2012). Because of such misconduct, there are a large number of retractions each year.

20. But even when there isn't a willful distortion of the data, there is still a lurking subjective tendency to acknowledge only the data that corroborates one's theory.  This type of bias, called "cooking" the data (Goodstein, 2010, p. 33), is widespread, even in published research.  A related form of bias is "publication bias," which occurs when research journals accept and legitimate only the types of claims or theories they favor, while ignoring and refusing to publish other types of research that may be unfashionable or controversial (Kirsch, 2010, p. 25).

21. Both voodoo science and the plain old variety of bad science perpetuate themselves for many reasons: the prestige of powerful scientists; the inherent bias of funding sources, which can often create conflicts of interest; and the subjective "black box" of the peer review process itself (Judson, 2004).  The scientific publishing industry has come under heavy scrutiny over the past decade.  It is well known that funding sources often bias research results.  For instance, pharmaceutical companies tend to publish only those data that support their drugs and ignore unfavorable data (Kirsch, 2010).  But less well known is the inherent bias within the peer review system itself (Judson, 2004, ch 6).  Many articles are published not solely on their merit, but because of contingent factors, such as the subjective interests of reviewers or the professional connections of the writer.  Several scientists have argued that the peer review system "rewards conformity and excludes criticism"; thus, popular consensus can often lead to mindless replication of the same conclusion (Taubes, 2007, p. 52). 

22. Scientists are also not critically analyzing each other enough through peer review.  Even when many scientists are all publishing on the same topic, they are mostly focused on their own research (and reputation), and rarely do they do the important work of retesting each others' findings to replicate results (Freedman, 2010; Gross, 2012).  A lot of errors go undetected by the scientific community because few want to do the hard and thankless work of peer review.  In fact, Dr. John Ioannidis’ meta-analysis of medical research has found that 80 percent of non-randomized studies offer false conclusions.  He also found that around 41 percent of the most cited medical research from the most prestigious journals turns out to be "wrong or significantly exaggerated" (as cited in Freedman, 2010, pp. 80-81).  Instead of doing peer review, scientists want to design their own study to attract professional acclaim and research dollars.  No one gets famous for criticizing the work of others or proving theories wrong.  As physicist David Goodstein (2010) has argued, "It's far better to prove a theory is right than to prove it is wrong" (p. 132). 
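The arithmetic behind findings like Ioannidis's can be sketched with a simple positive-predictive-value calculation. The sketch below is illustrative only; the prior, power, and significance threshold are assumed values for demonstration, not figures drawn from the studies cited above.

```python
# Illustrative sketch: why a field with reasonable-looking statistical
# standards can still publish many false "positive" findings.
# All numbers here are assumptions for demonstration purposes.

def share_of_true_positives(prior, power=0.8, alpha=0.05):
    """Fraction of statistically significant results that reflect a real
    effect, given the prior probability that a tested hypothesis is true.
    power = chance of detecting a real effect; alpha = false-positive rate."""
    true_pos = prior * power          # real effects correctly detected
    false_pos = (1 - prior) * alpha   # null effects wrongly "confirmed"
    return true_pos / (true_pos + false_pos)

# If only 1 in 10 tested hypotheses is actually true, then even with
# honest methods, a substantial share of "significant" results are false.
print(round(share_of_true_positives(prior=0.10), 2))
```

The point of the sketch is that when researchers test mostly long-shot hypotheses, false positives accumulate faster than intuition suggests, and only systematic replication and peer review can weed them out.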

23. But even good science filtered through peer review is rife with problems.  The practice of science is expensive, complex, and time consuming.  Studies ideally last years, if not decades.  This length of time creates a backlog of untested theories and a long delay before any conclusions are reached.  And because scientific studies are so complex and time intensive, they cost a lot of money.  The best studies are usually the longest and largest studies, and, of course, they cost the most.  The National Institutes of Health and the National Science Foundation are the primary sources of government funding of science in the United States.  Together they invest tens of billions of dollars every year, and the vast majority of applications (almost all quality projects) are turned down for funding.  As one economist put it, "Theory is cheap, and data are expensive" (Solow, 1997, p. 75). 

24. Good scientists are also overly influenced by research traditions.  Young scientists will generally use the theories and methods of their professors from graduate school, who also practice the same methods they had previously learned in school.  Hence, the cliché: a "school of thought."  These traditions lead to a perpetuation of older, established methods and theories, and a neglect of newer ones, even though the new methods and theories might be better.  Currently, the positivist model of empirical physical science is held up as the only true or valid model for all knowledge claims.  This theoretical model has led to the denigration or dismissal of diverse scientific practices, especially in the social sciences and humanities (Cole, 2009, pp. 100, 151-155). 

25. The practice of science is also narrowly specialized, which creates a unique problem at the heart of the scientific process.  Many scientists struggle to keep up with the vast proliferation of research in their own narrow niche and have to rely on faith in the ability of other scientists working in different fields.  As Michael Polanyi (1962) explained, "Nobody knows more than a tiny fragment of science well enough to judge its validity and value at firsthand.  For the rest [the average person] has to rely on views accepted at secondhand on the authority of a community of people accredited as scientists" (pp. 163, 216).  Professional scientists have been trained to research the objective world in a very narrow domain, and outside of this domain, scientists are often just as subjective and ignorant as the rest of us. 

26. Scientists must also have faith that all the "fragments of evidence" within their own narrow domain and across disciplines add up to some significant advance in the whole of knowledge (Taubes, 2007, p. xxii).  Most of the truths scientists produce are relevant only to other specialized scientists.  Thus, whatever progress happens is far removed from the daily lives of most human beings.  Scientific knowledge only occasionally pays "practical human dividends" in terms of solving real world problems (Toulmin, 2001, p. 79).  Most of the time, scientific knowledge cannot be used by the average person.  But it could be potentially useful to specially trained experts one day.

27. Another problem is the overly simplified outcome of scientific studies, which often focus on a handful of discrete variables in a causal chain (Wheelan, 2013, p. 2).  Thus, scientific studies either ignore or try to control for various complex systems surrounding the phenomenon being studied, which gives a false picture of how variables actually interact within densely layered and overlapping environmental levels: sub-atomic, atomic, chemical, biological, social, and ecological.  Because scientists often ignore or overly simplify the larger, contextual ecology of the real world, they aren't aware of the significant bias in their data gathering methods. 

28. For example, in the field of psychology, the traditional methods of data gathering have been so biased as to call into question the validity of the entire discipline.  Only recently did psychologists realize that the majority of their test subjects were "WEIRD" ("Experimental," 2012; Watters, 2013).  This acronym stands for Western, Educated, Industrialized, Rich, and Democratic people. For the past half century, most academic psychologists had been studying undergraduate volunteers, primarily from the United States, and making the assumption that these young American college students represented the whole of humanity.  From 2003 to 2007, approximately 96 percent of research in psychology used WEIRD test subjects (Watters, 2013, p. 49).  But Joseph Henrich and his colleagues have pointed out that these Western test subjects, who represent approximately 12 percent of the world population, have a very unique culture, which leads to very unique thought patterns and cultural beliefs.  Thus, scientific conclusions based on Western test subjects will not be valid if these conclusions are used to understand other, non-Western cultures.  Because of the extreme uniqueness of American culture, as Watters (2013) reports, researchers have “concluded that social scientists could not possibly have picked a worse population from which to draw broad generalizations” about humanity as a whole (p. 50).  Scientists need to be more aware of the larger, social and political contexts surrounding their research.

29. Scientific knowledge does not exist in a vacuum.  Not only do scientists need to be aware of larger cultural contexts in order to set up valid experiments, but scientists also need to be aware of how their culture might interpret or use their research.  Valid scientific research can be misinterpreted and/or misused by the public, especially by politicians and journalists.  During World War II, the political leaders of both Germany and America controlled the field of nuclear physics in order to use scientific research to develop weapons of mass destruction (Finkbeiner, 2006, p. 7).  American scientists during the Vietnam War helped the government create new weapons, which escalated the war and killed many civilians (Finkbeiner, 2006, p. 113).  Physicist Robert Oppenheimer helped build the first atomic bomb, which was later dropped on hundreds of thousands of innocent civilians in Japan.  He explained, "When you see something that is technically sweet, you go ahead and do it and you argue about what to do about it only after you have had your technical success" (as cited in Finkbeiner, 2006, p. 42).  However, there is a problem with this widespread attitude.  Once scientists invent potentially destructive technology, it’s hard to put the genie back in the bottle and keep it from harming human life or the environment.  Many American scientists were shocked and horrified by the dropping of the atomic bomb, which they helped to create, and yet, they bear some responsibility for the devastation it released.

30. All scientific studies have to be translated by journalists in the popular press for a mass audience, but much of this reporting turns out to be wrong or exaggerated. Untrained in scientific methods, many reporters and public intellectuals mistake pseudo-science or voodoo science for the real thing.  Thus, they mislead the public with false information (Jacoby, 2009, p. 63).   Other reporters who do have training in science often focus on eye-catching research and write about early groundbreaking results, but these reporters rarely, if ever, follow up on the peer review process that exposes the many mistakes scientists make.  When it comes to science writing in the popular press, the public is often misled, rarely getting refined scientific truth ("Journalistic," 2012). 

31. Scientists themselves can also mislead the public by oversimplifying complex research.  Take, for example, the medical research and public health policy surrounding nutrition and diet.  Gary Taubes (2007) has convincingly argued that "nutritionists for a half century oversimplified the science to the point of generating false ideas and erroneous deductions" (p. 152).  He concluded,


Practical considerations of what is too loosely defined as the "public health" have consistently been allowed to take precedence over the dispassionate, critical evaluation of evidence and the rigorous and meticulous experimentation that are required to establish reliable knowledge.  The urge to simplify a complex scientific situation so that physicians can apply it and their patients and the public embrace it has taken precedence over the scientific obligation of presenting the evidence with relentless honesty (pp. 152, 451).


Scientists, like all people who claim expertise, can also become overconfident in their knowledge and abilities, and thereby, be more prone to make mistakes.  In one study of the flawed judgment of experts, a researcher found that "people who spend their time, and earn their living, studying a particular topic produce poorer predictions than dart-throwing monkeys who would have distributed their choices evenly over the options" (Kahneman, 2011, p. 219).  There is also evidence that scientists themselves can deliberately obscure the truth for political purposes, ranging from hiding the dangers of smoking to denying the reality of global warming (Oreskes & Conway, 2010).  

32. Because of all these flaws, the social status of science, the judgments of scientists, and the products of science should not be venerated as perfect knowledge.  Scientific knowledge is never perfect.  Science "is not, and cannot be, as authoritative" as most scientists wish it to be (Lindblom & Cohen, 1979, p. 40).  Too often we believe scientists "because of their visible display of the emblems of recognized expertise and because their claims are vouched for by other experts we do not know" (Shapin, 2010, p. 88), elevating scientific claims to near divine, dogmatic status. The flaws of science are significant problems because scientists often situate their work or themselves within larger political debates.  Scientific expertise often informs political policy, which affects the public.  Laws based on flawed science (or bad information in general) can lead to serious adverse consequences for the whole society.   

33. Thus, constant vigilance and criticism are warranted.  In science, both honest mistakes and outright fraud should be expected as part of the "sloppy" nature of scientific research (Feyerabend, 2010/1975, p. 160).  As Susan Jacoby (2009) has pointed out, you should never trust anyone just because they use "scientific-sounding language" (p. 221).  You always need to critically analyze and verify the methodology, evidence, and reasoning behind every conclusion.  But even when the larger scientific community examines and verifies objective truth over time, it is important to remember that scientifically produced "truth" is never static or absolute.  There is no room for complacency or orthodoxy.  Scientifically produced truth is only provisional and "probable," as philosopher of science Hans Reichenbach argued, "whose unattainable upper and lower limits are truth and falsity" (as cited in Popper, 2002/1959, p. 6).

34. Science is built on the constant critique and revision of old, probable truths so as to continually create better or newer truths.  The sociologist Max Weber argued, "In science, each of us knows that what he has accomplished will be antiquated in ten, twenty, fifty years.  That is the fate to which science is subjected; it is the very meaning of scientific work… We cannot work without hoping that others will advance further than we have" (as cited in Judson, 2004, p. 30).  Scientific truth is in perpetual flux, albeit often within relative limits set by large bodies of data (Mayr, 1997, p. 77).

35. But even with all these flaws, the practice of science is still one of the most significant and useful human technologies ever conceived.  The philosopher of science Karl Popper went so far as to claim, "Next to music and art, science is the greatest, most beautiful and most enlightening achievement of the human spirit" (as cited in Mayr, 1997, p. 41).  Most people benefit from science without fully understanding its practice, using its products to live better, know better, and communicate more clearly.  We all benefit from the laborious work of scientists, even if we do not behave scientifically or do not completely understand scientific processes and outcomes.

36. Science is an important invention, and it will continue to be a worthy endeavor, but this "technology of truth" (Dennett, 2003, p. 6) does not eliminate the problems of subjectivity and culture.  Science needs to be understood as an important but limited tool.  We should not blindly trust science.  As David Lindley (2008) recognized, "Scientific knowledge, like our general, informal understanding of the everyday world we inhabit, can be both rational and accidental, purposeful and contingent.  Scientific truth is powerful, but not all-powerful" (p. 216).  Karl Popper (1979) concluded, "There is no absolute certainty...The quest for certainty, for a secure basis of knowledge, has to be abandoned" (p. 37).  At root, the indeterminacy of science is related to the foundational flaw of all human knowledge: We still have to subjectively interpret the objective world (Toulmin, 1961, p. 81).  Therefore, in a world without perfect and direct knowledge of the objective world, we must find the best methods to improve the quality of our knowledge.  But regardless of which method is used, all techniques are flawed and all lead to partial approximations of the real world.  Thus, all truth must be built on the provisional and tumultuous foundation of reasoned debate and human judgment.


References & Further Reading


Abrams, M. H.  (1953).  The mirror and the lamp: Romantic theory and the critical tradition.  Oxford, UK: Oxford University Press.

And man made life.  (2010, May 22).  The Economist.  Retrieved Dec. 3, 2012, from

Akerlof, G. A., & Shiller, R. J.  (2015).  Phishing for phools: The economics of manipulation and deception.  Princeton: Princeton University Press.

Ariely, D.  (2008).  Predictably irrational: The hidden forces that shape our decisions.  New York: Harper Perennial.

Aristotle.  (1995).  Rhetoric.  In Jonathan Barnes (Ed.), The complete works of Aristotle: The revised Oxford translation, Vol. 2 (pp. 2152-2269).  Princeton, NJ: Princeton University Press.

Basken, P.  (2012, Oct 1).  Misconduct, not error, found behind most journal retractions.  The Chronicle of Higher Education.  Retrieved Dec. 3, 2012, from  

Beach, J. M.  (2012).  Kenneth Burke: A sociology of knowledge: Dramatism, ideology and rhetoric.  Austin, TX: West by Southwest Press.  

Berlin, I.  (2000).  Historical inevitability.  In The proper study of mankind.  New York: Farrar, Straus and Giroux.

Bernays, E. L.  (2011).  Crystallizing public opinion.  Brooklyn, NY: Ig Publishing.  (Original work published 1923)

Bernays, E. L.  (2005).  Propaganda.  Brooklyn, NY: Ig Publishing.  (Original work published 1928)

Bloor, D.  (1991).  Knowledge and social imagery.  2nd ed.  Chicago: University of Chicago Press.

Bly, R. W.  (2005).  The copywriter’s handbook: A step-by-step guide to writing copy that sells.  3rd ed.  New York: Owl Books.

Burke, K.  (1969).  A rhetoric of motives.  Berkeley, CA: University of California Press. (Original work published 1950)

Burke, K.  (1973).  The philosophy of literary form.  Berkeley, CA: University of California Press. (Original work published 1941)

Chang, K.  (2012, Sept 24).  Bias persists for women of science, a study finds.  The New York Times.  Retrieved Dec. 3, 2012, from

Cole, J. R.  (2009).  The great American university: Its rise to preeminence, its indispensable national role, and why it must be protected.  New York: Public Affairs.

Crawford, M. B.  (2009).  Shop class as soulcraft: An inquiry into the value of work.  New York: Penguin.

D'Andrade, R.  (2002).  Cultural Darwinism and language.  American Anthropologist, 104(1), 223-232.

Deeds, not words.  (2012, Sept 15).  The Economist.  Retrieved Dec. 3, 2012, from

Demos, J.  (2008).  The enemy within: 2,000 years of witch-hunting in the western world.  New York: Viking.

Dennett, D. C.  (2003).  Freedom evolves.  New York: Viking.

Deutsch, D.  (1997). The fabric of reality: The science of parallel universes - and its implications.  New York: Allen Lane.

Diamond, J., & Robinson, J. A. (Eds.).  (2010).  Natural experiments of history.  Cambridge, MA: Harvard University Press.

Dunning, D.  (2014, Oct 27).  We are all confident idiots.  Pacific Standard: The Science of Society.  Retrieved Nov 2, 2014, from

Eagleton, T.  (1991).  Ideology: An introduction.  London: Verso.

Emerson, R. W. (1957).  Experience.  In Selections from Ralph Waldo Emerson.  Boston: Houghton Mifflin.  (Original work published 1844)

Ewen, S.  (1996).  PR! A social history of spin.  New York: Basic Books.

Experimental psychology: The roar of the crowd.  (2012, May 26).  The Economist. Retrieved Dec. 3, 2012, from

Feyerabend, P.  (2010).  Against method.  4th ed.  London: Verso.  (Original work published 1975)

Finkbeiner, A.  (2006).  The Jasons: The secret history of science's postwar elite.  New York: Viking.

Flanagan, O.  (2007).  The really hard problem: Meaning in a material world.  Cambridge, MA: MIT Press.

Flanagan, O.  (2011).  The Bodhisattva's brain: Buddhism naturalized.  Cambridge, MA: MIT Press.

Frank, T.  (2000).  One market under God: Extreme capitalism, market populism, and the end of economic democracy.  New York: Anchor Books.

Frankfurt, H. G. (2005).  On bullshit.  Princeton: Princeton University Press.

Freedman, D. H.  (2010, Nov).  Lies, damned lies, and medical science.  The Atlantic.  Retrieved Dec. 3, 2012, from

Galbraith, J. K.  (2001).  The concept of the conventional wisdom.  In The essential Galbraith.  New York: Mariner Books.

Galileo.  (1957).  The assayer.  In Discoveries and Opinions of Galileo (pp. 217-280).  New York: Anchor Books.  (Original work published 1623)

Gaukroger, S.  (2001).  Francis Bacon and the transformation of early-modern philosophy.  Cambridge, UK: Cambridge University Press.

Gay, P.  (1995).  The enlightenment: The rise of modern paganism. New York: W. W. Norton.

Geertz, C.  (2000).  Ideology as a cultural system.  In The interpretation of cultures.  New York: Basic Books.  (Original work published 1973) 

Geertz, C.  (2000).  Common sense as a cultural system.  In Local Knowledge.  New York: Basic Books.  (Original work published 1983)

Gray, J.  (1995).  Enlightenment's wake.  London: Routledge.

Greene, J. D.  (2002).  The terrible, horrible, no good, very bad truth about morality and what to do about it.  Unpublished doctoral dissertation, Princeton University, Princeton.

Gross, C. (2012, Jan 9/16).  Disgrace.  The Nation.  Retrieved Dec. 3, 2012, from

Hammersley, M., & Atkinson, P.  (2003).  Ethnography: Principles in practice.  2nd ed.  London: Routledge.  

Huff, D.  (1993).  How to lie with statistics.  New York: W. W. Norton & Company.

Hume, D. (1888).  Treatise of human nature.  Oxford, UK: Clarendon Press.  (Original work published 1739)

Igo, S. E.  (2007).  The averaged American: Surveys, citizens, and the making of a mass public.  Cambridge, MA: Harvard University Press.

Isaacson, W.  (2007).  Einstein: His life and universe.  New York: Simon & Schuster.

Jacoby, S.  (2009).  The age of American unreason.  Revised edition.  New York: Vintage.

Journalistic deficit disorder.  (2012, Sept 22).  The Economist.  Retrieved Dec. 3, 2012, from

Judson, H. F.  (2004).  The great betrayal: Fraud in science.  New York: Harcourt.

Kahneman, D.  (2011).  Thinking, fast and slow.  New York: Farrar, Straus and Giroux.

Kant, I.  (1994).  Critique of pure reason.  London: Everyman's Library. (Original work published 1781)

Kihlstrom, J. F.  (2013, Spring).  Threats to reason in moral judgment.  The Hedgehog Review, 15(1), 8-18.

Kirsch, I.  (2010).  The emperor's new drugs: Exploding the antidepressant myth. New York: Basic Books.

Klee, R.  (1999).  Introduction.  In R. Klee (Ed.), Scientific inquiry: Readings in the philosophy of science (pp. 1-4).  Oxford: Oxford University Press.

Klein, J.  (2003).  Francis Bacon.  Stanford Encyclopedia of Philosophy.  Retrieved Dec. 3, 2012, from entries/francis-bacon/

Kuhn, T.  (1996).  The structure of scientific revolutions. 3rd ed.  Chicago: University of Chicago Press.

Levitt, S. D., & Dubner, S. J.  (2009).  Freakonomics: A rogue economist explores the hidden side of everything.  New York: Harper Perennial.

Lindblom, C. E.  (1990).  Inquiry and change: The troubled attempt to understand and shape society.  New Haven: Yale University Press.

Lindblom, C. E., & Cohen, D. K.  (1979). Usable knowledge: Social science and social problem solving.  New Haven: Yale University Press.

Lindley, D.  (2008).  Uncertainty: Einstein, Heisenberg, Bohr, and the struggle for the soul of science.  New York: Anchor Books.

Lindstrom, M.  (2011).  Brandwashed: Tricks companies use to manipulate our minds and persuade us to buy.  New York: Crown.

Lindstrom, M.  (2010).  Buyology: Truth and lies about why we buy.  New York: Crown.

Lippmann, W.  (1997).  Public opinion.  New York: Free Press.  (Original work published 1922)

Malkiel, B. G.  (2012).  A random walk down Wall Street: The time-tested strategy for successful investing.  New York: W. W. Norton & Company.

Mayr, E. (1997).  This is biology: The science of the living world.  Cambridge, MA: Harvard University Press.

Mearsheimer, J. J. (2011).  Why leaders lie: The truth about lying in international politics.  Oxford, UK: Oxford University Press.

Morgan, E. S.  (1988).  Inventing the people: The rise of popular sovereignty in England and America.  New York: W. W. Norton.

Moss, M.  (2013, Feb 20).  The extraordinary science of addictive junk food.  The New York Times.  Retrieved Feb 21 from

Norman Borlaug.  (2009, Sept 19).  The Economist.  Retrieved Dec. 3, 2012, from

Nussbaum, M. C. (1997).  Cultivating humanity: A classical defense of reform in liberal education.  Cambridge, MA: Harvard University Press.

Oreskes, N., & Conway, E. M.  (2010).  Merchants of doubt: How a handful of scientists obscured the truth on issues from tobacco smoke to global warming.  New York: Bloomsbury Press.

Packard, V.  (2007).  The hidden persuaders.  Brooklyn, NY: Ig Publishing.  (Original work published 1957)

Pinker, S.  (1997).  How the mind works.  New York: W. W. Norton.

Pinker, S.  (2002).  The blank slate: The modern denial of human nature. New York: Penguin.

Plato.  (1997).  Republic.  In J. M. Cooper (Ed.), Plato: Complete works (pp. 971-1223).  Indianapolis, IN: Hackett Publishing.

Polanyi, M.  (1962).  Personal knowledge: Towards a post-critical philosophy.  Chicago, IL: University of Chicago Press.

Polanyi, M.  (1964).  Science, faith and society: A searching examination of the meaning and nature of scientific inquiry.  Chicago, IL: University of Chicago Press.

Popkin, S. L.  (1994).  The reasoning voter: Communication and persuasion in presidential campaigns.  2nd ed.  Chicago, IL: University of Chicago Press.

Popper, K.  (2002). The logic of scientific discovery.  London: Routledge.  (Original work published 1959)

Popper, K.  (1979).  Objective knowledge: An evolutionary approach.  Revised edition. Oxford: Oxford University Press.

Reitman, J. (Director & Writer).  (2005).  Thank you for smoking.  United States: Fox Searchlight Pictures.

Rorty, R.  (1979).  Philosophy and the mirror of nature.  Princeton, NJ: Princeton University Press. 

Sachs, J. D.  (2011).  The price of civilization: Reawakening American virtue and prosperity.  New York: Random House.

Sagan, C.  (1996).  The demon-haunted world: Science as a candle in the dark.  New York: Random House.

Sen, A. (2009).  The idea of justice.  Cambridge, MA: Harvard University Press.

Shapin, S.  (2010). Never pure: Historical studies of science as if it was produced by people with bodies, situated in time, space, culture, and society, and struggling for credibility and authority.  Baltimore, MD: Johns Hopkins University Press.

Shenkman, R.  (2008).  Just how stupid are we? Facing the truth about the American voter.  New York: Basic Books.

Solow, R. M.  (1997).  How did economics get that way and what way did it get?  In T. Bender & C. E. Schorske (Eds.), American academic culture in transformation (pp. 57-76). Princeton, NJ: Princeton University Press.

Taubes, G.  (2007).  Good calories, bad calories.  New York: Anchor. 

Thaler, R. H.  (2015).  Misbehaving: The making of behavioral economics.  New York: W. W. Norton.

Thaler, R. H., & Sunstein, C. R. (2008).  Nudge: Improving decisions about health, wealth, and happiness.  New Haven: Yale University Press.  

The death of facts in an age of truthiness.  (2012, April 29).  National Public Radio.  Retrieved Dec. 3, 2012, from 

Toulmin, S.  (1958).  The uses of argument.  Cambridge, UK: Cambridge University Press.

Toulmin, S. (1961).  Foresight and understanding: An inquiry into the aims of science.  New York: Harper.

Toulmin, S.  (2001).  Return to reason.  Cambridge, MA: Harvard University Press.

Toulmin, S., Rieke, R., & Janik, A.  (1979).  An introduction to reasoning.  New York: Macmillan.

Tye, L.  (1998).  The father of spin: Edward L. Bernays & the birth of public relations.  New York: Crown.

Watters, E.  (2013, March/April).  We aren’t the world.  Pacific Standard, 46-53.

Wheelan, C.  (2013).  Naked statistics: Stripping the dread from the data.  New York: W. W. Norton.

Zimmer, C.  (2012, April 16).  A sharp rise in retractions prompts calls for reform.  The New York Times.  Retrieved Dec. 3, 2012, from


To cite this chapter in a reference page using APA:

Beach, J. M.  (2013).  The search for objective knowledge: Critically evaluating science.  In 21st century literacy: Constructing & debating knowledge.  Retrieved date from


To cite this chapter in an in-text citation using APA:

(Beach, 2013, ch 5, para. #).



© J. M. Beach 2013, Revised 2016