The anti-vaccination movement indicates not only the power of anti-government feelings but also important aspects of human thinking in general. We don’t like to live with uncertainties and the unknown. We want answers. My child has autism, and I want to know why. What caused it? For many, “we don’t know” is not acceptable, and they seek some sort of answer. Johann Wolfgang von Goethe said, “Mysteries are not necessarily miracles.” But for bad things, we look beyond miracles and often settle on conspiracies. Autism had to be caused by something, and the vaccine is blamed. We will make up an answer if no answer is available.

Misinformation that has taken root is hard to eradicate. Once we believe something, we want to continue to hang on to the belief. Instances of this are legion, and all of us can cite particulars. This trait permeates all strata of society. For example, a recent article by Gina Kolata reports that nearly 400 routine medical “practices were flatly contradicted by studies published in leading journals.” An example: Ginkgo biloba is widely promoted as a memory aid, but a large study published in 2008 “definitively showed the supplement is useless for this purpose. Yet Gingko still pulls in $249 million in sales.” In spite of the scientific research, the ginkgo biloba myth lives. What T. H. Huxley said about science, indicating how it advances, should be a benchmark for all of society: “The great tragedy of science—the slaying of a beautiful hypothesis by an ugly fact.” But all too often facts do not win out over what we believe.

Moreover, misinformation can easily take root because our instinctive reactions to empirical propositions are often wrong and because our education often does not do enough to train us to think more clearly about empirical propositions. The ground-breaking work of psychologists and economists Daniel Kahneman and Amos Tversky shows that human judgment and decision-making are frequently flawed in predictable ways. Of course, many people know of their studies, but they should be even more widely known than they are. Their research applies not merely to some academic or specialized field, but to all of us as we make judgments about the world in all aspects of our lives. Kahneman and Tversky’s seminal 1982 book written with Paul Slovic, Judgment Under Uncertainty: Heuristics and Biases, is not only readable but also fun, as is Kahneman’s 2011 Thinking, Fast and Slow. Critical thinking should be part of our basic education, and the wit, wisdom, and research of Kahneman and Tversky should be included.

Critical thinking would also be advanced by the increased teaching of probability and statistics in our schools. This thought stems from my own high school math education, which had what might be considered the usual geometry, trigonometry, and algebra. These all helped my thinking, and while I have probably used that geometry more than I realize, I am like many others who say trig and algebra haven’t cropped up much in their adult lives. But I also took a course with units on calculus, set theory, and probability and statistics. As I look back, I realize that probability and statistics has been most important to my thinking through the years.

As most of us do, I encounter news of polls and medical and scientific studies, and making sense of them requires some understanding of probability and statistics, as does much of the sports information I absorb. But most important was that P & S taught me important things about critical thinking in general, and it taught me to recognize my own and others’ sloppy thinking when we make factual assertions.
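To make the poll example concrete, here is a small Python sketch (with made-up numbers, not drawn from any actual poll) of the standard back-of-the-envelope margin-of-error calculation that headlines about a “52–48 lead” usually omit:

```python
import math

def poll_margin_of_error(sample_size, proportion=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random-sample poll."""
    return z * math.sqrt(proportion * (1 - proportion) / sample_size)

# A poll of 1,000 respondents carries roughly a +/-3 point margin of
# error, so a reported 52%-48% split may not reflect a real lead.
moe = poll_margin_of_error(1000)
print(f"Margin of error: +/-{moe * 100:.1f} percentage points")
```

The point is not the formula itself but the habit it builds: asking how much of a reported difference could be sampling noise before treating it as fact.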

Even before reading Kahneman and Tversky, I had realized how bad intuitions, including mine, were about empirical propositions and that something more objective than gut feelings was needed. Probability and statistics helped me realize, for example, that a temporal sequence does not necessarily mean cause and effect and that correlation and causation are separate concepts. Nevertheless, we all too easily fail to recognize our flawed causational judgments. A case in point: Just because autism is diagnosed after the MMR vaccine is given does not establish that the vaccine causes autism.
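A toy simulation makes the point vivid. In this Python sketch (all numbers invented for illustration), vaccination and diagnosis are generated completely independently, yet every diagnosis still comes after the vaccine, simply because children are vaccinated around age one and diagnosed around age three:

```python
import random

random.seed(0)

# Simulate 100,000 children. Everyone gets the MMR vaccine around
# age 1; a fixed 1% are diagnosed with autism around age 3,
# *independently* of vaccination (no causal link is built in).
n = 100_000
diagnosed = 0
diagnosed_after_vaccine = 0
for _ in range(n):
    vaccine_age = random.uniform(0.9, 1.5)
    autism = random.random() < 0.01   # independent of the vaccine
    if autism:
        diagnosis_age = random.uniform(2.0, 4.0)
        diagnosed += 1
        if diagnosis_age > vaccine_age:
            diagnosed_after_vaccine += 1

# Every diagnosis follows the vaccine in time, yet by construction
# the vaccine causes nothing here.
print(diagnosed_after_vaccine == diagnosed)  # True
```

Perfect temporal sequence, zero causation: exactly the trap that intuition falls into.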

I learned from probability and statistics that comparisons are often needed to advance empirical knowledge, but devising and assembling adequate control groups can be a tricky business. And we need to have some understanding of the statistics used to make the comparisons.
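One such comparison statistic can be sketched in a few lines. In this hypothetical Python example (the scores are fabricated), a “ginkgo” group and a placebo control are drawn from the same distribution, so any gap between them is pure noise; a simple permutation test then asks how often randomly shuffled labels produce a gap at least that large:

```python
import random

random.seed(1)

# Hypothetical memory-test scores: both groups drawn from the SAME
# distribution, so any observed gap between them is chance.
treated = [random.gauss(50, 10) for _ in range(30)]
control = [random.gauss(50, 10) for _ in range(30)]

def mean(xs):
    return sum(xs) / len(xs)

observed = abs(mean(treated) - mean(control))

# Permutation test: shuffle the group labels many times and count how
# often the shuffled gap is at least as large as the observed one.
pooled = treated + control
trials = 5000
extreme = 0
for _ in range(trials):
    random.shuffle(pooled)
    gap = abs(mean(pooled[:30]) - mean(pooled[30:]))
    if gap >= observed:
        extreme += 1

p_value = extreme / trials
print(f"observed gap: {observed:.2f}, p-value: {p_value:.3f}")
```

A large p-value here says the gap is unremarkable under chance alone, which is the sort of check a proper control-group study supplies and an anecdote never can.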

However, many of us who have not been deeply trained in science and math simply throw up our hands when we encounter either and fall back on some shortcut to determine what we decide to believe. My law school teaching showed me that time and again.

(concluded July 26)
