And as I’ve said, we can commit a logical mistake in reality but not in the classroom. This asymmetry is best visible in cancer detection. Take doctors examining a patient for signs of cancer; tests are typically done on patients who want to know if they are cured or if there is “recurrence.” (In fact, recurrence is a misnomer; it simply means that the treatment did not kill all the cancerous cells and that these undetected malignant cells have started to multiply out of control.) It is not feasible, in the present state of technology, to examine every single one of the patient’s cells to see if all of them are nonmalignant, so the doctor takes a sample by scanning the body with as much precision as possible. Then she makes an assumption about what she did not see. I was once taken aback when a doctor told me after a routine cancer checkup, “Stop worrying, we have evidence of cure.” “Why?” I asked. “There is evidence of no cancer” was the reply. “How do you know?” I asked. He replied, “The scan is negative.” Yet he went around calling himself doctor!
An acronym used in the medical literature is NED, which stands for No Evidence of Disease. There is no such thing as END, Evidence of No Disease. Yet my experience discussing this matter with plenty of doctors, even those who publish papers on their results, is that many slip into the round-trip fallacy during conversation.
Doctors in the midst of the scientific arrogance of the 1960s looked down at mothers’ milk as something primitive, as if it could be replicated by their laboratories—not realizing that mothers’ milk might include useful components that could have eluded their scientific understanding—a simple confusion of absence of evidence of the benefits of mothers’ milk with evidence of absence of the benefits (another case of Platonicity as “it did not make sense” to breast-feed when we could simply use bottles). Many people paid the price for this naïve inference: those who were not breast-fed as infants turned out to be at an increased risk of a collection of health problems, including a higher likelihood of developing certain types of cancer—there had to be in mothers’ milk some necessary nutrients that still elude us. Furthermore, benefits to mothers who breast-feed were also neglected, such as a reduction in the risk of breast cancer.
Likewise with tonsils: the removal of tonsils may lead to a higher incidence of throat cancer, but for decades doctors never suspected that this “useless” tissue might actually have a use that escaped their detection. The same with the dietary fiber found in fruits and vegetables: doctors in the 1960s found it useless because they saw no immediate evidence of its necessity, and so they created a malnourished generation. Fiber, it turns out, acts to slow down the absorption of sugars in the blood and scrapes the intestinal tract of precancerous cells. Indeed medicine has caused plenty of damage throughout history, owing to this simple kind of inferential confusion.
I am not saying here that doctors should not have beliefs, only that some kinds of definitive, closed beliefs need to be avoided—this is what Menodotus and his school seemed to be advocating with their brand of skeptical-empirical medicine that avoided theorizing. Medicine has gotten better—but many kinds of knowledge have not.
Evidence
By a mental mechanism I call naïve empiricism, we have a natural tendency to look for instances that confirm our story and our vision of the world—these instances are always easy to find. Alas, with tools, and fools, anything can be easy to find. You take past instances that corroborate your theories and you treat them as evidence. For instance, a diplomat will show you his “accomplishments,” not what he failed to do. Mathematicians will try to convince you that their science is useful to society by pointing out instances where it proved helpful, not those where it was a waste of time, or, worse, those numerous mathematical applications that inflicted a severe cost on society owing to the highly unempirical nature of elegant mathematical theories.
Even in testing a hypothesis, we tend to look for instances where the hypothesis proved true. Of course we can easily find confirmation; all we have to do is look, or have a researcher do it for us. I can find confirmation for just about anything, the way a skilled London cabbie can find traffic to increase the fare, even on a holiday.
Some people go further and give me examples of events that we have been able to foresee with some success—indeed there are a few, like landing a man on the moon and the economic growth of the twentieth century. One can find plenty of “counterevidence” to the points in this book, the best being that newspapers are excellent at predicting movie and theater schedules. Look, I predicted yesterday that the sun would rise today, and it did!
NEGATIVE EMPIRICISM
The good news is that there is a way around this naïve empiricism. I am saying that a series of corroborative facts is not necessarily evidence. Seeing white swans does not confirm the nonexistence of black swans. There is an exception, however: I know what statement is wrong, but not necessarily what statement is correct. If I see a black swan I can certify that not all swans are white! If I see someone kill, then I can be practically certain that he is a criminal. If I don’t see him kill, I cannot be certain that he is innocent. The same applies to cancer detection: the finding of a single malignant tumor proves that you have cancer, but the absence of such a finding cannot allow you to say with certainty that you are cancer-free.
We can get closer to the truth by negative instances, not by verification! It is misleading to build a general rule from observed facts. Contrary to conventional wisdom, our body of knowledge does not increase from a series of confirmatory observations, like the turkey’s. But there are some things I can remain skeptical about, and others I can safely consider certain. This makes the consequences of observations one-sided. It is not much more difficult than that.
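To see the one-sidedness in bare form, here is a minimal logical sketch of the two inference patterns involved, with H standing for a hypothesis (“all swans are white”) and O for an observable consequence of it (“the next swan I see is white”); the notation is added here for illustration and is not the author’s.

\[
\text{Modus tollens (valid): } \big(H \Rightarrow O\big) \wedge \neg O \;\vdash\; \neg H
\]
\[
\text{Affirming the consequent (invalid): } \big(H \Rightarrow O\big) \wedge O \;\not\vdash\; H
\]

A single black swan supplies ¬O and so refutes H; any number of white swans supplies only O, which leaves H undecided.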
This asymmetry is immensely practical. It tells us that we do not have to be complete skeptics, just semiskeptics. The subtlety of real life over the books is that, in your decision making, you need be interested only in one side of the story: if you seek certainty about whether the patient has cancer, not certainty about whether he is healthy, then you might be satisfied with negative inference, since it will supply you the certainty you seek. So we can learn a lot from data—but not as much as we expect. Sometimes a lot of data can be meaningless; at other times one single piece of information can be very meaningful. It is true that a thousand days cannot prove you right, but one day can prove you to be wrong.
The person who is credited with the promotion of this idea of one-sided semiskepticism is Sir Doktor Professor Karl Raimund Popper, who may be the only philosopher of science who is actually read and discussed by actors in the real world (though not as enthusiastically by professional philosophers). As I am writing these lines, a black-and-white picture of him is hanging on the wall of my study. It was a gift I got in Munich from the essayist Jochen Wegner, who, like me, considers Popper to be about all “we’ve got” among modern thinkers—well, almost. He writes to us, not to other philosophers. “We” are the empirical decision makers who hold that uncertainty is our discipline, and that understanding how to act under conditions of incomplete information is the highest and most urgent human pursuit.
Popper generated a large-scale theory around this asymmetry, based on a technique called “falsification” (to falsify is to prove wrong) meant to distinguish between science and nonscience, and people immediately started splitting hairs about its technicalities, even though it is not the most interesting, or the most original, of Popper’s ideas. This idea about the asymmetry of knowledge is so well liked by practitioners because it is obvious to them; it is the way they run their business. The philosopher maudit Charles Sanders Peirce, who, like an artist, got only posthumous respect, also came up with a version of this Black Swan solution when Popper was wearing diapers—some people even called it the Peirce-Popper approach. Popper’s far more powerful and original idea is the “open” society, one that relies on skepticism as a modus operandi, refusing and resisting definitive truths. He accused Plato of closing our minds, according to the arguments I described in the Prologue. But Popper’s biggest idea was his insight concerning the fundamental, severe, and incurable unpredictability of the world, and that I will leave for the chapter on prediction.*