I remind the reader that I am not testing how much people know, but assessing the difference between what people actually know and how much they think they know. I am reminded of a measure my mother concocted, as a joke, when I decided to become a businessman. Being ironic about my (perceived) confidence, though not necessarily unconvinced of my abilities, she found a way for me to make a killing. How? Someone who could figure out how to buy me at the price I am truly worth and sell me at what I think I am worth would be able to pocket a huge difference. Though I keep trying to convince her of my internal humility and insecurity concealed under a confident exterior; though I keep telling her that I am an introspector—she remains skeptical. Introspector shmintrospector, she still jokes at the time of this writing that I am a little ahead of myself.
BLACK SWAN BLINDNESS REDUX
The simple test above suggests the presence of an ingrained tendency in humans to underestimate outliers—or Black Swans. Left to our own devices, we tend to think that what happens every decade in fact only happens once every century, and, furthermore, that we know what’s going on.
This miscalculation problem is a little more subtle. In truth, outliers are not as sensitive to underestimation since they are fragile to estimation errors, which can go in both directions. As we saw in Chapter 6, there are conditions under which people overestimate the unusual or some specific unusual event (say when sensational images come to their minds)—which, we have seen, is how insurance companies thrive. So my general point is that these events are very fragile to miscalculation, with a general severe underestimation mixed with an occasional severe overestimation.
The errors get worse with the degree of remoteness to the event. So far, we have only considered a 2 percent error rate in the game we saw earlier, but if you look at, say, situations where the odds are one in a hundred, one in a thousand, or one in a million, then the errors become monstrous. The longer the odds, the larger the epistemic arrogance.
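To make the point concrete, here is a minimal numerical sketch of my own (an illustration, not a claim from the text), assuming a Gaussian model and a hypothetical 20 percent error in the estimated standard deviation. It computes how far off a forecaster's tail-probability estimate becomes as the event in question grows more remote:

```python
# Sketch with hypothetical numbers: how a modest error in an estimated
# standard deviation distorts Gaussian tail probabilities, and how the
# distortion grows with the remoteness of the event.
from math import sqrt, erfc

def tail_prob(threshold, sigma):
    """P(X > threshold) for a zero-mean Gaussian with standard deviation sigma."""
    return 0.5 * erfc(threshold / (sigma * sqrt(2.0)))

TRUE_SIGMA = 1.0
for assumed_sigma in (0.8, 1.2):      # a 20% under- and overestimate
    for k in (2, 3, 4, 5):            # increasingly remote events
        assumed = tail_prob(k, assumed_sigma)
        actual = tail_prob(k, TRUE_SIGMA)
        print(f"assumed sigma {assumed_sigma}, event {k} true std devs out: "
              f"assumed probability = {assumed / actual:.2g} x actual")
```

Under these assumed numbers, underestimating the volatility by 20 percent makes a five-sigma event look roughly a thousand times rarer than it actually is, while the same mistake at two sigmas is off by only a factor of a few; overestimating the volatility misleads just as badly in the other direction.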
Note here one particularity of our intuitive judgment: even if we lived in Mediocristan, in which large events are rare (and, mostly, inconsequential), we would still underestimate extremes—we would think that they are even rarer. We underestimate our error rate even with Gaussian variables. Our intuitions are sub-Mediocristani. But we do not live in Mediocristan. The numbers we are likely to estimate on a daily basis belong largely in Extremistan, i.e., they are run by concentration and subjected to Black Swans.
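To give the contrast a rough numerical face (again my own sketch, with hypothetical parameters), compare the tails of a Gaussian, the emblem of Mediocristan, with those of a Pareto distribution, a standard stand-in for Extremistan's power-law behavior:

```python
# Sketch with hypothetical parameters: remote events in Mediocristan
# (Gaussian tails) versus Extremistan (Pareto tails, which decay as a power law).
from math import sqrt, erfc

def gaussian_tail(k):
    """P(X > k) for a standard Gaussian."""
    return 0.5 * erfc(k / sqrt(2.0))

def pareto_tail(k, alpha=1.5, x_min=1.0):
    """P(X > k) for a Pareto distribution with scale x_min and tail exponent alpha."""
    return (x_min / k) ** alpha if k > x_min else 1.0

for k in (2, 5, 10, 20):
    print(f"{k:>2} units out: Gaussian {gaussian_tail(k):.1e}, Pareto {pareto_tail(k):.1e}")
```

Twenty units out, the Gaussian assigns the event a probability so small it is effectively zero, while the Pareto tail still gives it about a one percent chance: in Extremistan, the remote stays routine.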
Guessing and Predicting
There is no effective difference between my guessing a variable that is not random, but for which my information is partial or deficient, such as the number of lovers who transited through the bed of Catherine II of Russia, and predicting a random one, like tomorrow’s unemployment rate or next year’s stock market. In this sense, guessing (what I don’t know, but what someone else may know) and predicting (what has not taken place yet) are the same thing.
To further appreciate the connection between guessing and predicting, assume that instead of trying to gauge the number of lovers of Catherine of Russia, you are estimating the less interesting but, for some, more important question of the population growth for the next century, the stock-market returns, the social-security deficit, the price of oil, the results of your great-uncle’s estate sale, or the environmental conditions of Brazil two decades from now. Or, if you are the publisher of Yevgenia Krasnova’s book, you may need to produce an estimate of the possible future sales. We are now getting into dangerous waters: just consider that most professionals who make forecasts are also afflicted with the mental impediment discussed above. Furthermore, people who make forecasts professionally are often more affected by such impediments than those who don’t.
INFORMATION IS BAD FOR KNOWLEDGE
You may wonder how learning, education, and experience affect epistemic arrogance—how educated people might score on the above test, as compared with the rest of the population (using Mikhail the cabdriver as a benchmark). You will be surprised by the answer: it depends on the profession. I will first look at the advantages of the “informed” over the rest of us in the humbling business of prediction.
I recall visiting a friend at a New York investment bank and seeing a frenetic hotshot “master of the universe” type walking around with a set of wireless headphones wrapped around his ears and a microphone jutting out of the right side that prevented me from focusing on his lips during my twenty-second conversation with him. I asked my friend the purpose of that contraption. “He likes to keep in touch with London,” I was told. When you are employed, hence dependent on other people’s judgment, looking busy can help you claim responsibility for the results in a random environment. The appearance of busyness reinforces the perception of causality, of the link between results and one’s role in them. This of course applies even more to the CEOs of large companies who need to trumpet a link between their “presence” and “leadership” and the results of the company. I am not aware of any studies that probe the usefulness of their time being invested in conversations and the absorption of small-time information—nor have too many writers had the guts to question how large the CEO’s role is in a corporation’s success.
Let us discuss one main effect of information: impediment to knowledge.
Aristotle Onassis, perhaps the first mediatized tycoon, was principally famous for being rich—and for exhibiting it. An ethnic Greek refugee from southern Turkey, he went to Argentina, made a lump of cash by importing Turkish tobacco, then became a shipping magnate. He was reviled when he married Jacqueline Kennedy, the widow of the American president John F. Kennedy, which drove the heartbroken opera singer Maria Callas to immure herself in a Paris apartment to await death.
If you study Onassis’s life, which I spent part of my early adulthood doing, you will notice an interesting regularity: “work,” in the conventional sense, was not his thing. He did not even bother to have a desk, let alone an office. He was not just a dealmaker, which does not necessitate having an office, but he also ran a shipping empire, which requires day-to-day monitoring. Yet his main tool was a notebook, which contained all the information he needed. Onassis spent his life trying to socialize with the rich and famous, and to pursue (and collect) women. He generally woke up at noon. If he needed legal advice, he would summon his lawyers to some nightclub in Paris at two A.M. He was said to have an irresistible charm, which helped him take advantage of people.
Let us go beyond the anecdote. There may be a “fooled by randomness” effect here, of making a causal link between Onassis’s success and his modus operandi. I may never know if Onassis was skilled or lucky, though I am convinced that his charm opened doors for him, but I can subject his modus to a rigorous examination by looking at empirical research on the link between information and understanding. So this statement, additional knowledge of the minutiae of daily business can be useless, even actually toxic, is indirectly but quite effectively testable.
Show two groups of people a blurry image of a fire hydrant, blurry enough for them not to recognize what it is. For one group, increase the resolution slowly, in ten steps. For the second, do it faster, in five steps. Stop at a point where both groups have been presented an identical image and ask each of them to identify what they see. The members of the group that saw fewer intermediate steps are likely to recognize the hydrant much faster. Moral? The more information you give someone, the more hypotheses they will formulate along the way, and the worse off they will be. They see more random noise and mistake it for information.