When it comes to the possibility of something happening, history by now offers a pretty impressive palette of different types of events. We might think of these as ‘known unknowns’, as it is clearly within our reach to form an understanding of them. What even the most creative and superbly educated mind cannot envision, however, is the magnitude of the consequences of those events as they would play out in the present day. The dynamics are truly impossible to imagine, and hence so are the consequences. History repeats itself, but only for the most part. With respect to the consequences, most of us are suckers, especially when it comes to our own lives and times. Whenever something impactful but entirely conceivable hits our vicinity, we are stunned no matter what.
The previous section referred briefly to the ‘not in our lifetime’ perspective. Let us linger on this point for a bit, as it is one of the keys to understanding Black Swans and why we are essentially born suckers. Most of us will freely admit that humanity is in for one disaster or another. Sooner or later, that asteroid will knock us out of our pants, for sure, but it is always later, somewhere out in the distant future. It is not going to happen in my lifetime. Why? Because I am somehow special. Stuff only happens to other people, whereas I am destined to lead a glorious and comfortable existence. Based on such egocentric beliefs, we might coolly concede that in the larger scheme of things something is sure to happen, yet almost completely discount the possibility as far as our own lifetime and corner of the world are concerned. Not that we always say so publicly or even think in those terms outright; it is more of a tacit assumption.
This ‘because I'm special’ protective mechanism goes a long way in setting the stage for Black Swans. However, it is only one of the many ways in which our outlook is warped, which brings us to the long catalogue of biases that have been identified and described by scholars. A bias can be described as a predisposition to make a mistake in a decision-making situation, because it leads us away from the decision that would be taken by a rational and well-informed person who diligently weighs the pros and cons. What biases tend to have in common is that they make us more of a sucker than we need to be. They are a staple of business books nowadays (those on risk management in particular) and may bore the educated reader, but since they are so fundamental to the concept of Black Swans, we must briefly review them nonetheless. What follows is a non-exhaustive list of well-documented biases that in various ways contribute to the Black Swan phenomenon. As is commonly pointed out, these biases have mostly worked to our advantage over the long evolutionary haul, but they are often liabilities in the unnatural and complex environment we find ourselves in today.
The narrative fallacy
In explaining why we are so poorly equipped to deal with randomness, Taleb focuses on what he refers to as ‘the narrative fallacy’, which he defines as ‘our need to fit a story or pattern to a series of connected or disconnected facts’ (Taleb, 2007, p. 309). We invent causes for the things we observe in order to satisfy our need for coherent explanations. It turns out that we do not suffer dissonance gladly, so our brain will supply any number of rationalizations to connect the dots. By reducing the number of dimensions of the problem at hand and creating a neat narrative, we make things more orderly. Everything starts to hang together and make sense, and that is how the dissonance is resolved. Since we are lazy as well, we often settle on the rationalization that satisfies our craving with the least resistance. However, when we force causal interpretations on our reality and invent stories that satisfy our need for explanations, we make ourselves blind to powerful mechanisms that lie outside these simple narratives.
Confirmation bias
This is one of the leading causes of Swan-blindness discussed in The Black Swan, where Taleb refers to confirmation as ‘a dangerous error’ (Taleb, 2007, p. 51). It has to do with the general tendency to adopt a theory or idea and then start looking for evidence that corroborates it. When we suffer from this bias, all the incoming data seems, as if by magic, to confirm that the belief we hold is correct; that the theory we are so fond of is indeed true. Whatever instances contradict the theory are brushed aside, ignored, or re-interpreted (tweaked) in a way that supports our pre-existing beliefs. Out the window goes Karl Popper's idea of falsification, the true marker of science and open inquiry. Using falsification as a criterion, a theory is discarded once the evidence contradicting it becomes undeniable. In the specific context of managing risks, the confirmation bias is a problem because we will be too prone to interpret incoming observations of stability as suggesting that the future will be similarly benign.
The optimistic bias
Research has shown that humans tend to view the world as more benign than it really is. Consequently, in a decision-making situation, people tend to produce plans and forecasts that are unrealistically close to a best-case scenario.14 The evidence shows that this is a bias with major consequences for risk taking. In the words of Professor Daniel Kahneman (2011): ‘The evidence suggests that an optimistic bias plays a role – sometimes the dominant role – whenever individuals or institutions voluntarily take on significant risks. More often than not, risk takers underestimate the odds they face, and do not invest sufficient effort to find out what they are.’15 Pondering on extreme and possibly calamitous outcomes will clearly not be a priority for an individual with an optimistic bent. Taking a consistently rosy view distorts expectations and therefore invites the Black Swan.
The myopia bias
Myopia, in the literature on the psychology of judgement, refers to the tendency to focus more on short-term consequences than on long-term implications. Because of our desire for instant gratification, we tend to place much less weight on future gains and losses than on those in the near term. Professors Meyer and Kunreuther call this the most ‘crippling’ of all biases, resulting in gross underpreparedness for disasters that could have been mitigated with relatively simple measures.16 This was the case, for example, with the tsunami in the Indian Ocean in 2004. Only a few years prior, relatively inexpensive mitigation measures had been discussed in Thailand – and dismissed. The reason? There were many, but among other things there was a worry that such measures might cause unnecessary alarm among tourists. Such minuscule short-term benefits won out over preparedness for an event with colossal consequences.
The overconfidence bias
Humans are prone to overrate their own abilities and the level of control they have over a situation. The typical way of exemplifying this tendency is to point to the fact that nearly everyone considers himself an above-average driver. Taleb prefers the more humorous example of how most French people rate themselves well above the rest in terms of the art of love-making (Taleb, 2007, p. 153). As for the effect of overconfidence on decision-making, it is profound – and not in a favourable way. Professor Scott Plous (1993) argues that a large number of catastrophic events, such as the Chernobyl nuclear accident and the Space Shuttle Challenger explosion, can be traced to overconfidence. He offers the following summary: ‘No problem […] in decision-making is more prevalent and more potentially catastrophic than overconfidence.’17 Overconfidence has been used to explain a wide range of observed phenomena, such as entrepreneurial market entry and trading in financial markets, despite available data suggesting high failure rates.