Tom Phillips - Humans - A Brief History of How We F*cked It All Up

Toronto: Hanover Square Press, 2019.


“A thoroughly entertaining account of human follies and foibles from ancient times to the present.”


Wherever it comes from, it’s an idea that’s been fixed in culture for a long time. And once you’ve been told about the idea that the full moon means crazytime, you’re much more likely to remember all the times that it did happen—and forget the times it didn’t. Without meaning to, your brain has created a pattern out of randomness.

Again, this is because of those mental shortcuts our brains use. Two of the main shortcuts are the “anchoring heuristic” and the “availability heuristic,” and they both cause us no end of bother.

Anchoring means that when you make up your mind about something, especially if you don’t have much to go on, you’re disproportionately influenced by the first piece of information you hear. For example, imagine you’re asked to estimate how much something costs, in a situation where you’re unlikely to have the knowledge to make a fully informed judgment—say, a house you’re shown a picture of. (Note for millennials: houses are those big things made of bricks you’ll never be able to buy.) Without anything else to go on, you might just look at the picture, see roughly how fancy it looks and make a wild stab in the dark. But your guess can be dramatically skewed if you’re given a suggested figure to begin with—for example, in the form of a preceding question such as “Do you think this house is worth more or less than $400,000?” Now, it’s important to realize that question hasn’t actually given you any useful information at all (it’s not like, say, being told what other houses in the area have recently sold for). And yet people who get prompted with a figure of $600,000 will end up estimating the house’s value much higher on average than people who are prompted with $200,000. Even though the preceding question isn’t informative at all, it still affects your judgment, because you’ve been given an “anchor”—your brain seizes on it as a starting point for making its guess, and adjusts from there.

We do this to an almost ridiculous degree: the piece of information we use as an anchor can be as explicitly unhelpful as a randomly generated number, and our brains will still latch on to it and skew our decisions toward it. This can get frankly worrying; in his book Thinking, Fast and Slow, Daniel Kahneman gives the example of a 2006 experiment on a group of highly experienced German judges. They were shown details of a court case in which a woman was found guilty of shoplifting. They were then asked to roll a pair of dice, which (unknown to them) were weighted to only ever produce a total of 3 or 9. Then they were asked if the woman should be sentenced to more or fewer months than the figure produced by the dice, before finally being asked to provide their best recommendation for how long her sentence should be.

You can pretty much guess the result: the judges who rolled the higher figure on the dice sentenced her to much longer in prison than the ones who rolled low. On average, the roll of the dice would have seen the woman spend an extra three months in jail. This is not comforting.

Availability, meanwhile, means that you make judgment calls on the basis of whatever information comes to mind easiest, rather than deeply considering all the possible information that might be available to you. And that means we’re hugely biased toward basing our worldview on stuff that’s happened most recently, or things that are particularly dramatic and memorable, while all the old, mundane stuff that’s probably a more accurate representation of everyday reality just sort of… fades away.

It’s why sensational news stories about horrible crimes make us think that crime levels are higher than they are, while dry stories about falling crime statistics don’t have anywhere near as much impact in the opposite direction. It’s one reason why many people are more scared of plane crashes (rare, dramatic) than they are of car crashes (more common and as such a bit less exciting). And it’s why terrorism can produce instant knee-jerk responses from the public and politicians alike, while far more deadly but also more humdrum threats to life get brushed aside. More people were killed by lawn mowers than by terrorism in the USA in the decade between 2007 and 2017, but at the time of writing, the US government has yet to launch a War on Lawn Mowers. (Although, let’s be honest, given recent events you wouldn’t rule it out.)

Working together, the anchoring heuristic and the availability heuristic are both really useful for making snap judgments in moments of crisis, or making all those small, everyday decisions that don’t have much impact. But if you want to make a more informed decision that takes into account all the complexity of the modern world, they can be a bit of a nightmare. Your brain will keep trying to slide back to its evidential comfort zone of whatever you heard first, or whatever comes to mind most quickly.

They’re also part of the reason why we’re terrible at judging risk and correctly predicting which of the many options available to us is the one least likely to lead to catastrophe. We actually have two separate systems in our minds that help us judge the danger of something: the quick, instinctive one and a slow, considered one. The problems start when these conflict. One part of your brain is quietly saying, “I’ve analyzed all the evidence and it appears that Option 1 is the riskiest alternative,” while another part of your brain is shouting, “Yes, but Option 2 SEEMS scary.”

Sure, you might think, but luckily we’re smarter than that. We can force our brains out of that comfort zone, can’t we? We can ignore the instinctive voice and amplify the considered voice, and so objectively consider our situation, right? Unfortunately, that doesn’t take confirmation bias into account.

Before I began researching this book, I thought that confirmation bias was a major problem, and everything I’ve read since then convinces me that I was right. Which is exactly the problem: our brains hate finding out that they’re wrong. Confirmation bias is our annoying habit of zeroing in like a laser-guided missile on any scrap of evidence that supports what we already believe, and blithely ignoring the possibly much, much larger piles of evidence that suggest we might have been completely misguided. At its mildest, this helps explain why we prefer to get our news from an outlet that broadly agrees with our political views. In a more extreme instance, it’s why you can’t argue a conspiracy theorist out of their beliefs, because we cherry-pick the events that back up our version of reality and discard the ones that don’t.

Again, this is quite helpful in some ways: the world is complex and messy and doesn’t reveal its rules to us in nice, simple PowerPoint presentations with easy-to-read bullet points. Coming up with any kind of mental model of the world means discarding useless information and focusing on the right clues. It’s just that working out what information is the stuff worth paying attention to is a bit of a cognitive crapshoot.

It gets worse, though. Our brain’s resistance to the idea that it might have screwed up goes deeper. You’d think that once we’d made a decision, put it into action and actually seen it start to go horribly wrong, we would then at least become a bit better at changing our minds. Hahaha, no. There’s a thing called “choice-supportive bias,” which basically means that once we’ve committed to a course of action, we cling on to the idea that it was the right choice like a drowning sailor clinging to a plank. We even replay our memories of how and why we made that choice in an attempt to back ourselves up. In its mild form, this is why you end up hobbling around in agony after buying a new pair of shoes, insisting to everybody that “they make me look POWERFUL yet ALLURING.” In a stronger form, it is why government ministers continue to insist that the negotiations are going very well and a lot of progress has been made even as it becomes increasingly apparent that everything is going quite profoundly to shit. The choice has been made, so it must have been the right one, because we made it.

