No schema is an island: Ideas in our heads are connected in networks and hierarchies. Key isn’t a useful concept without lock, door, and a slew of other supporting ideas. If we change these concepts too quickly—altering our concept of door without adjusting lock, for example—we could end up removing or altering ideas that other ideas depend on and have the whole system come crashing down. Confirmation bias is a conservative mental force helping to shore up our schemata against erosion.
Learning, then, is a balance. Jean Piaget, one of the major figures in developmental psychology, describes it as a process of assimilation and accommodation. Assimilation happens when children adapt objects to their existing cognitive structures—as when an infant identifies every object placed in the crib as something to suck on. Accommodation happens when we adjust our schemata to new information—“Ah, this isn’t something to suck on, it’s something to make a noise with!” We modify our schemata to fit the world and the world to fit our schemata, and it’s in properly balancing the two processes that growth occurs and knowledge is built.
The filter bubble tends to dramatically amplify confirmation bias—in a way, it’s designed to. Consuming information that conforms to our ideas of the world is easy and pleasurable; consuming information that challenges us to think in new ways or question our assumptions is frustrating and difficult. This is why partisans of one political stripe tend not to consume the media of another. As a result, an information environment built on click signals will favor content that supports our existing notions about the world over content that challenges them.
During the 2008 presidential campaign, for example, rumors swirled persistently that Barack Obama, a practicing Christian, was a follower of Islam. E-mails circulated to millions, offering “proof” of Obama’s “real” religion and reminding voters that Obama spent time in Indonesia and had the middle name Hussein. The Obama campaign fought back on television and encouraged its supporters to set the facts straight. But even a front-page scandal over his longtime Christian pastor, Rev. Jeremiah Wright, was unable to puncture the mythology. Fifteen percent of Americans stubbornly held on to the idea that Obama was a Muslim.
That’s not so surprising—Americans have never been very well informed about our politicians. What’s perplexing is that since the election, the percentage of Americans who hold that belief has nearly doubled, and the increase, according to data collected by the Pew Research Center, has been greatest among people who are college educated. People with some college education were in some cases more likely to believe the story than people with none—a strange state of affairs.
Why? According to the New Republic’s Jon Chait, the answer lies with the media: “Partisans are more likely to consume news sources that confirm their ideological beliefs. People with more education are more likely to follow political news. Therefore, people with more education can actually become mis-educated.” And while this phenomenon has always existed, the filter bubble automates it. In the bubble, the proportion of content that validates what you know goes way up.
Which brings us to the second way the filter bubble can get in the way of learning: It can block what researcher Travis Proulx calls “meaning threats,” the confusing, unsettling occurrences that fuel our desire to understand and acquire new ideas.
Researchers at the University of California at Santa Barbara asked subjects to read two modified versions of “A Country Doctor,” a strange, dreamlike short story by Franz Kafka. “A seriously ill man was waiting for me in a village ten miles distant,” begins the story. “A severe snowstorm filled the space between him and me.” The doctor has no horse, but when he goes to the stable, it’s warm and there’s a horsey scent. A belligerent groom hauls himself out of the muck and offers to help the doctor. The groom calls two horses and attempts to rape the doctor’s maid, while the doctor is whisked to the patient’s house in a snowy instant. And that’s just the beginning—the weirdness escalates. The story concludes with a series of non sequiturs and a cryptic aphorism: “Once one responds to a false alarm on the night bell, there’s no making it good again—not ever.”
The Kafka-inspired version of the story includes meaning threats—incomprehensible events that threaten readers’ expectations about the world and shake their confidence in their ability to understand. But the researchers also prepared another version of the story with a much more conventional narrative, complete with a happily-ever-after ending and appropriate, cartoony illustrations. The mysteries and odd occurrences are explained. After reading one version or the other, the study’s participants were asked to switch tasks and identify patterns in a set of numbers. The group that read the version adapted from Kafka did nearly twice as well—a dramatic increase in the ability to identify and acquire new patterns. “The key to our study is that our participants were surprised by the series of unexpected events, and they had no way to make sense of them,” Proulx wrote. “Hence, they strived to make sense of something else.”
For similar reasons, a filtered environment could have consequences for curiosity. According to psychologist George Loewenstein, curiosity is aroused when we’re presented with an “information gap.” It’s a sensation of deprivation: A present’s wrapping deprives us of the knowledge of what’s in it, and as a result we become curious about its contents. But to feel curiosity, we have to be conscious that something’s being hidden. Because the filter bubble hides things invisibly, we’re not as compelled to learn about what we don’t know.
As University of Virginia media studies professor and Google expert Siva Vaidhyanathan writes in “The Googlization of Everything”: “Learning is by definition an encounter with what you don’t know, what you haven’t thought of, what you couldn’t conceive, and what you never understood or entertained as possible. It’s an encounter with what’s other—even with otherness as such. The kind of filter that Google interposes between an Internet searcher and what a search yields shields the searcher from such radical encounters.” Personalization is about building an environment that consists entirely of the adjacent unknown—the sports trivia or political punctuation marks that don’t really shake our schemata but feel like new information. The personalized environment is very good at answering the questions we have but not at suggesting questions or problems that are out of our sight altogether. It brings to mind the famous Pablo Picasso quotation: “Computers are useless. They can only give you answers.”
Stripped of the surprise of unexpected events and associations, a perfectly filtered world would provoke less learning. And there’s another mental balance that personalization can upset: the balance between open-mindedness and focus that makes us creative.
The drug Adderall is a mixture of amphetamines. Prescribed for attention deficit disorder, it’s become a staple for thousands of overscheduled, sleep-deprived students, allowing them to focus for long stretches on a single arcane research paper or complex lab assignment.
For people without ADD, Adderall also has a remarkable effect. On Erowid, an online forum for recreational drug users and “mind hackers,” there’s post after post of testimonials to the drug’s power to extend focus. “The part of my brain that makes me curious about whether I have new e-mails in my inbox apparently shut down,” author Josh Foer wrote in an article on Slate. “Normally, I can only stare at my computer screen for about 20 minutes at a time. On Adderall, I was able to work in hourlong chunks.”