At best, if a company knows which articles you read or what mood you’re in, it can serve up ads related to your interests. But at worst, it can make decisions on that basis that negatively affect your life. After you visit a page about Third World backpacking, an insurance company with access to your Web history might decide to increase your premium, law professor Jonathan Zittrain suggests. Parents who purchased EchoMetrix’s Sentry software to track their kids online were outraged when they found that the company was then selling their kids’ data to third-party marketing firms.
Personalization is based on a bargain. In exchange for the service of filtering, you hand large companies an enormous amount of data about your daily life—much of which you might not trust friends with. These companies are getting better at drawing on this data to make decisions every day. But the trust we place in them to handle it with care is not always warranted, and when decisions are made on the basis of this data that affect you negatively, they’re usually not revealed.
Ultimately, the filter bubble can affect your ability to choose how you want to live. To be the author of your life, professor Yochai Benkler argues, you have to be aware of a diverse array of options and lifestyles. When you enter a filter bubble, you’re letting the companies that construct it choose which options you’re aware of. You may think you’re the captain of your own destiny, but personalization can lead you down a road to a kind of informational determinism in which what you’ve clicked on in the past determines what you see next—a Web history you’re doomed to repeat. You can get stuck in a static, ever narrowing version of yourself—an endless you-loop.
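The feedback loop described above — past clicks determining future recommendations, which in turn shape future clicks — can be sketched in a few lines of code. This is a toy model only; the article titles, topic tags, and scoring rule below are invented for illustration, and real personalization systems use far richer signals than topic overlap.

```python
from collections import Counter

def recommend(candidates, click_history, k=3):
    """Rank candidate articles by topic overlap with what the user
    has already clicked on -- a toy model of history-based filtering."""
    topic_weights = Counter(
        topic for item in click_history for topic in item["topics"]
    )
    def score(item):
        return sum(topic_weights[t] for t in item["topics"])
    return sorted(candidates, key=score, reverse=True)[:k]

# Hypothetical articles, each tagged with topics.
articles = [
    {"title": "Marathon tips",        "topics": ["sports", "health"]},
    {"title": "Onion soup recipe",    "topics": ["food"]},
    {"title": "Darfur crisis report", "topics": ["world", "politics"]},
    {"title": "Celebrity gossip",     "topics": ["celebrity"]},
]

# One past click is enough to tilt the whole feed toward "sports";
# clicking the resulting recommendations reinforces that tilt further.
history = [{"title": "Half-marathon training", "topics": ["sports"]}]
feed = recommend(articles, history, k=2)
```

Even in this crude sketch, the "you-loop" is visible: items that resemble past clicks rise to the top, get clicked, and raise their own topics' weights, while everything else drifts out of view.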
And there are broader consequences. In *Bowling Alone*, his bestselling book on the decline of civic life in America, Robert Putnam looked at the problem of the major decrease in “social capital”—the bonds of trust and allegiance that encourage people to do each other favors, work together to solve common problems, and collaborate. Putnam identified two kinds of social capital: There’s the in-group-oriented “bonding” capital created when you attend a meeting of your college alumni, and then there’s “bridging” capital, which is created at an event like a town meeting when people from lots of different backgrounds come together to meet each other. Bridging capital is potent: Build more of it, and you’re more likely to be able to find that next job or an investor for your small business, because it allows you to tap into lots of different networks for help.
Everybody expected the Internet to be a huge source of bridging capital. Writing at the height of the dot-com bubble, Tom Friedman declared that the Internet would “make us all next door neighbors.” In fact, this idea was the core of his thesis in *The Lexus and the Olive Tree*: “The Internet is going to be like a huge vise that takes the globalization system… and keeps tightening and tightening that system around everyone, in ways that will only make the world smaller and smaller and faster and faster with each passing day.”
Friedman seemed to have in mind a kind of global village in which kids in Africa and executives in New York would build a community together. But that’s not what’s happening: Our virtual next-door neighbors look more and more like our real-world neighbors, and our real-world neighbors look more and more like us. We’re getting a lot of bonding but very little bridging. And this is important because it’s bridging that creates our sense of the “public”—the space where we address the problems that transcend our niches and narrow self-interests.
We are predisposed to respond to a pretty narrow set of stimuli—if a piece of news is about sex, power, gossip, violence, celebrity, or humor, we are likely to read it first. This is the content that most easily makes it into the filter bubble. It’s easy to push “Like” and increase the visibility of a friend’s post about finishing a marathon or an instructional article about how to make onion soup. It’s harder to push the “Like” button on an article titled, “Darfur sees bloodiest month in two years.” In a personalized world, important but complex or unpleasant issues—the rising prison population, for example, or homelessness—are less likely to come to our attention at all.
As a consumer, it’s hard to argue with blotting out the irrelevant and unlikable. But what is good for consumers is not necessarily good for citizens. What I seem to like may not be what I actually want, let alone what I need to know to be an informed member of my community or country. “It’s a civic virtue to be exposed to things that appear to be outside your interest,” technology journalist Clive Thompson told me. “In a complex world, almost everything affects you—that closes the loop on pecuniary self-interest.” Cultural critic Lee Siegel puts it a different way: “Customers are always right, but people aren’t.”
The structure of our media affects the character of our society. The printed word is conducive to democratic argument in a way that laboriously copied scrolls aren’t. Television had a profound effect on political life in the twentieth century—from the Kennedy assassination to 9/11—and it’s probably not a coincidence that a nation whose denizens spend thirty-six hours a week watching TV has less time for civic life.
The era of personalization is here, and it’s upending many of our predictions about what the Internet would do. The creators of the Internet envisioned something bigger and more important than a global system for sharing pictures of pets. The manifesto that helped launch the Electronic Frontier Foundation in the early nineties championed a “civilization of Mind in cyberspace”—a kind of worldwide metabrain. But personalized filters sever the synapses in that brain. Without knowing it, we may be giving ourselves a kind of global lobotomy instead.
From megacities to nanotech, we’re creating a global society whose complexity has passed the limits of individual comprehension. The problems we’ll face in the next twenty years—energy shortages, terrorism, climate change, and disease—are enormous in scope. They’re problems that we can only solve together.
Early Internet enthusiasts like Web creator Tim Berners-Lee hoped it would be a new platform for tackling those problems. I believe it still can be—and as you read on, I’ll explain how. But first we need to pull back the curtain—to understand the forces that are taking the Internet in its current, personalized direction. We need to lay bare the bugs in the code—and the coders—that brought personalization to us.
If “code is law,” as Larry Lessig famously declared, it’s important to understand what the new lawmakers are trying to do. We need to understand what the programmers at Google and Facebook believe in. We need to understand the economic and social forces that are driving personalization, some of which are inevitable and some of which are not. And we need to understand what all this means for our politics, our culture, and our future.
Without sitting down next to a friend, it’s hard to tell how the version of Google or Yahoo News that you’re seeing differs from anyone else’s. But because the filter bubble distorts our perception of what’s important, true, and real, it’s critically important to render it visible. That is what this book seeks to do.
If you’re not paying for something, you’re not the customer; you’re the product being sold.
—Andrew Lewis, under the alias Blue_beetle, on the Web site MetaFilter
In the spring of 1994, Nicholas Negroponte sat writing and thinking. At the MIT Media Lab, Negroponte’s brainchild, young chip designers and virtual-reality artists and robot-wranglers were furiously at work building the toys and tools of the future. But Negroponte was mulling over a simpler problem, one that millions of people pondered every day: what to watch on TV.