Eli Pariser - The Filter Bubble


The Filter Bubble: Summary and Description


An eye-opening account of how the hidden rise of personalization on the Internet is controlling—and limiting—the information we consume.

In December 2009, Google began customizing its search results for each user. Instead of giving you the most broadly popular result, Google now tries to predict what you are most likely to click on. According to MoveOn.org board president Eli Pariser, Google’s change in policy is symptomatic of the most significant shift to take place on the Web in recent years—the rise of personalization. In this groundbreaking investigation of the new hidden Web, Pariser uncovers how this growing trend threatens to control how we consume and share information as a society—and reveals what we can do about it.
Though the phenomenon has gone largely undetected until now, personalized filters are sweeping the Web, creating individual universes of information for each of us. Facebook—the primary news source for an increasing number of Americans—prioritizes the links it believes will appeal to you so that if you are a liberal, you can expect to see only progressive links. Even an old-media bastion like the Washington Post devotes the top of its home page to a news feed with the links your Facebook friends are sharing. Behind the scenes a burgeoning industry of data companies is tracking your personal information to sell to advertisers, from your political leanings to the color you painted your living room to the hiking boots you just browsed on Zappos.
In a personalized world, we will increasingly be typed and fed only news that is pleasant, familiar, and confirms our beliefs—and because these filters are invisible, we won’t know what is being hidden from us. Our past interests will determine what we are exposed to in the future, leaving less room for the unexpected encounters that spark creativity, innovation, and the democratic exchange of ideas.
While we all worry that the Internet is eroding privacy or shrinking our attention spans, Pariser uncovers a more pernicious and far-reaching trend on the Internet and shows how we can—and must—change course. With vivid detail and remarkable scope, The Filter Bubble reveals how personalization undermines the Internet’s original purpose as an open platform for the spread of ideas and could leave us all in an isolated, echoing world.

The Filter Bubble: full text


Looking into the future, Coyne told me, you’ll have people walking around with augmented displays. He described a guy on a night out: You walk into a bar, and a camera immediately scans the faces in the room and matches them against OkCupid’s databases. “Your accessories can say, that girl over there is an eighty-eight percent match. That’s a dream come true!”

Vladimir Nabokov once commented that “reality” is “one of the few words that mean nothing without quotes.” Coyne’s vision may soon be our “reality.” There’s tremendous promise in this vision: Surgeons who never miss a suture, soldiers who never imperil civilians, and everywhere a more informed, information-dense world. But there’s also danger: Augmented reality represents the end of naive empiricism, of the world as we see it, and the beginning of something far more mutable and weird: a real-world filter bubble that will be increasingly difficult to escape.

Losing Control

There’s plenty to love about this ubiquitously personalized future.

Smart devices, from vacuum cleaners to lightbulbs to picture frames, offer the promise that our environments will be exactly the way we want them, wherever we are. In the near future, ambient-intelligence expert David Wright suggests, we might even carry our room-lighting preferences with us; when there are multiple people in a room, a consensus could be automatically reached by averaging preferences and weighting for who’s the host.
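To make that idea concrete, here is a minimal sketch of the kind of weighted consensus Wright describes. The brightness scale, the names, and the extra weight given to the host are assumptions added purely for illustration; they are not details from Wright's proposal.

```python
# Hypothetical sketch: average everyone's lighting preference, giving the
# host's preference extra weight. The 0.0-1.0 brightness scale and the
# host_weight value are illustrative assumptions, not Wright's design.

def room_brightness(preferences, host, host_weight=2.0):
    """Return a brightness level agreed on by everyone in the room.

    preferences -- dict mapping each person's name to their preferred
                   brightness on a 0.0 (dark) to 1.0 (bright) scale
    host        -- the person whose preference counts for more
    host_weight -- how many "votes" the host's preference is worth
    """
    weighted_sum = 0.0
    total_weight = 0.0
    for person, level in preferences.items():
        weight = host_weight if person == host else 1.0
        weighted_sum += weight * level
        total_weight += weight
    return weighted_sum / total_weight

# Example: three guests who like it bright, a host who likes it dim.
print(room_brightness({"Ada": 0.8, "Ben": 0.6, "Cleo": 0.7, "Host": 0.3},
                      host="Host"))  # ~0.54
```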

AugCog-enabled devices will help us track the data streams that we consider most important. In some situations—say, medical or fire alerts that find ways to escalate until they capture our attention—they could save lives. And while brainwave-reading AugCog is probably some way off for the masses, consumer variants of the basic concept are already being put into place. Google’s Gmail Priority Inbox, which screens e-mails and highlights the ones it assesses as more important, is an early riff on the theme. Meanwhile, augmented-reality filters offer the possibility of an annotated and hyperlinked reality, in which what we see is infused with information that allows us to work better, assimilate information more quickly, and make better decisions.
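Google has not published how Priority Inbox ranks mail, but the general pattern it represents, scoring each incoming item and surfacing only those that cross a threshold, can be sketched in a few lines. The signals, weights, and threshold below are invented for illustration and are not Gmail's actual method.

```python
# Toy illustration of attention filtering: score each message and surface
# only those above a threshold. All signals and weights are invented; this
# is not how Gmail's Priority Inbox actually works.

from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    subject: str
    replied_to_sender_before: bool
    is_mass_mailing: bool

def importance_score(msg: Message) -> float:
    score = 0.0
    if msg.replied_to_sender_before:
        score += 0.6          # past interaction is a strong signal
    if msg.is_mass_mailing:
        score -= 0.4          # bulk mail is usually less urgent
    if "urgent" in msg.subject.lower():
        score += 0.3          # crude keyword signal
    return score

def priority_inbox(messages, threshold=0.5):
    """Return only the messages judged important enough to surface."""
    return [m for m in messages if importance_score(m) >= threshold]
```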

That’s the good side. But there’s always a bargain in personalization: In exchange for convenience, you hand over some privacy and control to the machine.

As personal data become more and more valuable, the behavioral data market described in chapter 1 is likely to explode. When a clothing company determines that knowing your favorite color produces a $5 increase in sales, it has an economic basis for pricing that data point—and for other Web sites to find reasons to ask you. (While OkCupid is mum about its business model, it likely rests on offering advertisers the ability to target its users based on the hundreds of personal questions they answer.)
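The pricing logic is simple expected-value arithmetic. In the sketch below, the $5 sales lift comes from the example above; the profit margin and campaign size are assumptions added only to make the numbers concrete.

```python
# Back-of-the-envelope value of a single data point. Only the $5 lift is
# from the text; the margin and campaign size are illustrative assumptions.

sales_lift_per_customer = 5.00   # extra revenue when favorite color is known
profit_margin = 0.40             # assumed margin on that extra revenue
customers_reached = 10_000       # assumed size of the targeted campaign

extra_profit = sales_lift_per_customer * profit_margin * customers_reached
print(f"Knowing favorite color is worth up to ${extra_profit:,.2f} "
      f"to this campaign")       # up to $20,000.00, or $2.00 per customer
# Any price below that figure leaves the retailer better off buying the data.
```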

While many of these data acquisitions will be legitimate, some won’t be. Data are uniquely suited to gray-market activities, because they need not carry any trace of where they have come from or where they have been along the way. Wright calls this data laundering, and it’s already well under way: Spyware and spam companies sell questionably derived data to middlemen, who then add it to the databases powering the marketing campaigns of major corporations.

Moreover, because the transformations applied to your data are often opaque, it’s not always clear exactly what decisions are being made on your behalf, by whom, or to what end. This matters plenty when we’re talking about information streams, but it matters even more when this power is infused into our sensory apparatus itself.

In 2000, Bill Joy, the Sun Microsystems cofounder, wrote a piece for Wired magazine titled “Why the Future Doesn’t Need Us.” “As society and the problems that face it become more and more complex and machines become more and more intelligent,” he wrote, “people will let machines make more of their decisions for them, simply because machine-made decisions will bring better results than man-made ones.”

That may often be the case: Machine-driven systems do provide significant value. The whole promise of these technologies is that they give us more freedom and more control over our world—lights that respond to our whims and moods, screens and overlays that allow us to attend only to the people we want to, so that we don’t have to do the busywork of living. The irony is that they offer this freedom and control by taking it away. It’s one thing when a remote control’s array of buttons elides our ability to do something basic like flip the channels. It’s another thing when what the remote controls is our lives.

It’s fair to guess that the technology of the future will work about as well as the technology of the past—which is to say, well enough, but not perfectly. There will be bugs. There will be dislocations and annoyances. There will be breakdowns that cause us to question whether the whole system was worth it in the first place. And we’ll live with the threat that systems made to support us will be turned against us—that a clever hacker who cracks the baby monitor now has a surveillance device, that someone who can interfere with what we see can expose us to danger. The more power we have over our own environments, the more power someone who assumes the controls has over us.

That is why it’s worth keeping the basic logic of these systems in mind: You don’t get to create your world on your own. You live in an equilibrium between your own desires and what the market will bear. And while in many cases this provides for healthier, happier lives, it also provides for the commercialization of everything—even of our sensory apparatus itself. There are few things uglier to contemplate than AugCog-enabled ads that escalate until they seize control of your attention.

We’re compelled to return to Jaron Lanier’s question: For whom do these technologies work? If history is any guide, we may not be the primary customer. And as technology gets better and better at directing our attention, we need to watch closely what it is directing our attention toward.

8

Escape from the City of Ghettos

In order to find his own self, [a person] also needs to live in a milieu where the possibility of many different value systems is explicitly recognized and honored. More specifically, he needs a great variety of choices so that he is not misled about the nature of his own person.

—Christopher Alexander et al., A Pattern Language

In theory, there’s never been a structure more capable of allowing all of us to shoulder the responsibility for understanding and managing our world than the Internet. But in practice, the Internet is headed in a different direction. Sir Tim Berners-Lee, the creator of the World Wide Web, captured the gravity of this threat in a recent call to arms in the pages of Scientific American titled “Long Live the Web.” “The Web as we know it,” he wrote, “is being threatened…. Some of its most successful inhabitants have begun to chip away at its principles. Large social-networking sites are walling off information posted by their users from the rest of the Web…. Governments—totalitarian and democratic alike—are monitoring people’s online habits, endangering important human rights. If we, the Web’s users, allow these and other trends to proceed unchecked, the Web could be broken into fragmented islands.”

In this book, I’ve argued that the rise of pervasive, embedded filtering is changing the way we experience the Internet and ultimately the world. At the center of this transformation is the fact that for the first time it’s possible for a medium to figure out who you are, what you like, and what you want. Even if the personalizing code isn’t always spot-on, it’s accurate enough to be profitable, not just by delivering better ads but also by adjusting the substance of what we read, see, and hear.
