Eli Pariser - The Filter Bubble

You can read the full text of «The Filter Bubble» online here, free of charge. City: New York, Year of publication: 2011, Publisher: The Penguin Press, Genre: Nonfiction, Internet; in English.

The Filter Bubble: summary and description


An eye-opening account of how the hidden rise of personalization on the Internet is controlling—and limiting—the information we consume. In December 2009, Google began customizing its search results for each user. Instead of giving you the most broadly popular result, Google now tries to predict what you are most likely to click on. According to MoveOn.org board president Eli Pariser, Google’s change in policy is symptomatic of the most significant shift to take place on the Web in recent years—the rise of personalization. In this groundbreaking investigation of the new hidden Web, Pariser uncovers how this growing trend threatens to control how we consume and share information as a society—and reveals what we can do about it.
Though the phenomenon has gone largely undetected until now, personalized filters are sweeping the Web, creating individual universes of information for each of us. Facebook—the primary news source for an increasing number of Americans—prioritizes the links it believes will appeal to you, so that if you are a liberal, you can expect to see only progressive links. Even an old-media bastion like the Washington Post devotes the top of its home page to a news feed with the links your Facebook friends are sharing. Behind the scenes, a burgeoning industry of data companies is tracking your personal information to sell to advertisers, from your political leanings to the color you painted your living room to the hiking boots you just browsed on Zappos.
In a personalized world, we will increasingly be typed and fed only news that is pleasant, familiar, and confirms our beliefs—and because these filters are invisible, we won’t know what is being hidden from us. Our past interests will determine what we are exposed to in the future, leaving less room for the unexpected encounters that spark creativity, innovation, and the democratic exchange of ideas.
While we all worry that the Internet is eroding privacy or shrinking our attention spans, Pariser uncovers a more pernicious and far-reaching trend on the Internet and shows how we can—and must—change course. With vivid detail and remarkable scope, The Filter Bubble reveals how personalization undermines the Internet’s original purpose as an open platform for the spread of ideas and could leave us all in an isolated, echoing world.



The Interactive Advertising Bureau is already pushing in this direction. An industry trade group for the online advertising community, the IAB has concluded that unless personalized ads disclose to users how they’re personalized, consumers will get angry and demand federal regulation. So it’s encouraging its members to include a set of icons on every ad to indicate what personal data the ad draws on and how to change or opt out of this feature set. As content providers incorporate the personalization techniques pioneered by direct marketers and advertisers, they should consider incorporating these safeguards as well.

Even then, sunlight doesn’t solve the problem unless it’s coupled with a focus in these companies on optimizing for different variables: more serendipity, a more humanistic and nuanced sense of identity, and an active promotion of public issues and cultivation of citizenship.

As long as computers lack consciousness, empathy, and intelligence, much will be lost in the gap between our actual selves and the signals that can be rendered into personalized environments. And as I discussed in chapter 5, personalization algorithms can cause identity loops, in which what the code knows about you constructs your media environment, and your media environment helps to shape your future preferences. This is an avoidable problem, but it requires crafting an algorithm that prioritizes “falsifiability,” that is, an algorithm that aims to disprove its idea of who you are. (If Amazon harbors a hunch that you’re a crime novel reader, for example, it could actively present you with choices from other genres to fill out its sense of who you are.)
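The falsifiability idea above can be sketched in a few lines of code. This is a minimal illustration, not any real recommender: the function names, the genre-based profile, and the `falsify_rate` parameter are all hypothetical. The sketch reserves a share of recommendation slots for genres the system believes the user does *not* read, and widens the profile whenever one of those probes is clicked, so the hypothesis about the user can be disproved rather than endlessly reinforced.

```python
import random

def recommend(profile_genres, catalog, n=10, falsify_rate=0.2):
    """Fill most slots from the user's inferred genres, but reserve a
    fraction for other genres: probes that could falsify the profile."""
    inside = [item for item in catalog if item["genre"] in profile_genres]
    outside = [item for item in catalog if item["genre"] not in profile_genres]
    n_probe = max(1, int(n * falsify_rate))          # falsification probes
    picks = random.sample(inside, min(n - n_probe, len(inside)))
    picks += random.sample(outside, min(n_probe, len(outside)))
    random.shuffle(picks)                            # don't mark which is which
    return picks

def update_profile(profile_genres, clicked):
    """A clicked probe outside the profile falsifies the old hypothesis
    about the user, so the profile widens to include the new genre."""
    profile_genres.add(clicked["genre"])
    return profile_genres
```

The design choice is the point Pariser makes: a small, deliberate loss in short-term click accuracy buys the system a way to discover that its model of you is wrong.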

Companies that hold great curatorial power also need to do more to cultivate public space and citizenship. To be fair, they’re already doing some of this: Visitors to Facebook on November 2, 2010, were greeted by a banner asking them to indicate if they’d voted. Those who had voted shared this news with their friends; and because some people vote in response to social pressure, it’s quite possible that Facebook increased the number of voters. Likewise, Google has been doing strong work to make information about polling locations more open and easily available, and featured its tool on its home page on the same day. Whether or not this is profit-seeking behavior (a “find your polling place” feature would presumably be a terrific place for political advertising), both projects drew the attention of users toward political engagement and citizenship.

A number of the engineers and technology journalists I talked to raised their eyebrows when I asked them if personalizing algorithms could do a better job on this front. After all, one said, who’s to say what’s important? For Google engineers to place a value on some kinds of information over others, another suggested, would be unethical—though of course this is precisely what the engineers themselves do all the time.

To be clear, I don’t yearn to go back to the good old days when a small group of all-powerful editors unilaterally decided what was important. Too many actually important stories (the genocide in Rwanda, for example) fell through the cracks, while too many actually unimportant ones got front-page coverage. But I also don’t think we should jettison that approach altogether. Yahoo News suggests there is some possibility for middle ground: The team combines algorithmic personalization with old-school editorial leadership. Some stories are visible to everyone because they’re surpassingly important. Others show up for some users and not others. And while the editorial team at Yahoo spends a lot of time interpreting click data and watching which articles do well and which don’t, they’re not subservient to it. “Our editors think of the audience as people with interests, as opposed to a flood of directional data,” a Yahoo News employee told me. “As much as we love the data, it’s being filtered by human beings who are thinking about what the heck it means. Why didn’t the article on this topic we think is important for our readers to know about do better? How do we help it find a larger audience?”

And then there are fully algorithmic solutions. For example, why not rely on everyone’s idea of what’s important? Imagine for a moment that next to each Like button on Facebook was an Important button. You could tag items with one or the other or both. And Facebook could draw on a mix of both signals—what people like, and what they think really matters—to populate and personalize your news feed. You’d have to bet that news about Pakistan would be seen more often—even accounting for everyone’s quite subjective definition of what really matters. Collaborative filtering doesn’t have to lead to compulsive media: The whole game is in what values the filters seek to pull out. Alternately, Google or Facebook could place a slider bar running from “only stuff I like” to “stuff other people like that I’ll probably hate” at the top of search results and the News Feed, allowing users to set their own balance between tight personalization and a more diverse information flow. This approach would have two benefits: It would make clear that there’s personalization going on, and it would place it more firmly in the user’s control.
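The two mechanisms in this paragraph, a second “Important” signal and a user-controlled slider, combine naturally into one scoring rule. The sketch below is hypothetical (neither button nor slider exists as described; the field names and weighting are assumptions): each story gets a score that blends how much people like it with how much they flag it as important, and a single `diversity` parameter, the imagined slider from “only stuff I like” to “stuff other people value,” controls the blend.

```python
def rank_feed(stories, diversity=0.0):
    """Rank stories by a weighted mix of two crowd signals.

    diversity = 0.0 -> pure appeal ("only stuff I like")
    diversity = 1.0 -> pure civic weight (what people mark Important)
    """
    def score(story):
        return ((1 - diversity) * story["likes"]
                + diversity * story["importants"])
    return sorted(stories, key=score, reverse=True)
```

With the slider at zero the feed behaves like today’s Like-driven ranking; pushed toward one, a story about Pakistan with few likes but many “Important” flags outranks the viral item, which is exactly the trade-off the paragraph describes. Exposing the parameter also delivers the two benefits named above: it reveals that personalization is happening, and it puts the dial in the user’s hand.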

There’s one more thing the engineers of the filter bubble can do. They can solve for serendipity, by designing filtering systems to expose people to topics outside their normal experience. This will often be in tension with pure optimization in the short term, because a personalization system with an element of randomness will (by definition) get fewer clicks. But as the problems of personalization become better known, it may be a good move in the long run—consumers may choose systems that are good at introducing them to new topics. Perhaps what we need is a kind of anti-Netflix Prize—a Serendipity Prize for systems that are the best at holding readers’ attention while introducing them to new topics and ideas.
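“Solving for serendipity” can be made concrete with a small sketch. Everything here is an assumption for illustration: the function, the `epsilon` share, and the idea of a separate pool of out-of-profile “wildcard” items. The sketch swaps a small random fraction of a personalized ranking for items outside the user’s normal interests, which is the deliberate, clickthrough-costing randomness the paragraph describes.

```python
import random

def serendipitous_feed(ranked, wildcards, epsilon=0.1, seed=None):
    """Replace a random epsilon-fraction of a personalized ranking
    with items drawn from outside the user's usual interests."""
    rng = random.Random(seed)            # seeded for reproducible sampling
    feed = list(ranked)
    n_swap = max(1, int(len(feed) * epsilon))
    slots = rng.sample(range(len(feed)), n_swap)
    for slot, item in zip(slots, rng.sample(wildcards,
                                            min(n_swap, len(wildcards)))):
        feed[slot] = item
    return feed
```

A Serendipity Prize, in these terms, would reward systems that keep engagement high while pushing `epsilon` as far as readers will happily tolerate.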

If this shift toward corporate responsibility seems improbable, it’s not without precedent. In the mid-1800s, printing a newspaper was hardly a reputable business. Papers were fiercely partisan and recklessly ideological. They routinely altered facts to suit their owners’ vendettas of the day, or just to add color. It was this culture of crass commercialism and manipulation that Walter Lippmann railed against in Liberty and the News.

But as newspapers became highly profitable and highly important, they began to change. It became possible, in a few big cities, to run papers that weren’t just chasing scandal and sensation—in part, because their owners could afford not to. Courts started to recognize a public interest in journalism and rule accordingly. Consumers started to demand more scrupulous and rigorous editing.

Urged on by Lippmann’s writings, an editorial ethic began to take shape. It was never shared universally or followed as well as it could have been. It was always compromised by the business demands of newspapers’ owners and shareholders. It failed outright repeatedly—access to power brokers compromised truth telling, and the demands of advertisers overcame the demands of readers. But in the end, it succeeded, somehow, in seeing us through a century of turmoil.

The torch is now being passed to a new generation of curators, and we need them to pick it up and carry it with pride. We need programmers who will build public life and citizenship into the worlds they create. And we need users who will hold them to it when the pressure of monetization pulls them in a different direction.

What Governments and Citizens Can Do

There’s plenty that the companies that power the filter bubble can do to mitigate the negative consequences of personalization—the ideas above are just a start. But ultimately, some of these problems are too important to leave in the hands of private actors with profit-seeking motives. That’s where governments come in.

