Eli Pariser - The Filter Bubble

City: New York. Year: 2011. Publisher: The Penguin Press. Genre: journalism, Internet. Language: English.

The Filter Bubble: summary and description


An eye-opening account of how the hidden rise of personalization on the Internet is controlling—and limiting—the information we consume. In December 2009, Google began customizing its search results for each user. Instead of giving you the most broadly popular result, Google now tries to predict what you are most likely to click on. According to MoveOn.org board president Eli Pariser, Google’s change in policy is symptomatic of the most significant shift to take place on the Web in recent years—the rise of personalization. In this groundbreaking investigation of the new hidden Web, Pariser uncovers how this growing trend threatens to control how we consume and share information as a society—and reveals what we can do about it.
Though the phenomenon has gone largely undetected until now, personalized filters are sweeping the Web, creating individual universes of information for each of us. Facebook—the primary news source for an increasing number of Americans—prioritizes the links it believes will appeal to you, so that if you are a liberal, you can expect to see only progressive links. Even an old-media bastion like the Washington Post devotes the top of its home page to a news feed with the links your Facebook friends are sharing. Behind the scenes, a burgeoning industry of data companies is tracking your personal information to sell to advertisers, from your political leanings to the color you painted your living room to the hiking boots you just browsed on Zappos.
In a personalized world, we will increasingly be typed and fed only news that is pleasant, familiar, and confirms our beliefs—and because these filters are invisible, we won’t know what is being hidden from us. Our past interests will determine what we are exposed to in the future, leaving less room for the unexpected encounters that spark creativity, innovation, and the democratic exchange of ideas.
While we all worry that the Internet is eroding privacy or shrinking our attention spans, Pariser uncovers a more pernicious and far-reaching trend on the Internet and shows how we can—and must—change course. With vivid detail and remarkable scope, The Filter Bubble reveals how personalization undermines the Internet's original purpose as an open platform for the spread of ideas and could leave us all in an isolated, echoing world.


We live in an increasingly algorithmic society, where our public functions, from police databases to energy grids to schools, run on code. We need to recognize that societal values about justice, freedom, and opportunity are embedded in how code is written and what it solves for. Once we understand that, we can begin to figure out which variables we care about and imagine how we might solve for something different.

For example, advocates looking to solve the problem of political gerrymandering—the backroom process of carving up electoral districts to favor one party or another—have long suggested that we replace the politicians involved with software. It sounds pretty good: Start with some basic principles, input population data, and out pops a new political map. But it doesn’t necessarily solve the basic problem, because what the algorithm solves for has political consequences: Whether the software aims to group by cities or ethnic groups or natural boundaries can determine which party keeps its seats in Congress and which doesn’t. And if the public doesn’t pay close attention to what the algorithm is doing, it could have the opposite of the intended effect—sanctioning a partisan deal with the imprimatur of “neutral” code.
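To see how much hangs on that choice, consider a minimal sketch in Python. The plans, metrics, and scores below are invented for illustration; the point is only that the same "neutral" optimizer picks a different map depending on which variable it is asked to solve for.

    # Toy illustration, not a real redistricting system: the winning map
    # depends entirely on what the algorithm is told to optimize.

    # Three hypothetical district plans, scored on invented metrics (0 to 1).
    plans = {
        "Plan A": {"compactness": 0.9, "city_integrity": 0.4, "partisan_balance": 0.3},
        "Plan B": {"compactness": 0.5, "city_integrity": 0.9, "partisan_balance": 0.5},
        "Plan C": {"compactness": 0.4, "city_integrity": 0.5, "partisan_balance": 0.9},
    }

    def best_plan(weights):
        """Return the plan with the highest weighted score under a given objective."""
        return max(plans, key=lambda name: sum(w * plans[name][k] for k, w in weights.items()))

    # "Solve for" compact districts and one map wins...
    print(best_plan({"compactness": 1.0}))        # -> Plan A
    # ...solve for keeping cities whole and a different map wins...
    print(best_plan({"city_integrity": 1.0}))     # -> Plan B
    # ...solve for partisan balance and yet another map wins.
    print(best_plan({"partisan_balance": 1.0}))   # -> Plan C

The code applies the same rule to every plan, so it looks neutral; the politics live in the weights.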

In other words, it’s becoming more important to develop a basic level of algorithmic literacy. Increasingly, citizens will have to pass judgment on programmed systems that affect our public and national life. And even if you’re not fluent enough to read through thousands of lines of code, the building-block concepts—how to wrangle variables, loops, and memory—can illuminate how these systems work and where they might make errors.

Especially at the beginning, learning the basics of programming is even more rewarding than learning a foreign language. With a few hours and a basic platform, you can have that “Hello, World!” experience and start to see your ideas come alive. And within a few weeks, you can be sharing these ideas with the whole Web. Mastery, as in any profession, takes much longer, but the payoff for a limited investment in coding is fairly large: It doesn’t take long to become literate enough to understand what most basic bits of code are doing.
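For a sense of how small that first step is, the classic "Hello, World!" plus the variables and loops mentioned above fit in a few lines of Python (a made-up beginner example):

    # The traditional first program, extended with a variable and a loop.
    greeting = "Hello, World!"      # a variable stores a value under a name
    for attempt in range(3):        # a loop repeats the same work
        print(greeting, "- run", attempt + 1)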

Changing our own behavior is a part of the process of bursting the filter bubble. But it’s of limited use unless the companies that are propelling personalization forward change as well.

What Companies Can Do

It’s understandable that, given their meteoric rises, the Googles and Facebooks of the online world have been slow to realize their responsibilities. But it’s critical that they recognize their public responsibility soon. It’s no longer sufficient to say that the personalized Internet is just a function of relevance-seeking machines doing their job.

The new filterers can start by making their filtering systems more transparent to the public, so that it’s possible to have a discussion about how they’re exercising their responsibilities in the first place.

As Larry Lessig says, “A political response is possible only when regulation is transparent.” And there’s more than a little irony in the fact that companies whose public ideologies revolve around openness and transparency are so opaque themselves.

Facebook, Google, and their filtering brethren claim that to reveal anything about their algorithmic processes would be to give away business secrets. But that defense is less convincing than it sounds at first. Both companies’ primary advantage lies in the extraordinary number of people who trust them and use their services (remember lock-in?). According to Danny Sullivan’s Search Engine Land blog, Bing’s search results are “highly competitive” with Google’s, but it has a fraction of its more powerful rival’s users. It’s not a matter of math that keeps Google ahead, but the sheer number of people who use it every day. PageRank and the other major pieces of Google’s search engine are “actually one of the world’s worst kept secrets,” says Google fellow Amit Singhal.
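Singhal's point is easy to check: the core idea of PageRank has been public since the original 1998 paper, and a toy version of it, in which each page's importance flows from the importance of the pages that link to it, fits in a short Python sketch. What follows is a simplification over an invented four-page web, nothing like Google's production system.

    # Toy PageRank: repeatedly pass each page's score along its outgoing links
    # until the scores settle. A simplified sketch of the published algorithm,
    # not Google's actual implementation.
    links = {          # hypothetical four-page web: page -> pages it links to
        "A": ["B", "C"],
        "B": ["C"],
        "C": ["A"],
        "D": ["C"],
    }
    damping = 0.85
    ranks = {page: 1.0 / len(links) for page in links}

    for _ in range(50):                     # power iteration
        new_ranks = {}
        for page in links:
            inbound = sum(
                ranks[src] / len(out)       # each linker splits its score evenly
                for src, out in links.items()
                if page in out
            )
            new_ranks[page] = (1 - damping) / len(links) + damping * inbound
        ranks = new_ranks

    for page, score in sorted(ranks.items(), key=lambda kv: -kv[1]):
        print(page, round(score, 3))        # "C", the most linked-to page, ranks first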

Google has also argued that it needs to keep its search algorithm under tight wraps because if it was known it’d be easier to game. But open systems are harder to game than closed ones, precisely because everyone shares an interest in closing loopholes. The open-source operating system Linux, for example, is actually more secure and harder to penetrate with a virus than closed ones like Microsoft’s Windows or Apple’s OS X.

Whether or not it makes the filterers’ products more secure or efficient, keeping the code under tight wraps does do one thing: It shields these companies from accountability for the decisions they’re making, because the decisions are difficult to see from the outside. But even if full transparency proves impossible, it’s possible for these companies to shed more light on how they approach sorting and filtering problems.

For one thing, Google and Facebook and other new media giants could draw inspiration from the history of newspaper ombudsmen, which became a newsroom topic in the mid-1960s.

Philip Foisie, an executive at the Washington Post company, wrote one of the most memorable memos arguing for the practice. “It is not enough to say,” he suggested, “that our paper, as it appears each morning, is its own credo, that ultimately we are our own ombudsman. It has not proven to be, possibly cannot be. Even if it were, it would not be viewed as such. It is too much to ask the reader to believe that we are capable of being honest and objective about ourselves.” The Post found his argument compelling, and hired its first ombudsman in 1970.

“We know the media is a great dichotomy,” said the longtime Sacramento Bee ombudsman Arthur Nauman in a speech in 1994. On the one hand, he said, media has to operate as a successful business that provides a return on investment. “But on the other hand, it is a public trust, a kind of public utility. It is an institution invested with enormous power in the community, the power to affect thoughts and actions by the way it covers the news—the power to hurt or help the common good.” It is this spirit that the new media would do well to channel. Appointing an independent ombudsman and giving the world more insight into how the powerful filtering algorithms work would be an important first step.

Transparency doesn’t mean only that the guts of a system are available for public view. As the Twitter versus Facebook dichotomy demonstrates, it also means that individual users intuitively understand how the system works. And that’s a necessary precondition for people to control and use these tools—rather than having the tools control and use us.

To start with, we ought to be able to get a better sense of who these sites think we are. Google claims to make this possible with a "dashboard"—a single place to monitor and manage all of this data. In practice, its confusing and multitiered design makes it almost impossible for an average user to navigate and understand. Facebook, Amazon, and other companies don't allow users in the United States to download a complete compilation of their data, though privacy laws in Europe force them to. It's an entirely reasonable expectation that the data users provide to companies ought to be available to them—an expectation that, according to the University of California at Berkeley, most Americans share. We ought to be able to say, "You're wrong. Perhaps I used to be a surfer, or a fan of comics, or a Democrat, but I'm not anymore."

Knowing what information the personalizers have on us isn't enough. They also need to do a much better job explaining how they use the data—what bits of information are personalized, to what degree, and on what basis. A visitor to a personalized news site could be given the option of seeing how many other visitors were seeing which articles—even perhaps a color-coded visual map of the areas of commonality and divergence. Of course, this requires admitting to the user that personalization is happening in the first place, and there are strong reasons in some cases for businesses not to do so. But they're mostly commercial reasons, not ethical ones.
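As a rough sketch of what that disclosure might look like, a site could report, for each story on your front page, what share of other visitors saw it too. The story names and visitor data below are invented; this illustrates the idea, not any existing feature.

    # Hypothetical transparency report: how much of "your" front page is common
    # to everyone, and how much is personalized just for you? Data is invented.
    your_front_page = {"story-1", "story-2", "story-5"}
    other_visitors_pages = [
        {"story-1", "story-3", "story-4"},
        {"story-2", "story-3", "story-4"},
        {"story-1", "story-2", "story-3"},
    ]

    for story in sorted(your_front_page):
        share = sum(story in page for page in other_visitors_pages) / len(other_visitors_pages)
        label = "widely shown" if share >= 0.5 else "mostly just you"
        print(f"{story}: seen by {share:.0%} of other visitors ({label})")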
