Eli Pariser - The Filter Bubble

City: New York. Year: 2011. Publisher: The Penguin Press. Genre: nonfiction, Internet. Language: English.

The Filter Bubble: summary and description


An eye-opening account of how the hidden rise of personalization on the Internet is controlling—and limiting—the information we consume. In December 2009, Google began customizing its search results for each user. Instead of giving you the most broadly popular result, Google now tries to predict what you are most likely to click on. According to MoveOn.org board president Eli Pariser, Google’s change in policy is symptomatic of the most significant shift to take place on the Web in recent years—the rise of personalization. In this groundbreaking investigation of the new hidden Web, Pariser uncovers how this growing trend threatens to control how we consume and share information as a society—and reveals what we can do about it.
Though the phenomenon has gone largely undetected until now, personalized filters are sweeping the Web, creating individual universes of information for each of us. Facebook—the primary news source for an increasing number of Americans—prioritizes the links it believes will appeal to you, so that if you are a liberal, you can expect to see only progressive links. Even an old-media bastion like the Washington Post devotes the top of its home page to a news feed with the links your Facebook friends are sharing. Behind the scenes, a burgeoning industry of data companies is tracking your personal information to sell to advertisers, from your political leanings to the color you painted your living room to the hiking boots you just browsed on Zappos.
In a personalized world, we will increasingly be typed and fed only news that is pleasant, familiar, and confirms our beliefs—and because these filters are invisible, we won’t know what is being hidden from us. Our past interests will determine what we are exposed to in the future, leaving less room for the unexpected encounters that spark creativity, innovation, and the democratic exchange of ideas.
While we all worry that the Internet is eroding privacy or shrinking our attention spans, Pariser uncovers a more pernicious and far-reaching trend and shows how we can—and must—change course. With vivid detail and remarkable scope, The Filter Bubble reveals how personalization undermines the Internet’s original purpose as an open platform for the spread of ideas and could leave us all in an isolated, echoing world.


Ultimately, as Eric Schmidt told Stephen Colbert, Google is just a company. Even if there are ways of addressing these issues that don’t hurt the bottom line—which there may well be—doing so simply isn’t always going to be a top-level priority. As a result, after we’ve each done our part to pop the filter bubble, and after companies have done what they’re willing to do, there’s probably a need for government oversight to ensure that we control our online tools and not the other way around.

In his book Republic.com, Cass Sunstein suggested a kind of “fairness doctrine” for the Internet, in which information aggregators have to expose their audiences to both sides. Though he later changed his mind, the proposal suggests one direction for regulation: Just require curators to behave in a public-oriented way, exposing their readers to diverse lines of argument. I’m skeptical, for some of the same reasons Sunstein abandoned the idea: Curation is a nuanced, dynamic thing, an art as much as a science, and it’s hard to imagine how regulating editorial ethics wouldn’t inhibit a great deal of experimentation, stylistic diversity, and growth.

As this book goes to press, the U.S. Federal Trade Commission is proposing a Do Not Track list, modeled after the highly successful Do Not Call list. At first blush, it sounds pretty good: It would set up a single place to opt out of the online tracking that fuels personalization. But Do Not Track would probably offer a binary choice—either you’re in or you’re out—and services that make money on tracking might simply disable themselves for Do Not Track list members. If most of the Internet goes dark for these people, they’ll quickly leave the list. And as a result, the process could backfire—“proving” that people don’t care about tracking, when in fact what most of us want is more nuanced ways of asserting control.
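The all-or-nothing dynamic described above can be made concrete with a short sketch. The snippet below is hypothetical, not any real service's code; it assumes the browser-sent "DNT: 1" header as one possible opt-out signal and shows how a tracking-funded service could respond by simply withholding its features from opted-out users rather than offering finer-grained controls.

```python
# A minimal sketch (hypothetical service, invented for illustration) of the
# binary choice a Do Not Track regime could produce: a service whose revenue
# depends on tracking checks an opt-out signal and, rather than offering
# more nuanced controls, withholds the feature entirely.

def handle_request(headers: dict, interest_profile: list[str]) -> str:
    opted_out = headers.get("DNT") == "1"  # Do Not Track signal, if honored

    if opted_out:
        # The binary outcome: no tracking means no service, which is exactly
        # the pressure that could push people back off the opt-out list.
        return "This feature is unavailable while Do Not Track is enabled."

    # Otherwise, personalize as usual from whatever profile has been built up.
    top_interests = ", ".join(interest_profile[:3]) or "general news"
    return f"Showing stories about: {top_interests}"

# Example: the same user, with and without the opt-out signal.
print(handle_request({"DNT": "1"}, ["politics", "hiking", "music"]))
print(handle_request({}, ["politics", "hiking", "music"]))
```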

The best leverage point, in my view, is in requiring companies to give us real control over our personal information. Ironically, although online personalization is relatively new, the principles that ought to support this leverage have been clear for decades. In 1973, the Department of Health, Education, and Welfare under Nixon recommended that regulation center on what it called Fair Information Practices (a brief illustrative sketch follows the list below):

• You should know who has your personal data, what data they have, and how it’s used.

• You should be able to prevent information collected about you for one purpose from being used for others.

• You should be able to correct inaccurate information about you.

• Your data should be secure.
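
To make the principles a bit more concrete, here is a minimal sketch of how a data holder might expose the first three of them programmatically. Everything in it, including the class and company names, is hypothetical and invented for illustration; the 1973 report specifies principles, not code.

```python
# Hypothetical sketch of a data holder's record that exposes the Fair
# Information Practices: disclosure, purpose limitation, and correction.
# Security (the fourth principle) concerns storage and transport, e.g.
# encryption and access control, and is not shown here.

from dataclasses import dataclass, field


@dataclass
class PersonalRecord:
    subject: str                          # whose data this is
    holder: str                           # who has it (principle 1)
    purpose: str                          # what it was collected for (principle 2)
    data: dict = field(default_factory=dict)

    def disclose(self) -> dict:
        """Principle 1: the subject can see who holds what, and why."""
        return {"holder": self.holder, "purpose": self.purpose, "data": dict(self.data)}

    def use_for(self, purpose: str) -> dict:
        """Principle 2: refuse uses beyond the purpose the data was collected for."""
        if purpose != self.purpose:
            raise PermissionError(f"Collected for {self.purpose!r}, not {purpose!r}")
        return dict(self.data)

    def correct(self, key: str, value) -> None:
        """Principle 3: the subject can fix inaccurate entries."""
        self.data[key] = value


# Example: a subject inspects and corrects what a (made-up) data broker holds.
record = PersonalRecord("alice", "ExampleAds Inc.", "ad delivery", {"zip": "10001"})
record.correct("zip", "10002")
print(record.disclose())
```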

Nearly forty years later, the principles are still basically right, and we’re still waiting for them to be enforced. We can’t wait much longer: In a society with an increasing number of knowledge workers, our personal data and “personal brand” are worth more than they ever have been. Especially if you’re a blogger or a writer, if you make funny videos or music, or if you coach or consult for a living, your online data trail is one of your most valuable assets. But while it’s illegal to use Brad Pitt’s image to sell a watch without his permission, Facebook is free to use your name to sell one to your friends.

In courts around the world, information brokers are pushing this view—“everyone’s better off if your online life is owned by us.” They argue that the opportunities and control that consumers get by using their free tools outweigh the value of their personal data. But consumers are entirely unequipped to make this calculation—while the control you gain is obvious, the control you lose (because, say, your personal data is used to deny you an opportunity down the road) is invisible. The asymmetry of understanding is vast.

To make matters worse, even if you carefully read a company’s privacy policy and decide that giving over rights to your personal information is worth it under those conditions, most companies reserve the right to change the rules of the game at any time. Facebook, for example, promised its users that if they made a connection with a Page, that information would be shared only with their friends. But in 2010, it decided that all of that data should be made fully public; a clause in Facebook’s privacy policy (as with many corporate privacy policies) allows it to change the rules retroactively. In effect, this gives it nearly unlimited power to dispatch personal data as it sees fit.

To enforce Fair Information Practices, we need to start thinking of personal data as a kind of personal property and protecting our rights in it. Personalization is based on an economic transaction in which consumers are at an inherent disadvantage: While Google may know how much your race is worth to Google, you don’t. And while the benefits are obvious (free e-mail!), the drawbacks (opportunities and content missed) are invisible. Thinking of personal information as a form of property would help make this a fairer market.

If personal information is to be treated as property, though, it’s a special kind of property, because you still have a vested interest in your own data long after it’s been exposed. You probably wouldn’t want consumers to be able to sell all of their personal data, in perpetuity. France’s “moral rights” laws, under which artists retain some control over what’s done with a work after it’s been sold, might be a better template. (Speaking of France, while European laws are much closer to Fair Information Practices in protecting personal information, by many accounts the enforcement is much worse, partly because it’s much harder for individuals to sue for breaches of the laws.)

Marc Rotenberg, executive director of the Electronic Privacy Information Center, says, “We shouldn’t have to accept as a starting point that we can’t have free services on the Internet without major privacy violations.” And this isn’t just about privacy. It’s also about how our data shapes the content and opportunities we see and don’t see. And it’s about being able to track and manage this constellation of data that represents our lives with the same ease that companies like Acxiom and Facebook already do.

Silicon Valley technologists sometimes portray this as an unwinnable fight—people have lost control of their personal data, they’ll never regain it, and they just have to grow up and live with it. But legal requirements on personal information need not be foolproof in order to work, any more than legal requirements not to steal are useless because people sometimes still steal things and get away with it. The force of law adds friction to the transmission of some kinds of information—and in many cases, a little friction changes a lot.

And there are laws that do protect personal information even in this day and age. The Fair Credit Reporting Act, for example, ensures that credit agencies have to disclose their credit reports to consumers and notify consumers when they’re discriminated against on the basis of reports. That’s not much, but given that previously consumers couldn’t even see if their credit report contained errors (and 70 percent do, according to U.S. PIRG), it’s a step in the right direction.

A bigger step would be putting in place an agency to oversee the use of personal information. The EU and most other industrial nations have this kind of oversight, but the United States has lagged behind, scattering responsibility for protecting personal information among the Federal Trade Commission, the Commerce Department, and other agencies. As we enter the second decade of the twenty-first century, it’s past time to take this concern seriously.

None of this is easy: Private data is a moving target, and the process of balancing consumers’ and citizens’ interests against those of these companies will take a lot of fine-tuning. At worst, new laws could be more onerous than the practices they seek to prevent. But that’s an argument for doing this right and doing it soon, before the companies that profit from private information have even greater incentives to try to block such laws from passing.
