But there are dangers in taking the method too far. As I discussed in chapter 5, the most human acts are often the most unpredictable ones. Because systematizing works much of the time, it’s easy to believe that by reducing and brute-forcing an understanding of any system, you can control it. And as a master of a self-created universe, it’s easy to start to view people as a means to an end, as variables to be manipulated on a mental spreadsheet, rather than as breathing, thinking beings. It’s difficult to systematize and, at the same time, appeal to the fullness of human life: its unpredictability, emotionality, and surprising quirks.
David Gelernter, a Yale computer scientist, barely survived an encounter with an explosive package sent by the Unabomber; his eyesight and right hand are permanently damaged as a result. But Gelernter is hardly the technological utopian Ted Kaczynski believed him to be.
“When you do something in the public sphere,” Gelernter told a reporter, “it behooves you to know something about what the public sphere is like. How did this country get this way? What was the history of the relationship between technology and the public? What’s the history of political exchange? The problem is, hackers don’t tend to know any of that. And that’s why it worries me to have these people in charge of public policy. Not because they’re bad, just because they’re uneducated.”
Understanding the rules that govern a messy, complex world makes it intelligible and navigable. But systematizing inevitably involves a trade-off—rules give you some control, but you lose nuance and texture, a sense of deeper connection. And when a strict systematizing sensibility entirely shapes social space (as it often does online), the results aren’t always pretty.
The political power of design has long been obvious to urban planners. If you take the Wantagh State Parkway from Westbury to Jones Beach on Long Island, at intervals you’ll pass under several low, vine-covered overpasses. Some of them have as little as nine feet of clearance. Trucks aren’t allowed on the parkway—they wouldn’t fit. This may seem like a design oversight, but it’s not.
There are about two hundred of these low bridges, part of the grand design for the New York region pioneered by Robert Moses. Moses was a master deal maker, a friend of the great politicians of the time, and an unabashed elitist. According to his biographer, Robert A. Caro, Moses envisioned Jones Beach as an island getaway for middle-class white families. He included the low bridges to make it harder for low-income (and mostly black) New Yorkers to get to the beach, as public buses—the most common form of transport for inner-city residents—couldn’t clear the overpasses.
The passage in Caro’s The Power Broker describing this logic caught the eye of Langdon Winner, a Rolling Stone reporter, musician, professor, and philosopher of technology. In a pivotal 1980 article titled “Do Artifacts Have Politics?” Winner considered how Moses’s “monumental structures of concrete and steel embody a systematic social inequality, a way of engineering relationships among people that, after a time, became just another part of the landscape.”
On the face of it, a bridge is just a bridge. But often, as Winner points out, architectural and design decisions are underpinned by politics as much as aesthetics. Like goldfish that grow only large enough for the tank they’re in, we’re contextual beings: how we behave is dictated in part by the shape of our environments. Put a playground in a park, and you encourage one kind of use; build a memorial, and you encourage another.
As we spend more of our time in cyberspace—and less of our time in what geeks sometimes call meatspace, or the offline world—Moses’s bridges are worth keeping in mind. The algorithms of Google and Facebook may not be made of steel and concrete, but they regulate our behavior just as effectively. That’s what Larry Lessig, a law professor and one of the early theorists of cyberspace, meant when he famously wrote that “code is law.”
If code is law, software engineers and geeks are the ones who get to write it. And it’s a funny kind of law, created without any judicial system or legislators and enforced nearly perfectly and instantly. Even with antivandalism laws on the books, in the physical world you can still throw a rock through the window of a store you don’t like. You might even get away with it. But if vandalism isn’t part of the design of an online world, it’s simply impossible. Try to throw a rock through a virtual storefront, and you just get an error.
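Lessig’s point is easier to see in miniature. The sketch below is not any real platform’s code; it is a minimal, hypothetical illustration of how a rule embedded in software differs from a statute. The disallowed act isn’t punished after the fact; it simply never happens.

```python
# A minimal sketch of "code is law" (hypothetical names throughout):
# in a virtual world, only the actions the software defines can occur.

ALLOWED_ACTIONS = {"enter", "browse", "buy", "leave"}

def perform(action: str, storefront: str) -> str:
    """Attempt an action against a virtual storefront.

    Unlike a real-world law, this rule needs no police and no courts:
    an undefined action cannot be carried out at all.
    """
    if action not in ALLOWED_ACTIONS:
        # The virtual equivalent of throwing a rock: you just get an error.
        raise ValueError(f"Action {action!r} does not exist in this world.")
    return f"{action} at {storefront}: OK"

print(perform("browse", "corner store"))  # allowed by design

try:
    perform("throw rock", "corner store")  # not part of the design
except ValueError as err:
    print(err)
```

There is no loophole to exploit and no discretion in enforcement; the rule is applied perfectly, instantly, and invisibly, which is exactly what makes designed-in law so different from the legislated kind.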
Back in 1980, Winner wrote, “Consciously or unconsciously, deliberately or inadvertently, societies choose structures for technologies that influence how people are going to work, communicate, travel, consume, and so forth over a very long time.” This isn’t to say that today’s designers have malevolent impulses, of course—or even that they’re always explicitly trying to shape society in certain ways. It’s just to say that they can—in fact, they can’t help but shape the worlds they build.
To paraphrase Spider-Man creator Stan Lee, with great power comes great responsibility. But the programmers who brought us the Internet and now the filter bubble aren’t always game to take on that responsibility. The Jargon File, an online repository of hacker culture, puts it this way: “Hackers are far more likely than most non-hackers to either (a) be aggressively apolitical or (b) entertain peculiar or idiosyncratic political ideas.” Too often, the executives of Facebook, Google, and other socially important companies play it coy: they’re social revolutionaries when it suits them and neutral, amoral businessmen when it doesn’t. And both approaches fall short in important ways.
When I first called Google’s PR department, I explained that I wanted to know how Google thought about its enormous curatorial power. What code of ethics, I asked, did Google use to determine what to show to whom? The public affairs manager on the other end of the phone sounded confused. “You mean privacy?” No, I said, I wanted to know how Google thought about its editorial power. “Oh,” he replied, “we’re just trying to give people the most relevant information.” Indeed, he seemed to imply, no ethics were involved or required.
I persisted: If a 9/11 conspiracy theorist searches for “9/11,” was it Google’s job to show him the Popular Mechanics article that debunks his theory or the movie that supports it? Which was more relevant? “I see what you’re getting at,” he said. “It’s an interesting question.” But I never got a clear answer.
Much of the time, as the Jargon File entry suggests, engineers resist the idea that their work has moral or political consequences at all. Many engineers see themselves as interested in efficiency and design, in building cool stuff rather than wading into messy ideological disputes and inchoate values. And it’s true that if a somewhat faster video-rendering engine has political consequences, they’re pretty obscure.
But at times, this attitude can verge on a “Guns don’t kill people, people do” mentality, a willful blindness to how design decisions affect the daily lives of millions. The fact that Facebook’s button is named Like prioritizes some kinds of information over others. The fact that Google has moved from PageRank, which is designed to show the societal consensus result, to a mix of PageRank and personalization represents a shift in how Google understands relevance and meaning.
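That shift is easier to grasp in schematic form. The sketch below is emphatically not Google’s algorithm, which is proprietary and vastly more complex; it is a hypothetical blend of a fixed consensus score with a per-user signal, echoing the 9/11 search question above. All page names and weights are invented for illustration.

```python
# Schematic sketch only: illustrates the conceptual shift from a single
# consensus ranking to a blend of consensus and per-user signals.
# Not Google's actual system; all names and numbers are hypothetical.

def consensus_score(page: str) -> float:
    """Stand-in for a PageRank-style score: identical for every user."""
    scores = {"popular-mechanics-debunk": 0.9, "conspiracy-film": 0.4}
    return scores.get(page, 0.0)

def personal_affinity(history: set, page: str) -> float:
    """Stand-in for a personalization signal: higher when the page
    resembles what this user has clicked on before."""
    return 1.0 if page in history else 0.0

def blended_score(page: str, history: set, w: float = 0.5) -> float:
    # w = 0 reproduces the old one-ranking-for-everyone model;
    # any w > 0 means two users can get different "most relevant" results.
    return (1 - w) * consensus_score(page) + w * personal_affinity(history, page)

skeptic = {"popular-mechanics-debunk"}
theorist = {"conspiracy-film"}
for page in ("popular-mechanics-debunk", "conspiracy-film"):
    print(page, blended_score(page, skeptic), blended_score(page, theorist))
```

With any nonzero personalization weight, the conspiracy theorist’s top result becomes the film that confirms his theory while the skeptic’s remains the debunking article, which is precisely the editorial question the PR manager never answered.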
This amorality would be par for the corporate course if it didn’t coincide with sweeping, world-changing rhetoric from the same people and entities. Google’s mission to organize the world’s information and make it accessible to everyone carries a clear moral and even political connotation—a democratic redistribution of knowledge from closed-door elites to the people. Apple’s devices are marketed with the rhetoric of social change and the promise that they’ll revolutionize not only your life but our society as well. (The famous Super Bowl ad announcing the release of the Macintosh computer ends by declaring that “1984 won’t be like 1984.”)