I always liked how those town meetings worked. But it wasn’t until I read On Dialogue that I fully understood what they accomplished.
Born to Hungarian and Lithuanian Jewish furniture store owners in Wilkes-Barre, Pennsylvania, David Bohm came from humble roots. But when he arrived at the University of California–Berkeley, he quickly fell in with a small group of theoretical physicists, under the direction of Robert Oppenheimer, who were racing to build the atomic bomb. By the time he died at seventy-two in October 1992, many of his colleagues would remember Bohm as one of the great physicists of the twentieth century.
But if quantum math was his vocation, there was another matter that took up much of Bohm’s time. Bohm was preoccupied with the problems created by advanced civilization, especially the possibility of nuclear war. “Technology keeps on advancing with greater and greater power, either for good or for destruction,” he wrote. “What is the source of all this trouble? I’m saying that the source is basically in thought.” For Bohm, the solution became clear: It was dialogue. In 1992, one of his definitive texts on the subject was published.
To communicate, Bohm wrote, literally means to make something common. And while sometimes this process of making common involves simply sharing a piece of data with a group, more often it involves the group’s coming together to create a new, common meaning. “In dialogue,” he writes, “people are participants in a pool of common meaning.”
Bohm wasn’t the first theorist to see the democratic potential of dialogue. Jürgen Habermas, the dean of media theory for much of the twentieth century, had a similar view. For both, dialogue was special because it provided a way for a group of people to democratically create their culture and to calibrate their ideas in the world. In a way, you couldn’t have a functioning democracy without it.
Bohm saw an additional reason why dialogue was useful: It provided people with a way of getting a sense of the whole shape of a complex system, even the parts that they didn’t directly participate in. Our tendency, Bohm says, is to rip apart and fragment ideas and conversations into bits that have no relation to the whole. He used the example of a watch that has been shattered: Unlike the parts that made up the watch previously, the pieces have no relation to the watch as a whole. They’re just little bits of glass and metal.
It’s this quality that made the Lincolnville town meetings something special. Even if the group couldn’t always agree on where to go, the process helped to develop a shared map for the terrain. We came to understand our relationship to the whole. And that, in turn, makes democratic governance possible.
The town meetings had another benefit: They equipped us to deal more handily with the problems that did emerge. In the science of social mapping, the definition of a community is a set of nodes that are densely interconnected—my friends form a community if they not only know me but also have independent relationships with one another. Communication builds stronger community.
Ultimately, democracy works only if we citizens are capable of thinking beyond our narrow self-interest. But to do so, we need a shared view of the world we cohabit. We need to come into contact with other people’s lives and needs and desires. The filter bubble pushes us in the opposite direction—it creates the impression that our narrow self-interest is all that exists. And while this is great for getting people to shop online, it’s not great for getting people to make better decisions together.
“The prime difficulty” of democracy, John Dewey wrote, “is that of discovering the means by which a scattered, mobile, and manifold public may so recognize itself as to define and express its interests.” In the early days of the Internet, this was one of the medium’s great hopes—that it would finally offer a forum whereby whole towns—and indeed countries—could co-create their culture through discourse. Personalization has given us something very different: a public sphere sorted and manipulated by algorithms, fragmented by design, and hostile to dialogue.
Which raises an important question: Why would the engineers who designed these systems want to build them this way?
SOCRATES: Or again, in a ship, if a man having the power to do what he likes, has no intelligence or skill in navigation [αρετης κυβερνητικης, aretēs kybernētikēs], do you see what will happen to him and to his fellow-sailors?
—Plato, First Alcibiades, the earliest known use of the word cybernetics
It’s the first fragment of code in the code book, the thing every aspiring programmer learns on day one. In the C++ programming language, it looks like this:
#include <iostream>

int main()
{
    std::cout << "Hello, World!" << std::endl;
    return 0;
}
Although the code differs from language to language, the result is the same: a single line of text against a stark white screen:
Hello, World!
A god’s greeting to his invention—or perhaps an invention’s greeting to its god. The delight you experience is electric—the current of creation, running through your fingers into the keypad, into the machine, and back out into the world. It’s alive!
That every programmer’s career begins with “Hello, World!” is not a coincidence. It’s the power to create new universes, which is what often draws people to code in the first place. Type in a few lines, or a few thousand, strike a key, and something seems to come to life on your screen—a new space unfolds, a new engine roars. If you’re clever enough, you can make and manipulate anything you can imagine.
“We are as Gods,” wrote futurist Stewart Brand on the cover of his Whole Earth Catalog in 1968, “and we might as well get good at it.” Brand’s catalog, which sprang out of the back-to-the-land movement, was a favorite among California’s emerging class of programmers and computer enthusiasts. In Brand’s view, tools and technologies turned people, normally at the mercy of their environments, into gods in control of them. And the computer was a tool that could become any tool at all.
Brand’s impact on the culture of Silicon Valley and geekdom is hard to overestimate—though he wasn’t a programmer himself, his vision shaped the Silicon Valley worldview. As Fred Turner details in the fascinating From Counterculture to Cyberculture, Brand and his cadre of do-it-yourself futurists were disaffected hippies—social revolutionaries who were uncomfortable with the communes sprouting up in Haight-Ashbury. Rather than seeking to build a new world through political change, which required wading through the messiness of compromise and group decision making, they set out to build a world on their own.
In Hackers, his groundbreaking history of the rise of engineering culture, Steve Levy points out that this ideal spread from the programmers themselves to the users “each time some user flicked the machine on, and the screen came alive with words, thoughts, pictures, and sometimes elaborate worlds built out of air—those computer programs which could make any man (or woman) a god.” (In the era described by Levy’s book, the term hacker didn’t have the transgressive, lawbreaking connotations it acquired later.)
The God impulse is at the root of many creative professions: Artists conjure up color-flecked landscapes, novelists build whole societies on paper. But it’s always clear that these are creations: A painting doesn’t talk back. A program can, and the illusion that it’s “real” is powerful. Eliza, one of the first rudimentary AI programs, was programmed with a battery of therapist-like questions and some basic contextual cues. But students spent hours talking to it about their deepest problems: “I’m having some troubles with my family,” a student might write, and Eliza would immediately respond, “Tell me more about your family.”