Regulatory capture is just a small part of the story. In my own profession, journalism, we like to think of ourselves as watchdogs, fierce defenders of the public good. But we, too, are being captured by the industries we’re supposed to keep watch on. There’s journalistic capture just as there’s regulatory capture. It’s most marked in fields such as tech reporting, business reporting, White House reporting—fields where you’re afraid of losing access to your subjects, where you depend on the industry to feed you stories, where your advertising revenue comes from the very people you’re supposed to critique. In all of these fields, you can find numerous reporters who are functionally controlled by the people they’re supposed to keep watch over. Even on my own beat (especially on my own beat!), science reporting, we’re captured. The elaborate system of embargoes and privileged press releases set up by scientific journals and scientific agencies ensures that we report not just what they want but how and when they want it. We’ve unwittingly shifted our allegiance from the public we’re supposed to serve to the people we’re supposed to investigate.
Capture is a bigger threat than even Stigler first realized. Any profession that depends to some degree on objectivity and whose work affects the fortunes of a group of people with power and money is subject to capture. Science, a field in which objectivity is paramount, is far from immune. There’s evidence that medical researchers who take money from industry tend to see the natural world in a more positive light: In their experiments, drugs seem to work better, patients seem to survive longer, and side effects seem less dangerous. Yet few scientists, even those taking tens or hundreds of thousands of dollars from drug companies or medical-device manufacturers, think they serve any master but Truth with a capital T. That’s what worries me the most about capture: You never know when you’re a captive.
THE TRIUMPH OF THE VIRTUAL
MIHALY CSIKSZENTMIHALYI
Distinguished Professor of Psychology & Management; founder & codirector, Quality of Life Research Center, Claremont Graduate University; author, A Life Worth Living: Contributions to Positive Psychology
I tried to rank my fears in order of their severity, but I soon realized I would not complete that initial task before the submission deadline, so I used a random number generator to choose among them. The choice turned out to be apt: the fear that in one or two generations children will grow up to be adults unable to tell reality from imagination. Of course, humanity has always had a precarious hold on reality, but it looks like we’re headed for a quantum leap into an abyss of insubstantiality.
I don’t know if you have been following the launch of the new 3-D version of one of the major multiplayer video games, replete with monsters, orcs, slavering beasts, and all sorts of unsavory characters brandishing lethal weapons. To survive in this milieu, the player needs fast responses and a quick trigger finger. And now let’s reflect on what happens when children start playing such games before entering school and continue to do so into their teens. A child learns about reality through experiences first, not through lectures and books. The incessant warfare a child takes part in is not virtual to him—it is his reality. The events on the screen are more real than the War of Independence or World War II. At a superficial cognitive level he is aware the game is only a virtual reality, but at a deeper, emotional level he knows it is not. After all, it is happening to him.
It’s true that some of the oldest and most popular games are based on forms of mayhem. Chess, for instance, consists in eliminating and immobilizing the enemy forces: infantry, cavalry, messengers, troops mounted on elephants, and the redoubtable queen. (The last was actually a misunderstanding: The Persian inventors of the game gave the most important warrior the title of “Vizier,” after the designation for the commanders of the Persian army; the French Crusaders who learned the game as they wandered through the Near East thought the piece was called Vierge, after the Virgin Mary; upon their return to Europe, the Virgin became the Queen.) But chess, although it can be an obsession, can never be confused with the rest of reality by a sane person. The problem with the new gaming technology is that it has become so realistic that with enough time and with little competition from the child’s environment (which tends to be safe, boring, and predictable), it can erase the distinction between virtual and real. It is then a short step for a young man on the brink of sanity to get ahold of one of the various attack weapons so conveniently available and go on a shooting spree that is a mere continuation of what he has been doing “virtually” for years.
A few decades ago, I started doing research and writing about the impact of indiscriminate television watching, especially on children. Then, as interactive video games began to appear on the market, it seemed that electronic technology was finally becoming more child-friendly: Instead of passively watching inane content, children would now have a chance to become engaged in stimulating activities. What I did not have the sense to imagine was that the engagement offered by the new technology would become a Pandora’s box containing bait for the reptilian brain to feast on. What scares me now is that children experiencing such reality are going to create a really real world like the one Hieronymus Bosch envisioned—full of spidery creatures, melting objects, and bestial humans.
NICHOLAS G. CARR
Author, The Shallows: What the Internet Is Doing to Our Brains
I’m concerned about time—the way we’re warping it and it’s warping us. Human beings, like other animals, seem to have remarkably accurate internal clocks. Take away our wristwatches and our cell phones and we can still make pretty good estimates about time intervals. But that faculty can also be easily distorted. Our perception of time is subjective; it changes with our circumstances and our experiences. When things are happening quickly all around us, delays that would otherwise seem brief begin to seem interminable. Seconds stretch out. Minutes go on forever. “Our sense of time,” observed William James in his 1890 masterwork The Principles of Psychology, “seems subject to the law of contrast.”
In a 2009 article in the Philosophical Transactions of the Royal Society, the French psychologists Sylvie Droit-Volet and Sandrine Gil described what they call the paradox of time: “Although humans are able to accurately estimate time as if they possess a specific mechanism that allows them to measure time,” they wrote, “their representations of time are easily distorted by the context.” They describe how our sense of time changes with our emotional state. When we’re agitated or anxious, for example, time seems to crawl; we lose patience. Our social milieu, too, influences the way we experience time. Studies suggest, write Droit-Volet and Gil, “that individuals match their time with that of others.” The “activity rhythm” of those around us alters our own perception of the passing of time.
Given what we know about the variability of our time sense, it seems clear that information and communication technologies would have a particularly strong effect on personal time perception. After all, they often determine the pace of the events we experience, the speed with which we’re presented with new information and stimuli, and even the rhythm of our social interactions. That’s long been true, but the influence must be particularly strong now that we carry powerful and extraordinarily fast computers around with us. Our gadgets train us to expect near instantaneous responses to our actions, and we quickly get frustrated and annoyed at even brief delays.