But it’s a response that will halt technological innovation in its tracks, at precisely the time when we need it most.
NEIL GERSHENFELD
Physicist; director, MIT’s Center for Bits & Atoms; author, Fab: The Coming Revolution on Your Desktop—from Personal Computers to Personal Fabrication
Arthur C. Clarke famously observed that “any sufficiently advanced technology is indistinguishable from magic.” That’s what I’m worried about.
2001 has come and long since gone. Once upon a time, we were going to live in a science-fiction future, with flying cars, wrist communicators, and quantum teleporters.
Oh, wait—all of that stuff is here. You can today buy cars with wings, smartphone watches, and, for those so inclined, single-photon sources to make entangled pairs. But the future brought the past with it. We can now create life from scratch and model the global climate, yet battles rage over the teaching of evolution or the human impact on the environment—battles that Darwin or Galileo would recognize as challenges to the validity of the scientific method.
There is a cognitive dissonance in fundamentalists using satellite phones in their quest for a medieval society, or in creationists who reject evolution getting flu shots based on genetic analysis of seasonal mutations in the influenza virus. These advances are linked by workings that are invisible: Deities behave in mysterious ways, and so do cell phones.
We’re in danger of becoming a cargo cult, living with the inventions of ancestors from a mythical time of stable long-term research funding. My word processor is good enough—I don’t need radical innovation in text entry to meet my writing needs. The same may be happening to us as a society. If the technologies already available can provide adequate food, shelter, heat, light, and viral videos of cute kittens, invention is no longer the imperative for survival it once was.
The risk in seeing advanced technology as magic is failing to see where it comes from. The ability to distinguish the two matters for telling progress from nonsense. We do have magic spells—I’m sure that Gandalf would be impressed by our ability to teach sand to play chess (in the form of silicon transistors), or terrified by our ability to destroy a city with a lump of (uranium) metal. But these incantations came from building predictive models based on experimental observations, not from declarations of belief. Accepting the benefits of science without accepting the methods of science offers us the freedom to ignore inconvenient truths about the environment or the economy or education. Conversely, anyone who has done any kind of technical development has had to confront an external reality that doesn’t conform to personal interpretation.
Rather than seeking to hide the workings of technology, we should seek every opportunity to expose them. The quest for technologies that work like magic is leading to a perverse kind of technical devolution. Mobile operating systems that forbid users from seeing their own file systems; touch interfaces that eliminate the use of fine motor control; cars that prevent owners from accessing maintenance data—these all make it easier to do easy things but harder to do hard things.
The challenges we face as a planet require finding highest rather than lowest common denominators. Learning curves that progress from simple to difficult skills should be sought, not avoided. My understanding is that wizards must train for years to master their spells. Any sufficiently advanced magic is indistinguishable from technology.
DAVID ROWAN
Editor, WIRED U.K.
In a big-data world, it takes an exponentially rising curve of statistics to bring home just how subjugated we now are to the data crunchers’ powers. Each day, according to IBM, we collectively generate 2.5 quintillion bytes—a tsunami of structured and unstructured data that’s growing, in the International Data Corporation’s reckoning, at 60 percent a year. Walmart drags a million hourly retail transactions into a database that long ago passed 2.5 petabytes; Facebook processes 2.5 billion pieces of content and 500 terabytes of data each day; and Google, whose YouTube division alone gains seventy-two hours of new video every minute, accumulates 24 petabytes of data in a single day. No wonder the rock star of Silicon Valley is no longer the genius software engineer but the analytically inclined, ever-more-venerated data scientist.
Certainly there are vast public benefits in the smart processing of these zetta- and yottabytes of previously unconstrained zeroes and ones. Low-cost genomics allows oncologists to target tumors ever more accurately, using the algorithmic magic of personalized medicine; real-time Bayesian analysis lets counterintelligence forces identify the bad guys, or at least attempt to, in new data-mining approaches to fighting terrorism. And let’s not forget the commercial advantages accruing to businesses that turn raw numbers into actionable information: According to the Economist Intelligence Unit, companies that use effective data analytics typically outperform their peers on stock markets by a factor of 250 percent.
Yet as our lives are swept unstoppably into the data-driven world, such benefits are being denied to a fast-emerging data underclass. Any citizen lacking a basic understanding of, and at least minimal access to, the new algorithmic tools will be increasingly disadvantaged in vast areas of economic, political, and social participation. The data-disenfranchised will find it harder to establish personal creditworthiness or political influence; they will be discriminated against by stock markets and social networks. We need to start seeing data literacy as a fundamental skill of 21st-century democracy, and to campaign—and perhaps even legislate—to protect the interests of those being left behind.
The data-disenfranchised suffer in two main ways. First, they face systemic disadvantages in markets that are nominally open to all. Take stock markets: Any human traders today bold enough to compete against the algorithms of high-frequency and low-latency traders should be aware of how far the odds are stacked against them. As Andrei Kirilenko, chief economist at the U.S. Commodity Futures Trading Commission, and researchers from Princeton and the University of Washington recently found, the most aggressive high-frequency traders tend to make the greatest profits—which suggests that the small investor would be wise simply to leave the machines to it. It’s no coincidence that power in a swath of other sectors is accruing to those who control the algorithms—whether the Obama campaign’s electoral “microtargeters” or the yield-raising strategists of data-fueled precision agriculture.
Second, absolute power is accruing to a small number of data superminers whose influence is matched only by their lack of accountability. Your identity is increasingly what the data oligopolists say it is: Credit agencies, employers, prospective dates, even the U.S. National Security Agency hold a fixed view of you based on your online data stream as channeled via search engines, social networks, and “influence” scoring sites, however inaccurate or outdated the results. And good luck trying to correct the errors or false impressions that are damaging your prospects. As disenfranchised users of services such as Instagram and Facebook have increasingly come to realize, it is the services, not you, that decide how your personal data will be used. The customer may indeed be the product, but there should at least be a duty for such services to clearly inform and educate customers about their lack of ownership of their own digital output.