Similarly, in 2006, with Bertrand Thirion, we tested the theory that the visual areas of the cortex act as an internal visual blackboard onto which mental images are projected. Indeed, by measuring their activity, we managed to decode the rough shape of what a person had seen, and even of what she had imagined in her mind’s eye, in full darkness (“Inverse retinotopy: Inferring the visual content of images from brain activation patterns,” NeuroImage 33:4, 1104–16, 2006).
Jack Gallant, at Berkeley, later improved this technique to the point of decoding entire movies from the traces they evoke in the cortex. His reconstruction of the coarse contents of a film, as deduced by monitoring the spectator’s brain, was an instant YouTube hit.
Why, then, do I refuse to worry that the CIA could harness these techniques to monitor my thoughts? Because many limitations still hamper their practical application in everyday circumstances. First of all, they require a ten-ton superconducting MR magnet filled with liquid helium—an unlikely addition to airport security portals. Furthermore, functional MRI works only with a cooperative volunteer who attends to the protocol and stays perfectly still: even a millimeter of head motion, especially if it occurs in tight correlation with the scanning protocol, can ruin a brain scan. In the unlikely event that you are scanned against your will, rolling your eyes rhythmically or moving your head ever so slightly in sync with the stimuli may suffice to prevent detection. In the case of an electroencephalogram, clenching your teeth will go a long way. And systematically thinking of something else will, of course, disrupt the decoding.
Finally, there are limitations arising from the nature of the neural code. MRI samples brain activity on a coarse spatial scale and in an indirect manner. Every millimeter-sized pixel in a brain scan averages over the activity of hundreds of thousands of neurons. Yet the precise neural code that contains our detailed thoughts presumably lies in the fast timing of individual spikes from thousands of intermingled neurons—microscopic events we cannot see without opening the skull. In truth, even if we could, the exact way in which thoughts are encoded would still escape us. Crucially, neuroscience lacks even the inkling of a theory as to how the complex combinatorial ideas afforded by the syntax of language are encoded in neural networks. Until we have one, we have very little chance of decoding nested thoughts such as “I think that X…,” “My neighbor thinks that X…,” “I used to believe that X…,” “He thinks that I think that X…,” “It is not true that X…,” and so on.
There’s no guarantee, of course, that these problems will not be solved—next week or next century, perhaps using electronic implants or miniaturized electromagnetic recording devices. Should we worry then? Millions of people will rejoice instead. They are the many patients with brain lesions whose lives may soon change thanks to brain technologies. In a motivated patient, decoding the intention to move an arm is far from impossible, and it may allow a quadriplegic to regain his or her autonomy, for instance by controlling a computer mouse or a robotic arm. My laboratory is currently working on an EEG-based device that decodes the residual brain activity of patients in a coma or vegetative state and helps doctors decide whether consciousness is present or will soon return. Such valuable medical applications are the future of brain imaging, not the devilish sci-fi devices that we wrongly worry about.
ANTON ZEILINGER
Physicist, University of Vienna; scientific director, Institute of Quantum Optics & Quantum Information, Austrian Academy of Sciences; author, Dance of the Photons: From Einstein to Quantum Teleportation
What I worry most about is that we are increasingly losing the formal and informal bridges between different intellectual, mental, and humanistic approaches to seeing the world.
Consider Europe in the first third of the 20th century. Vienna at that time was a hotspot for art, science, literature, music, psychology, and many other disciplines. Johannes Brahms, for example, gave music lessons to the Wittgenstein family, and the Vienna Circle of logical positivism, created by mathematicians and philosophers, gave all of us a new way to look at some of the most fundamental questions.
Another example is Erwin Schrödinger, the founder of wave mechanics. He writes in his autobiographical notes of how he nearly became professor of physics in Czernowitz, in the Bukowina (in today’s Ukraine). There he would have had to teach physics to engineers, and that, he writes, would have given him a lot of spare time to devote to philosophy.
In today’s world, all these activities—scientific, artistic, whatever—have been compartmentalized to an unprecedented degree. There are fewer and fewer people able to bridge the many gaps. Fields of expertise and activity become narrower and narrower; new gaps open all the time. Part of the cause is certainly the growth of the Internet, which typically provides immediate answers to small questions; the narrower the question, the better the answer. Deep analysis is an endeavor that by its very essence is entirely different from browsing the Web.
I worry that this trend—this narrowing—will continue. And I worry that in the end we will lose significant parts of our cultural heritage and therefore our very identity as humans.
C. P. SNOW’S TWO CULTURES AND THE NATURE-NURTURE DEBATE
SIMON BARON-COHEN
Psychologist, Autism Research Centre, Cambridge University; author, The Science of Evil: On Empathy and the Origins of Cruelty
More than fifty years have passed since C. P. Snow gave the Rede Lecture in the Senate House at Cambridge University. It was May 7, 1959, when he aired his worry that the majority of his colleagues in the humanities were scientifically illiterate and the majority of his colleagues who were scientists were uninterested in literature. He feared that two cultures had emerged and were less and less able to understand each other. By way of graphic illustration, Snow argued that scientists would struggle to read a Charles Dickens novel and most humanities professors would be unable to state the second law of thermodynamics. “So the great edifice of modern physics goes up,” he declared, “and the majority of the cleverest people in the Western world have about as much insight into it as their Neolithic ancestors would have had.”
Snow was by training a scientist who turned his hand to writing novels, exemplifying that rare breed of person who attempts to straddle both cultures. In 1962, the Cambridge professor of literature F. R. Leavis wrote scathingly of Snow’s lack of ability as a novelist, in an effort to rubbish his “two cultures” argument. Leavis’s attack was rightly dismissed as ad hominem. But was Snow correct?
If he was, then given the remarkable rate of progress in science over the last fifty years, the gulf between these two cultures may have widened. On the other hand, through the efforts of John Brockman and other literary agents and publishers who have encouraged scientists to communicate to the wider public, creating the so-called third culture, science is now very accessible to nonscientists. So has the gap between Snow’s two cultures become wider or narrower?
I think the answer is both. The gap has narrowed thanks to wonderful books like Steven Pinker’s The Language Instinct. It should now be virtually impossible for a linguist to see language as just a product of culture rather than also a product of our genes. Pinker’s book epitomizes what the third culture should be like, illustrating the complex interplay between biology and culture in producing human behavior. Scientists find the idea of a biology/culture interaction unsurprising, almost truistic. As a psychologist, I can think of few, if any, examples of human behavior that are entirely the result of culture, and I assume that most people interested in human behavior adopt the same moderate position of acknowledging a biology/environment interaction. To be a hard-core biological determinist or a hard-core social determinist seems extreme.