The resulting hybrids are more Na’vi-like than human, though they have inherited some human features, such as smaller eyes and five-fingered hands. It remains to be seen whether the genetics will allow avatar-Jake and true-Na’vi Neytiri to have children…
However it’s done, growing an avatar in a tank is one thing. Now we must consider an even harder step: linking Jake Sully’s consciousness to it.
As a concrete example of the challenges involved in establishing a mental link between Jake and his avatar, let’s consider the scene in which avatar-Jake captures his great leonopteryx by falling from the sky onto its back. As this is happening, human-Jake lies motionless in his tank. And yet Jake senses everything the avatar senses, and commands every aspect of its conscious movements. He feels the impact as the avatar lands on the creature’s back, feels the surge of acceleration as the indignant leonopteryx flies off.
How could you make this work?
To some extent Jake is like a player of a virtual reality (VR) system, with the “game” being Pandora as a whole. A virtual reality system feeds what is not real into our senses, well enough to enable us to believe that it is real—or at least well enough to suspend our disbelief.
And in some respects existing systems do this pretty effectively. A music system is a VR system for the ears, fooling you into imagining there’s a rock band or a symphony orchestra in the room with you. The best modern high-fidelity systems have reached such a level of detailed simulation that the ear can’t tell the reproduction from the real thing. For sight, too, watching the movie Avatar itself in 3-D gives you a flavour of what’s possible in delivering a convincing simulation.
So suppose you constructed an “avatar” like a high-tech robot, laden with cameras, microphones and other sensors. Jake meanwhile is in a wraparound suit with earphones, goggles and maybe with sense-stimulating plugs in his nose and mouth. He is in a motion-capture system of the type Quaritch uses to control his AMP suit, with the machine’s motions aping his own body’s gestures—or like the modern Wii game system. As the leonopteryx looms below the falling robot, you could imagine an all but perfect sensory simulation of the experience being relayed to Jake by all the little cameras and microphones and other sensors: he smells the leonopteryx’s leathery stink, an aroma simulated in some miniature chemical factory, and feels the rushing air of Pandora in his face, blown by tiny fans.
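To make the relay idea concrete, here is a minimal sketch in Python of one cycle of such a loop: captured body motion goes out to the robot, and the robot’s sensor readings come back to be played into the operator’s suit. All of the names (ToyRobot, SensorFrame and so on) are invented purely for illustration; no real robotics system is implied.

```python
# A minimal sketch of the external-VR relay loop described above: the
# operator's captured motion drives the remote robot, and the robot's
# sensor readings come back to be played into the operator's suit.
# All names (ToyRobot, SensorFrame, ...) are invented for illustration.

from dataclasses import dataclass


@dataclass
class SensorFrame:
    """One tick of what the remote robot perceives."""
    image: str   # stand-in for camera data
    sound: str   # stand-in for microphone data
    smell: str   # stand-in for chemical-sensor data


@dataclass
class PoseCommand:
    """One tick of the operator's motion-captured gestures."""
    joint_angles: dict


class ToyRobot:
    """Remote body: apes the operator's pose, reports what it senses."""
    def apply(self, pose: PoseCommand) -> None:
        self.pose = pose

    def read_sensors(self) -> SensorFrame:
        return SensorFrame(image="leonopteryx below",
                           sound="rushing air",
                           smell="leathery stink")


def relay_tick(operator_pose: PoseCommand, robot: ToyRobot) -> SensorFrame:
    """One cycle of the loop: motion out, sensation back."""
    robot.apply(operator_pose)      # the machine apes Jake's gestures
    return robot.read_sensors()     # ...and reports back what it senses


frame = relay_tick(PoseCommand(joint_angles={"right_arm": 42.0}), ToyRobot())
print(frame)  # this is all the suit can ever deliver: external senses only
```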
But this is a simulation which would end in dismal disappointment as soon as the robot hit the back of the animal with a shuddering crash—and Jake felt nothing of the impact.
Oh, you could provide human-Jake in his tank with some token jolt, like the little bumps you get in a fairground-ride flight simulator. But here we’ve reached the limit of modern VR technology. We don’t know any way to build systems external to the body to simulate the inner sense of the sharp deceleration that ends a fall, or indeed the acceleration that comes with a rocket launch, say. That’s why astronauts train for zero gravity by floating around in tanks of water, or in planes which make powered falls to provide the illusion of zero gravity for a few seconds: “Vomit Comets.”
You can list plenty of other “inner” sensations Jake needs if he is to experience the avatar’s reality fully. He could be made to feel the Pandoran fruit in his hand, he could taste the juice in the avatar’s mouth—but how could he be made to feel hungry, when the avatar is hungry?
External VR systems of the kind we have today won’t be sufficient. Just as we see in the movie, it is necessary to hack into Jake’s brain to make this work.
In the link room we see Jake, preparing to drive his avatar, lie down in a “psionic link unit.” This has an architecture that looks similar to a modern medical scanner, like a magnetic resonance imager. With this, Max Patel and Grace Augustine are able to extract a three-dimensional image of Jake’s brain, complete with ongoing neural activity.
Then a data link is established between Jake’s brain and the avatar’s, as evidenced by similar-looking images in the scans. The techs speak of achieving “congruency,” as the brains are mapped one to the other. In mathematics, congruent triangles are the same shape and size; you could cut them out and overlay them exactly, though you might have to turn one over to do it. The word is also used in psychology to mean internal and external consistency of the mind. Ultimately “phase lock” is established between the two nervous systems.
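“Phase lock” is a suggestive choice of jargon, because neuroscientists really do measure phase locking between signals, using a quantity called the phase-locking value (PLV). Purely as a loose illustration of the term, and assuming nothing about how the fictional link unit works, here is a minimal Python sketch that computes the PLV between two simulated rhythms:

```python
# The phase-locking value (PLV) measures how consistently two oscillating
# signals hold the same phase relationship: 1.0 means perfectly locked,
# near 0 means unrelated. A loose illustration only, not a claim about
# the film's fictional link technology.

import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x: np.ndarray, y: np.ndarray) -> float:
    """PLV: magnitude of the mean unit vector of the phase difference."""
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return float(np.abs(np.mean(np.exp(1j * dphi))))

t = np.linspace(0.0, 1.0, 1000)
human_rhythm  = np.sin(2 * np.pi * 10 * t)         # a 10 Hz "brain rhythm"
avatar_rhythm = np.sin(2 * np.pi * 10 * t + 0.5)   # same rhythm, shifted
noise = np.random.default_rng(0).normal(size=t.size)

print(phase_locking_value(human_rhythm, avatar_rhythm))  # ~1.0: "locked"
print(phase_locking_value(human_rhythm, noise))          # much lower
```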
What is happening is that the technology is hacking into the input-output systems of Jake’s brain. When he’s outside the link unit, Jake’s brain is connected to his body by a set of neural connections. Sensory information comes flowing into the brain through these connections, and Jake’s commands for his body—lift that arm, jump from that banshee—flow out of his brain. What the link technology has to do is hack into this flow of data, and into the similar flow of data in and out of the avatar’s brain. Sensory input coming in from Jake’s own body must be ignored, and replaced with the data flowing from the avatar’s body. Similarly Jake’s motor-control commands must be diverted from his own body, and transmitted to the avatar body. And all this is done “non-invasively,” in the jargon; the scanning machine manages all this without the need to stick wires into Jake’s skull.
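In outline, then, the link unit behaves like a router for two streams of data. Here is a toy Python sketch of that switching logic; the class and all of its names are invented purely for illustration:

```python
# A toy model of the "hack": a switch that re-routes the two data flows.
# In link mode, sensory data from the avatar's body is delivered to Jake's
# brain (his own body's input is dropped), and his motor commands are
# diverted to the avatar's body instead of his own. Purely illustrative.

class LinkRouter:
    """Chooses which body exchanges data with Jake's brain."""
    def __init__(self, linked: bool = False):
        self.linked = linked

    def route_sensory(self, from_own_body: str, from_avatar_body: str) -> str:
        # In link mode, input from Jake's own body is ignored and
        # replaced with the data flowing from the avatar's body.
        return from_avatar_body if self.linked else from_own_body

    def route_motor(self, command: str) -> tuple[str, str]:
        # In link mode, motor commands are diverted away from Jake's
        # own body and transmitted to the avatar's body instead.
        target = "avatar body" if self.linked else "own body"
        return target, command

router = LinkRouter(linked=True)
print(router.route_sensory("tank: warm saline", "Pandora: rushing air"))
print(router.route_motor("jump from the banshee"))  # -> ('avatar body', ...)
```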
This resolves the problem of inner sensation. It’s as if Jake’s brain has been physically implanted in the avatar’s body. Signals arising from the avatar’s inner proprioceptive senses of falling and then slamming to a halt aboard the leonopteryx are now sent direct to Jake’s brain, so that he “feels” the impact in a way he never could using an external suit.
So that’s the principle. What about the practice? Is this feasible?
Something like the avatar-link process has been studied in the context of “neuroinformatics.” “Mind uploading” is the process of scanning and mapping a biological brain in detail and transferring that data to a computer, or another machine. Clearly this is like half of an avatar link, with a link to a computer store rather than directly to another brain. And it is like the fate of Grace Augustine, when as her human body dies she passes through the “Eye of Eywa,” to become one with the Great Mother—that is, her consciousness is stored in Pandora’s great biological computer. (In this case Eywa was meant to be used as a temporary buffer; Grace’s mind was supposed to return through the Eye of Eywa and then enter her avatar body.)
We have taken some baby steps towards this kind of technology today. In “neuroprosthetics” the nervous system is connected directly to some device. And through a “brain–computer interface” (BCI—a variant is BMI, for brain–machine interface) the brain itself is connected to a computer. Research in the field began in earnest in the 1970s at the University of California, Los Angeles, where the term BCI was first coined.
The first neuroprosthetic applications have been medical, with the aim being the repair of damaged human sensory or motor functions. There have been some attempts to use this technology as an alternative way to treat spinal injuries, like Jake Sully’s. A non-profit consortium called the Walk Again Project has a five-year goal of helping a quadriplegic paralysed by a spinal injury to walk again; the patient would use neuroprosthetic devices to control an exoskeleton, with an interface reading control signals from the brain to pass to the hardware. The current leading BCI technology is called BrainGate, in which an array of microelectrodes is implanted in the primary motor centre of the brain. In 2008 researchers at the University of Pittsburgh Medical Center were able to show a monkey operating a robotic arm, with the relevant data being read from the animal’s brain through an invasive implant.
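The decoding step in such systems can be illustrated simply. The Python sketch below fits a linear decoder from synthetic firing rates to arm velocity by least squares, a rough stand-in for what population decoders do; the channel count, the numbers and every other detail here are assumptions for illustration, not a description of BrainGate’s actual algorithm.

```python
# A minimal sketch of linear neural decoding: fit weights, by least
# squares, that map the firing rates of recorded neurons to intended
# movement. Synthetic data throughout; real decoders add filtering,
# recalibration and much more.

import numpy as np

rng = np.random.default_rng(1)
n_samples, n_neurons = 500, 96   # 96 channels, assumed for illustration

# Synthetic training data: spike counts per time bin, plus the arm
# velocity (vx, vy) that accompanied them.
true_weights = rng.normal(size=(n_neurons, 2))
rates = rng.poisson(5.0, size=(n_samples, n_neurons)).astype(float)
velocity = rates @ true_weights + rng.normal(scale=0.5, size=(n_samples, 2))

# Fit the decoder: least-squares weights from firing rates to velocity.
weights, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decode a fresh burst of neural activity into a movement command.
new_rates = rng.poisson(5.0, size=(1, n_neurons)).astype(float)
vx, vy = (new_rates @ weights)[0]
print(f"decoded arm velocity: ({vx:.2f}, {vy:.2f})")
```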