“It’s not impossible, Phillips,” Maxon answered. “The world is only electrical and magnetic.”
“Okay,” said Phillips. “So why not?”
“You don’t understand,” said Maxon. “It is all electricity. So the question is really: Why?”
“I am not following you, Genius,” said Phillips. “You’re making it sound easy, and then acting like it’s hard.”
The machine began to spin them. At first, it was slow.
“Can it, Lieutenant. Shut up, Dr. Mann,” said Gompers, always quick to remind him that he did not have a military title. But Maxon was already talking.
“Listen. From the smallest, deepest synapses in the human brain to the interactions of galaxies with the universe, it is all electricity. If you can shape the force of electricity, you can duplicate any other impulse in the world. A robot can yawn, it can desire, it can climax. It can do exactly what a human does, in exactly the same way. You really want a robot to love you? You want it to fuck you back, when you fuck it? Just like a woman? Let me tell you: There is no difference between carbon and steel, between water and ooze. With a number of conditional statements nearing infinity, any choice can be replicated, however random. The only hard thing about creating more sophisticated AI is acquiring the space needed to hold such a myriad of possibilities. There is nothing different in a human’s brain from a robot’s brain. Not one single thing.”
By this time the machine was spinning so fast, his cheeks were flapping. The other men in the module were quiet, intense. Their eyes were all open. Their faces looked skeletal, all the skin pulled back.
“GET IT?” Maxon screeched.
And even in the pressure of all that simulated gravity, Fred Phillips found it possible to roll his eyes.
When the machine stopped, Phillips said, “Mann, dude, I feel for your wife.”
“What do you feel for her?” said Maxon.
* * *
WHY DID THE ROBOTS not love? Why not feel good about themselves, just for once? Why not prefer one entity, one electrical epicenter, over all the others, for no other reason than that it felt good to do so? Maxon knew why. They could not love because he had not made them love. He had not made them love because he didn’t understand why they should love. He didn’t understand why he should love, why anyone should love. It wasn’t logical. It wasn’t rational, because it wasn’t beneficial. That was the truth of the matter. He chose for them not to, because loving defied his central principle: If humans do it, it must be right.
To show preference only for a good reason, to accept any choice made with the best use of available information, to suspect a source of giving incorrect data when incorrect data had been received from it in the past: these responses were beneficial to the robot, to the human. To love for no reason, to grieve over a choice that had been made rationally, to forgive, to show mercy, to trust a poisoned well: these were potentially damaging. If humans do it, why do they do it?
He understood the value of a mother’s love for her child. That had a use. He understood the value of a soldier’s love for his brother-in-arms. That had a use. But the family structure was so integral to the foundation of a civilization, and the solidity of the family was so important to the civilization’s survival, that choosing a mate based on some ridiculous whim seemed insane. It seemed destructive. How could it be so? Yet he, Maxon Mann, gearshrinker, droidmaster, having decided that all romantic love is at odds with the survival of the species, had fallen, himself, in love. He had fallen deeply, hopelessly, inexorably in love with Sunny, and it had happened almost before he got started in life. Over seven thousand rotations of the Earth ago. Certainly before he understood the ramifications of his electrobiological behavior.
That night, his second night in space, the feeling of breathing in was almost crushing him, the quarters so close that taking a deep breath almost had his bony chest brushing up against the shelf that held his laptop and his mission log, stuck down with Velcro. He let his head roll back against the wall, his crisp curls brushing the back of his neck. One hand went up to cover his eyes; the other hand still held the pen, poised over those three words: love, regret, forgive. When he finally slept, lulled by a cyclical computation worked out on the back of his eyelids, the pen went scratching across the paper, one final subconscious underscore. First there was Asimov, and his fictional laws of robotics, all written to protect humanity from the AI they’d created. Then Ito’s laws, excusing the failure of programmers who wouldn’t dare to try to re-create a human mind. Now Maxon’s laws, because he was the only one left with the stones to know when to stop pushing the buttons that he himself had wired. Maxon Mann’s Three Laws of Robotics: A robot cannot love. A robot cannot regret. A robot cannot forgive.
The contractions stopped. Fluids were dripped into her. She went home. Night came and everyone slept. Morning came and the nanny took Bubber off to preschool. He went out the door with his head pointed forward, wearing his helmet, with a snack and emergency pants in a horse-shaped backpack he called “Word.” Sunny was supposed to lie down as much as possible, so she did. She lay down in her pumpkin-colored bedroom. She put her bald head down on the embroidered silk duvet cover, so carefully joined in color and historical context to the weird footstool she’d found at an estate sale, which was itself so carefully coordinated with the Morris chair in the corner, by the pumpkin-shaded light. The theme of her bedroom decor flowed around the space like a gentle ellipse through a series of perfectly oriented points. Not one curtain rod, not one shoe tree, not one alarm clock fell off the graph. On the TV, the NASA channel was playing without sound. But Maxon was not on the screen.
She fell asleep and dreamed of a matrix of all possible babies that she could be carrying at that moment. The possible babies spread out over a three-dimensional cube. At point zero, zero, zero was a normal human male baby, looking exactly like Maxon. Tall, mad-eyed, long-limbed, and pale. From there, the change in babies radiated out along a three-dimensional grid through the whole volume of the cube. At the intersection of every line was another scrawny infant, crouched and curled, naked and wrinkled. Eye color, hair color, pianist hands, knobby legs, short neck. Along this axis, more and more freckles. Along that axis, more and more hair. Of course, there cannot be an incremental change in gender. So, all alone, the baby at the opposite point of the cube, with her large alien eyes and her bald alien head, and her padded fingers and short legs, was the only female. She rotated like the other babies, but in the opposite direction. Already different.
The phone rang, waking her up. It was the director of the school.
“Mrs. Mann,” he said, “I would love for you to spend some time here with us when you pick up Bubber today. I have arranged a meeting with our staff psychiatrist for you.”
He didn’t know about the car accident, because she hadn’t told him. He didn’t know about the wig.
She sat up, held up the phone firmly to her head, and said, “No.”
“Mrs. Mann,” he droned on, “Bubber has had a meltdown this morning. Now he is back in his helmet, and we are all fine. There is no need to worry. But the behaviors we are seeing are becoming prohibitive.”
“What do they prohibit?” asked Sunny.
“With respect, Mrs. Mann, we have an extraordinary facility and many resources,” he said, “yet we cannot quite account for Bubber and his behavior.”
“But I thought Miss Mary had been working with him.” Sunny had spent a lot of money on the school. Miss Mary was one of many resources.