Suddenly she felt a little sick, but she forced herself to keep all hint of that out of her voice and expression. She told herself she was imposing human attributes on Kaelor, investing him with characteristics and emotions he simply did not have. There was no practical difference between having him up on that rack and having a malfunctioning aircar up on a hydraulic lift in a repair shop. She told herself all of that, and more, but she did not believe a word of it. She forced herself to look steadily, coolly, at Kaelor, and she addressed him. “Kaelor, do you know who I am?”
“Yes, of course. You are Dr. Fredda Leving, the roboticist.”
“Quite right. Now then, I am going to give you an order. You are to answer all my questions, and answer them as briefly as possible. Do not provide any information I do not ask for, or volunteer any information. Regard each question by itself. The questions will not be related to each other. Do you understand?” she asked.
“Certainly,” said Kaelor.
“Good.” Fredda was hoping, without much confidence, that she would be able to ask her questions in small enough pieces that no one question would present a First Law violation. And of course the questions would be related—that part was a baldfaced lie. But it might be a convincing enough lie to help Kaelor live through this. She knew for certain that asking, straight-out, the one big question to which they needed an answer would be absolutely catastrophic. She dared not ask for the big picture. She could only hope Kaelor would be willing and able to provide enough tiny pieces of the puzzle.
The trouble was, Kaelor had to know what she was doing as well as she did. How far would he be able to go before First Law imperative overrode the Second Law compulsion to obey orders?
There was one last thing she could do to help Kaelor. Fredda did not have any realistic hope that the Third Law’s requirement for self-preservation would help sustain Kaelor, but she could do her best to reinforce it all the same. “It is also vital for you to remember that you are important as well. Dr. Lentrall needs you, and he very much wants you to continue in his employ. Isn’t that so, Doctor?”
Lentrall looked up from the hole he was staring at in the floor, and glanced at Fredda before settling his gaze on Kaelor. “Absolutely,” he said. “I need you very much, Kaelor.”
“Thank you for saying so,” Kaelor said. He turned his gaze back on Fredda. “I am ready for your questions,” he said.
“Good,” said Fredda. It might well help Kaelor if she kept the questions as disordered as possible, and tossed in a few unrelated ones now and then. “You work for Dr. Lentrall, don’t you?” she asked.
“Yes,” said Kaelor.
“How long have you been in his employ?”
“One standard year and forty-two days.”
“What are the specifications for your on-board memory system?”
“A capacity of one hundred standard years non-erasable total recall for all I have seen and heard and learned.”
“Do you enjoy your work?”
“No,” said Kaelor. “Not for the most part.”
An unusual answer for a robot. Generally a robot, when given the chance, would wax lyrical over the joys of whatever task it was performing.
“Why do you not enjoy your work?” Fredda asked.
“Dr. Lentrall is often abrupt and rude. He will often ask for my opinion and then reject it. Furthermore, much of my work in recent days has involved simulations of events that would endanger humans.”
Uh-oh, thought Fredda. Clearly it was a mistake to ask that follow-up question. She would have to reinforce his knowledge of the lack of danger, and then change the subject, fast, before he could pursue that line of thought. Thank Space she had turned down his pseudo-clock-rate. “Simulations involve no actual danger to humans,” she said. “They are imaginary, and have no relation to actual events. Why did you grab Dr. Lentrall and force him under a bench yesterday?”
“I received a hyperwave message that he was in danger. First Law required me to protect him, so I did.”
“And you did it well,” Fredda said. She was trying to establish the point that his First Law imperatives were working well. In a real-life, nonsimulated situation, he had done the proper thing. “What is the status of your various systems, offered in summary form?”
“My positronic brain is functioning within nominal parameters, though near the acceptable limit for First Law-Second Law conflict. All visual and audio sensors and communications systems are functioning at specification. All processing and memory systems are functioning at specification. A Leving Labs model 2312 Robotic Test Meter is jacked into me and running constant baseline diagnostics. All motion and sensation below my neck, along with all hyperwave communication, have been cut off by the test meter, and I am incapable of motion or action other than speech, sight, thought, and motion of my head.”
“Other than the functions currently deactivated by the test meter, deliberate deactivations, and normal maintenance checks, have you always operated at specification?”
“Yes,” said Kaelor. “I remember everything.”
Fredda held back from the impulse to curse out loud, and forced herself to keep her professional demeanor. He had violated her order not to volunteer information, and had volunteered it in regard to the one area they cared about. Only a First Law imperative could have caused him to do such a thing. He knew exactly what they were after, and he was telling them, as best he could under the restrictions she had placed on him, that he had it.
Which meant he was not going to let them have it. They had lost. Fredda decided to abandon her super-cautious approach, and move more quickly toward what they needed.
“Do you remember the various simulations Dr. Lentrall performed, and the data upon which they were based?”
“Yes,” Kaelor said again. “I remember everything.”
A whole series of questions she dared not ask flickered through her mind, along with the answers she dared not hear from Kaelor. Like a chess player who could see checkmate eight moves ahead, she knew how the questions and answers would go, almost word for word.
Q: If you remember everything, you recall all the figures and information you saw in connection with your work with Dr. Lentrall. Why didn’t you act to replace as many of the lost datapoints as possible last night when Dr. Lentrall discovered his files were gone? Great harm would be done to his work and career if all those data were lost for all time.
A: Because doing so would remind Dr. Lentrall that I witnessed all his simulations of the Comet Grieg operation and that I therefore remembered the comet’s positional data. I could not provide that information, as it would make the comet intercept and retargeting possible, endangering many humans. That outweighed the possible harm to one man’s career.
Q: But the comet impact would enhance the planetary environment, benefiting many more humans in the future, and allowing them to live longer and better lives. Why did you not act to do good to those future generations?
A: I did not act for two reasons. First, I was specifically designed with a reduced capacity for judging the Three-Law consequences of hypothetical circumstances. I am incapable of considering the future and hypothetical well-being of human beings decades or centuries from now, most of whom do not yet exist. Second, the second clause of the First Law merely requires me to prevent injury to humans. It does not require me to perform any acts in order to benefit humans, though I can perform such acts if I choose. I am merely compelled to prevent harm to humans. Action compelled by First Law supersedes any impulse toward voluntary action.