“You are mad. What is an indirect conspiracy?”
“I am not ready to discuss that in front of Dr. Fastolfe’s robots—unless you insist. And why should you? You know very well what I mean.” There was no reason why Baley should think she would accept this bluff. It might simply worsen the situation still further.
But it didn’t! Vasilia seemed to shrink within herself, frowning.
Baley thought: There is then an indirect conspiracy, whatever it might be, and this might hold her till she sees through my bluff.
Baley said, his spirits rising a little, “I repeat, say nothing about Dr. Fastolfe.”
But, of course, he didn’t know how much time he had bought—perhaps very little.
They were sitting in the airfoil again—all three in the front, with Baley once more in the middle and feeling the pressure on either side. Baley was grateful to them for the care they unfailingly gave him, even though they were only machines, helpless to disobey instructions.
And then he thought: Why dismiss them with a word: machines? They’re good machines in a Universe of sometimes evil people. I have no right to favor the machines vs. people subcategorization over the good vs. evil one. And Daneel, at least, I cannot think of as a machine.
Giskard said, “I must ask again, sir. Do you feel well?”
Baley nodded. “Quite well, Giskard. I am glad to be out here with you two.”
The sky was, for the most part, white—off-white, actually. There was a gentle wind and it had felt distinctly cool—until they got into the car.
Daneel said, “Partner Elijah, I was listening carefully to the conversation between yourself and Dr. Vasilia. I do not wish to comment unfavorably on what Dr. Vasilia has said, but I must tell you that, in my observation, Dr. Fastolfe is a kind and courteous human being. He has never, to my knowledge, been deliberately cruel, nor has he, as nearly as I can judge, sacrificed a human being’s essential welfare to the needs of his curiosity.”
Baley looked at Daneel’s face, which gave the impression, somehow, of intent sincerity. He said, “Could you say anything against Dr. Fastolfe, even if he were, in fact, cruel and thoughtless?”
“I could remain silent.”
“But would you?”
“If, by telling a lie, I were to harm a truthful Dr. Vasilia by casting unjustified doubt on her truthfulness, and if, by remaining silent, I were to harm Dr. Fastolfe by lending further color to the true accusations against him, and if the two harms were, to my mind, roughly equal in intensity, then it would be necessary for me to remain silent. Harm through an active deed outweighs, in general, harm through passivity—all things being reasonably equal.”
Baley said, “Then, even though the First Law states: ‘A robot may not injure a human being or, through inaction, allow a human being to come to harm,’ the two halves of the law are not equal? A fault of commission, you say, is greater than one of omission?”
“The words of the law are merely an approximate description of the constant variations in positronomotive force along the robotic brain paths, Partner Elijah. I do not know enough to describe the matter mathematically, but I know what my tendencies are.”
“And they are always to choose not doing over doing, if the harm is roughly equal in both directions?”
“In general. And always to choose truth over nontruth, if the harm is roughly equal in both directions. In general, that is.”
“And, in this case, since you speak to refute Dr. Vasilia and thus do her harm, you can only do so because the First Law is mitigated sufficiently by the fact that you are telling the truth—”
“That is so, Partner Elijah.”
“Yet the fact is, you would say what you have said, even though it were a lie—provided Dr. Fastolfe had instructed you, with sufficient intensity, to tell that lie when necessary and to refuse to admit that you had been so instructed.”
There was a pause and then Daneel said, “That is so, Partner Elijah.”
“It is a complicated mess, Daneel—but you still believe that Dr. Fastolfe did not murder Jander Panell?”
“My experience with him is that he is truthful, Partner Elijah, and that he would not do harm to friend Jander.”
“And yet Dr. Fastolfe has himself described a powerful motive for his having committed the deed, while Dr. Vasilia has described a completely different motive, one that is just as powerful and is even more disgraceful than the first.” Baley brooded a bit. “If the public were made aware of either motive, belief in Dr. Fastolfe’s guilt would be universal.”
Baley turned suddenly to Giskard. “How about you, Giskard? You have known Dr. Fastolfe longer than Daneel has. Do you agree that Dr. Fastolfe could not have committed the deed and could not have destroyed Jander, on the basis of your understanding of Dr. Fastolfe’s character?”
“I do, sir.”
Baley regarded the robot uncertainly. He was less advanced than Daneel. How far could he be trusted as a corroborating witness? Might he not be impelled to follow Daneel in whatever direction Daneel chose to take?
He said, “You knew Dr. Vasilia, too, did you not?”
“I knew her very well,” said Giskard.
“And liked her, I gather?”
“She was in my charge for many years and the task did not in any way trouble me.”
“Even though she fiddled with your programming?”
“She was very skillful.”
“Would she lie about her father—about Dr. Fastolfe, that is?”
Giskard hesitated. “No, sir. She would not.”
“Then you are saying that what she says is the truth.”
“Not quite, sir. What I am saying is that she herself believes she is telling the truth.”
“But why should she believe such evil things about her father to be true if, in actual fact, he is as kind a person as Daneel has just told me he was?”
Giskard said slowly, “She has been embittered by various events in her youth, events for which she considers Dr. Fastolfe to have been responsible and for which he may indeed have been unwittingly responsible—to an extent. It seems to me it was not his intention that the events in question should have the consequences they did. However, human beings are not governed by the straightforward laws of robotics. It is therefore difficult to judge the complexities of their motivations under most conditions.”
“True enough,” muttered Baley.
Giskard said, “Do you think the task of demonstrating Dr. Fastolfe’s innocence to be hopeless?”
Baley’s eyebrows moved toward each other in a frown. “It may be. As it happens, I see no way out—and if Dr. Vasilia talks, as she has threatened to do—”
“But you ordered her not to talk. You explained that it would be dangerous to herself if she did.”
Baley shook his head. “I was bluffing. I didn’t know what else to say.”
“Do you intend to give up, then?”
And Baley said forcefully, “No! If it were merely Fastolfe, I might. After all, what physical harm would come to him? Roboticide is not even a crime, apparently, merely a civil offense. At worst, he will lose political influence and, perhaps, find himself unable to continue with his scientific labors for a time. I would be sorry to see that happen, but if there’s nothing more I can do, then there’s nothing more I can do.
“And if it were just myself, I might give up, too. Failure would damage my reputation, but who can build a brick house without bricks? I would go back to Earth a bit tarnished, I would lead a miserable and unclassified life, but that is the chance that faces every Earthman and woman. Better men than I have had to face that, just as unjustly.”