Baley said, “I’m afraid I don’t, either.”
“But I know what it means. I can’t explain it, but I feel the explanation without being able to put it into words, which may be why I have achieved results my colleagues have not. However, I grow grandiose, which is a good sign I should become prosaic. To imitate a human brain, when I know almost nothing about the workings of the human brain, needs an intuitive leap—something that feels to me like poetry. And the same intuitive leap that would give me the humaniform positronic brain should surely give me a new access of knowledge about the human brain itself. That was my belief—that through humaniformity I might take at least a small step toward the psychohistory I told you about.”
“I see.”
“And if I succeeded in working out a theoretical structure that would imply a humaniform positronic brain, I would need a humaniform body to place it in. The brain does not exist by itself, you understand. It interacts with the body, so that a humaniform brain in a nonhumaniform body would become, to an extent, itself nonhuman.”
“Are you sure of that?”
“Quite. You have only to compare Daneel with Giskard.”
“Then Daneel was constructed as an experimental device for furthering the understanding of the human brain?”
“You have it. I labored two decades at the task with Sarton. There were numerous failures that had to be discarded. Daneel was the first true success and, of course, I kept him for further study—and out of”—he grinned lopsidedly, as though admitting to something silly—“affection. After all, Daneel can grasp the notion of human duty, while Giskard, with all his virtues, has trouble doing so. You saw.”
“And Daneel’s stay on Earth with me, three years ago, was his first assigned task?”
“His first of any importance, yes. When Sarton was murdered, we needed something that was a robot and could withstand the infectious diseases of Earth and yet looked enough like a man to get around the antirobotic prejudices of Earth’s people.”
“An astonishing coincidence that Daneel should be right at hand at that time.”
“Oh? Do you believe in coincidences? It is my feeling that any time at which a development as revolutionary as the humaniform robot came into being, some task that would require its use would present itself. Similar tasks had probably been presenting themselves regularly in all the years that Daneel did not exist—and because Daneel did not exist, other solutions and devices had to be used.”
“And have your labors been successful, Dr. Fastolfe? Do you now understand the human brain better than you did?”
Fastolfe had been moving more and more slowly and Baley had been matching his progress to the other’s. They were now standing still, about halfway between Fastolfe’s establishment and the other’s. It was the most difficult point for Baley, since it was equally distant from protection in either direction, but he fought down the growing uneasiness, determined not to provoke Giskard. He did not wish—by some motion or outcry or even expression—to activate the inconvenience of Giskard’s desire to save him. He did not want to have himself lifted up and carried off to shelter.
Fastolfe showed no sign of understanding Baley’s difficulty. He said, “There’s no question but that advances in mentology have been carried through. There remain enormous problems and perhaps these will always remain, but there has been progress. Still—”
“Still?”
“Still, Aurora is not satisfied with a purely theoretical study of the human brain. Uses for humaniform robots have been advanced that I do not approve of.”
“Such as the use on Earth.”
“No, that was a brief experiment that I rather approved of and was even fascinated by. Could Daneel fool Earthpeople? It turned out he could, though, of course, the eyes of Earthmen for robots are not very keen. He cannot fool the eyes of Aurorans, though I dare say future humaniform robots could be improved to the point where they would. There are other tasks that have been proposed, however.”
“Such as?”
Fastolfe gazed thoughtfully into the distance. “I told you this world was tame. When I began my movement to encourage a renewed period of exploration and settlement, it was not to the supercomfortable Aurorans—or Spacers generally—that I looked for leadership. I rather thought we ought to encourage Earthmen to take the lead. With their horrid world—excuse me—and short life-span, they have so little to lose that I thought they would surely welcome the chance, especially if we were to help them technologically. I spoke to you about such a thing when I saw you on Earth three years ago. Do you remember?” He looked sidelong at Baley.
Baley said stolidly, “I remember quite well. In fact, you started a chain of thought in me that has resulted in a small movement on Earth in that very direction.”
“Indeed? It would not be easy, I imagine. There is the claustrophobia of you Earthmen, your dislike of leaving your walls.”
“We are fighting it, Dr. Fastolfe. Our organization is planning to move out into space. My son is a leader in the movement and I hope the day may come when he leaves Earth at the head of an expedition to settle a new world. If we do indeed receive the technological help you speak of—” Baley let that dangle.
“If we supplied the ships, you mean?”
“And other equipment. Yes, Dr. Fastolfe.”
“There are difficulties. Many Aurorans do not want Earthmen to move outward and settle new worlds. They fear the rapid spread of Earthish culture, its beehive Cities, its chaoticism.” He stirred uneasily and said, “Why are we standing here, I wonder? Let’s move on.”
He walked slowly forward and said, “I have argued that that would not be the way it would be. I have pointed out that the settlers from Earth would not be Earthmen in the classical mode. They would not be enclosed in Cities. Coming to a new world, they would be like the Auroran Fathers coming here. They would develop a manageable ecological balance and would be closer to Aurorans than to Earthmen in attitude.”
“Would they not then develop all the weaknesses you find in Spacer culture, Dr. Fastolfe?”
“Perhaps not. They would learn from our mistakes.—But that is academic, for something has developed which makes the argument moot.”
“And what is that?”
“Why, the humaniform robot. You see, there are those who see the humaniform robot as the perfect settler. It is they who can build the new worlds.”
Baley said, “You’ve always had robots. Do you mean this idea was never advanced before?”
“Oh, it was, but it was always clearly unworkable. Ordinary nonhumaniform robots, without immediate human supervision, building a world that would suit their own nonhumaniform selves, could not be expected to tame and build a world that would be suitable for the more delicate and flexible minds and bodies of human beings.”
“Surely the world they would build would serve as a reasonable first approximation.”
“Surely it would, Mr. Baley. It is a sign of Auroran decay, however, that there is an overwhelming feeling among our people that a reasonable first approximation is unreasonably insufficient.—A group of humaniform robots, on the other hand, as closely resembling human beings in body and mind as possible, would succeed in building a world which, in suiting themselves, would also inevitably suit Aurorans. Do you follow the reasoning?”
“Completely.”
“They would build a world so well, you see, that when they are done and Aurorans are finally willing to leave, our human beings will step out of Aurora and into another Aurora. They will never have left home; they will simply have another newer home exactly like the other one, in which to continue their decay. Do you follow that reasoning, too?”