Neither agent responded. Geary hadn’t expected them to. Even he knew that one of the rules for resisting interrogation was to avoid giving even innocuous answers that would help establish baselines for the sensors monitoring every twitch in their bodies and minds. “I understand secrecy,” he told them. “I know the importance of keeping the enemy from knowing critical things. I also know that left to itself, any classification system will extend its reach, finding rationales to classify more and more. Such systems need to be controlled, or they expand to cover too many things. Secrecy should be aimed at our enemies, to keep them from knowing information that we need to protect. I find myself wondering who you think the enemy is, though.
“Tell me one thing. If you are really working for the Alliance, which exists on the basis of self-government by the citizens of its member worlds, and you are certain that this is the right thing, then why has it been kept totally secret? Why have the people of the Alliance been prevented from knowing what was being done, the malware that corrupted and controlled official software, and the dark ships, even in general terms? Is it that you don’t really believe in the principles of the Alliance and think that you have the right to dictate what people do and know, or is it that you don’t really believe that you are right? Someone who did believe in the Alliance wouldn’t depend on secrecy to prevent the people of the Alliance from deciding whether what was being done was something that agreed with their laws and their sense of right and wrong. Someone who did believe that they were right wouldn’t fear letting the people know because they would be just as certain that the people would agree with the rightness of those actions.”
Geary shook his head at them. “Whatever orders you have, whether they come from authorized sources or not, do not overrule the laws of the Alliance. If you believed that the orders you have been given were allowed by the laws of the Alliance, you would not be hiding those orders. Yet even now, even after seeing what the dark ships did at Atalia, and at Indras, and at Varandal, and what they’re trying to do here, you have given no signs of questioning those orders. The artificial intelligences controlling the dark ships could use the excuse that they can’t do better. But you could, and so far you have refused to do so. Think about that.”
His words finally drew a reaction, the man focusing on Geary and almost shouting his reply. “We need to keep the enemy from knowing our secrets because if they know what we’re doing, they can counter it!”
“You think they don’t know?” Geary asked. “Those software modifications only blinded Alliance sensors. The Syndics knew they had been attacked at Indras, and they could see who was attacking them. The only ones kept in the dark by our secrecy were our own people. Who do you think the enemy is?”
Neither one answered him this time.
Geary made a chopping gesture to Iger, waiting until the virtual windows vanished before speaking again. “Thank you, Lieutenant. I doubt any of that got through to them, but it was worth trying. Let me know the moment your people make any inroads on the dark ship systems.”
He had barely made it back to his stateroom when a call came in from the bridge. “We’ve received a message for you from the local government,” the comm officer reported.
Alliance star systems could choose their own specific forms of government, as long as they conformed to certain rules about popular representation and civil rights. Bhavan was run by an executive committee elected from a wider group of elected representatives. The entire committee appeared to be present in this message, and none of them looked happy. “We are under siege by a military force of unknown origins that refuses to communicate with us! We demand that the Alliance fleet eliminate that threat immediately! Our senators will be notified of these events and will demand an explanation from the Alliance government!”
Geary resisted the urge to point out that no one could be notified of anything until he dealt with the dark ships that were enforcing a blockade on space traffic in Bhavan. But the elected leaders of Bhavan did deserve some sort of answer. “This is Admiral Geary, in command of the First Fleet. I and the units under my command will do all that we can to defeat, destroy, and drive away the hostile warships besieging Bhavan Star System. To the honor of our ancestors, Geary, out.”
“May I speak with you, Admiral?” Dr. Nasr waited for Geary’s invitation, then entered the stateroom and took a seat in the chair Geary offered.
“Is there a medical issue of particular concern?” Geary asked, wishing that he didn’t have to worry about what had gone wrong and how bad it was, whenever someone asked to speak with him.
“There are no new medical concerns. I have been thinking.” Dr. Nasr paused to order those thoughts before continuing. “About the dark ships. Specifically, about the artificial intelligences that control them.”
“You follow AI work?” Geary asked.
“Work on artificial intelligences is, of necessity, bound up in attempting to understand natural intelligence,” Nasr explained. “Sometimes, such attempts to learn how to program that which mimics human thought provide insights into how human thought is ordered. Something has gone wrong with the AIs running the dark ships, but I believe there is a factor of which you should be informed regarding how those AIs could have gone wrong.”
Geary sat back, concentrating on Nasr’s words. “You don’t think it’s just glitches or malware?”
“I believe, Admiral,” Nasr said, choosing his words with care, “that the process of trying to create an AI embodies a critical dilemma. These remain fundamentally machines. They are programmed with very specific, absolute limits and absolute instructions. They must not do certain things. They must do other things.”
“Yes,” Geary agreed, wondering what the doctor was driving at.
“But they wish the AI to replicate human thought. Can you, Admiral, think of any absolute limits and instructions that humans literally cannot question?”
“I can think of many I would like humans to follow,” Geary said, “but there are always humans who break every rule, truth, or commandment given to them about how to behave toward themselves and others. Every human has to choose to follow whatever limits we impose on our actions.”
“Exactly.” Dr. Nasr nodded approvingly. “A lifetime of training any human will not produce a guaranteed result, no matter how firmly rules are given. Human minds have certain compulsions, but above all, as a species, human minds have flexibility. Human thought is about thinking past limits. It is about rationalizing decisions and courses of action that we want to pursue. In some ways, it works by deliberately, selectively ignoring certain aspects of what we can perceive as reality. In extremes, this is characterized as psychosis, but we all do it. It is how we function in the face of the incredible complexity that the universe presents us with. It is fundamentally irrational, and from this springs freedom to act.”
Geary nodded as well. “All right. And people who program AIs are trying to make them do that, too, correct?”
“Yes. The AIs are constructed on a foundation of rigid rules and logic. But the more programmers try to make AIs think like humans, the more the AIs have to be able to abandon rules of logic and absolute rules of any kind.” The doctor gestured toward Geary. “Do you know much of ancient programming languages? They were simple. ‘If x then y.’ Find this condition, do this. But replicating human thinking would require ‘what is x and what if x is y then what is z?’”