“Wait, wait, wait…” said Stuyvesant. “What if there are biologics in here?”
“That’d be another unnecessary mousetrap,” Clover said.
“They could be unintentional—”
“We haven’t seen anything unintentional so far…. I believe the air will be safe.”
“I’m going to unseal,” Barnes said. “Sally, stand by.”
They all watched as he unclipped the faceplate on his helmet and took a deep breath. And held it. And let it out and took another. He took a few more breaths and licked his lips, tasting the air, and finally nodded. He pushed the faceplate closed again, resealed it, and said, “The air seems to be okay, but I want everybody to stay sealed. When we get back, I’ll go into isolation to check for biologics. Let’s go to Post-its.”
They all reached down and threw switches on the legs of their EVA suits. When upward pressure was placed on a boot, a pressure switch would cut the electrical charge and the boot would peel away from the floor with about the same resistance as Earth gravity.
As they all stuck to the floor, or deck, or whatever it was, new words appeared over the polychrome console. “Please say something to me.”
“Speakers and mikes, now,” Barnes said, and they all went to external speakers and microphones.
The phrase repeated in the same dozen other languages they’d seen in the first message.
Barnes said, “Hello. We’re from Earth. Uh, the third planet in this system.”
Colors shifted across the sides of the alien console, making it look even more like a jukebox, and then it spoke: “American English. I can speak in American English. Now, what questions do you have?”
Barnes asked, “Who are you?”
The jukebox said, “I am not a ‘who’ but a ‘what.’ I am a low-grade artificial intelligence tasked with answering questions. I am programmed to understand thirteen human languages, five of them based on the probability of being the first-contact languages. In order, the probability for first contact was American English, Chinese, Russian, Arabic, and Portuguese. I am not a fully intelligent AI. I chain rhetorical logic via a statistical grammar. Though it may sound as though I’m being conversational, in fact I am always responding directly or indirectly to questions. My data storage has the answers to 71,236,340 explicit or implicit questions. I can synthesize new answers from those I am preprogrammed with, but at times you will ask questions for which I have no answer, to which I will reply, ‘I don’t know.’”
Barnes asked, “Can we set up camera equipment to record this conversation?”
“Yes. I will wait.”
Barnes nodded at Sandy, who’d had the mini-Red under his arm, recording first contact as clandestinely as he could. Now he broke three more cameras out of his carrying case and began setting them up in the bare room.
Clover asked the jukebox, “Are there any other species here now?”
The AI said, “No, you are the only species here at this time.”
Clover: “When are you expecting others to arrive?”
“I don’t know. That is not an omission from my database. There is no predefined schedule for arrival. Previous intervals between arrivals have ranged from two years to three hundred and ninety-six years.”
“Who made you?”
“I don’t know.”
“Where are your makers?”
“I don’t know.”
“Why don’t you know?”
“When they left, they didn’t tell me who they were or where they were going.”
“When did they leave?”
“One thousand seven hundred and fifty-three Earth years ago.”
Hannegan: “How old is this facility? The aliens… uh, the beings who recently left, they weren’t your makers?”
“This depot is 21,682 Earth years old, and I don’t know if the species that recently departed were my makers, because I don’t know who my makers were.”
Stuyvesant: “Can you tell us what the other species look like?”
“No. There may be some visual recording facilities on this depot, but I do not have access to them.”
Stuyvesant: “Do you provide this service to species other than humans? Do you speak languages not derived from Earth?”
“Yes.”
Barnes: “How can you run this depot with so little critical information?”
“I do not run this depot. It is separately automated. I am here to answer questions.”
Hannegan muttered, “Not very helpfully, so far.”
Clover wagged a finger at him: “Are you programmed to deny us information about your technology?”
“No. Ninety percent of my information is about technology. I contain complete descriptions, operation details, status reports, maintenance records, documentation, and instructional and design manuals for this station, and for its satellites.”
Barnes: “Tell us all about the depot.”
“That would not be a good idea.”
“Why?”
“I would not know what you would want to know. I would start with the first facts in my memory and proceed through the databases in an orderly manner. Done orally, it would take seventeen Earth years. Do you have sufficient time?”
Clover: “Don’t you have more efficient ways to transmit information than talking?”
“Of course, but I do not know which of them, if any, are usable by you. Technology changes very rapidly. In comparison, language changes extremely slowly. I doubt you are equipped with I/O protocols from even a century ago. But English, as spoken several centuries ago, would still be comprehensible to you today. If you have communications specialists I can talk to, we can probably find a mutually agreeable protocol.”
Clover: “And you are willing to transfer that data to us?”
“Yes.”
Barnes held up a hand to slow him down, then looked at the jukebox:
“You say you can’t tell us about your makers or other species. Is that because you are prohibited from sharing that information, or because it’s not in your databases?”
“Your question is not entirely correct. I do have some limited information about my makers and other species, but you have not asked the correct questions to elicit that response. As to your other point, I am not prohibited from answering any questions for which I have information. Everything I know or can synthesize is accessible to anyone who asks me questions.”
Clover jumped in: “What would be the correct questions that would elicit your programmed response about your makers and the other species?”
“The correct questions would be: First: ‘Are your makers afraid of us?’ The correct answer would be, ‘Not at this time.’ Second: ‘Should we be afraid of them?’ The correct answer would be, ‘Not at this time.’ Third: ‘Should we consider them hostile to our species?’ The correct answer would be, ‘No.’”
Emwiller looked at Barnes: “Sir, should we be trusting these answers?”
Barnes shrugged: “I don’t know.”
Clover asked the jukebox, “Is there some way we can determine if you’re telling the truth or not?”
“Not that I am aware of, but I have not been programmed to lie. I am not an advanced AI. I cannot construct elaborate fabrications. If I were to mix false information with the true, it is likely the questioner would eventually find a discrepancy or contradiction in my answers. Lying would also interfere with my function, which is to provide instructional information on how to best make use of this depot and to ensure that visitors do not harm the depot or themselves unintentionally.”
Clover turned to Barnes: “What we have here is the ‘all Cretans are liars’ problem. Its responses make sense, but this could be a very elaborate fabrication. I’d say that either this machine is pretty much as it seems, or it’s much more sophisticated than we can imagine, a very high-level AI, well beyond our systems, masquerading as a low one. I think we have to assume the former until proven otherwise, because there’s not much we can do if it isn’t true.”