“At the Eschaton,” Singer said, “various Earth organizations—groups, sects, and even nation-states—sent out generation ships in a desperate bid to save some scrap of humanity, because the best-case scenario did not seem as if it would leave the homeworld habitable for long. One hundred seventy-three ships are known to have made it at least as far as the edge of the solar system.”
“Like stations, with no primary. Just… sort of drifting along, trying to be totally self-sufficient.”
“Yes,” he said.
It was a terrifying risk, a desperation gamble, and we both paused to appreciate it.
Then I said, “But Earth didn’t die.”
“Earth didn’t die,” Singer said. “But those generation ships did. As far as we know, anyway—their planned paths have been searched, once it became trivial to do so, but very few have been recovered.”
Connla looked up from his game board. One hand was resting carelessly inside the projection, and it made him look like his arm was half-amputated.
“Waste it,” Connla breathed. “They lost all of them?”
“One made it,” Singer said. “Sort of. But the people and the shipmind within it had changed too much to be integrated back into society. They took another way out.”
Connla said, “Suicide?”
“They transubstantiated,” Singer said. “Went into machine mind, totally, and took off in swarms of some Koregoi nanotech to inhabit the cosmos.”
“So, suicide,” I said. “With some plausible deniability built in.”
“Apparently,” Singer said, “the tech they were using allowed continuity of experience across platforms.”
“Continuity of experience,” I said. “But the thoughts themselves necessarily change, from meat-mind to machine.”
“Well, they were derived from one of the religious cults anyway, and very into the evolutionary perfection of humanity toward some angelic ideal.”
“Right,” I said. I’d heard of this. The Jacob’s Ladder. A famous ship from history. Like the Flying Dutchman or the Enola Gay. There was always some attraction, of course, to leaving your meat-mind behind and creating a version of yourself that lived entirely in the machine. But that creation wasn’t you; it was a legacy. A recording. A simulation.
Not because of any bullshit about the soul, but because the mind was the meat, and the meat was the mind. You might get something sort of like yourself, a similar AI person. It might even think it was you. But it wouldn’t be you.
Still, I guessed, it was better than nothing.
I wondered whether those swarms were still around, and what they were doing out there if they were. “You think the parasite is a nanoswarm?”
Singer snorted with mechanical laughter, which I took to mean agreement, or at least not seeing any reason to disagree.
He said, “I think there’s insufficient evidence to speculate.”
“That’s Singer for, ‘That’s as good an explanation as any.’”
He said, “Funny how, after all those ans of trying and failing to create artificial intelligence, the trick that worked was building artificial personalities. It turns out that emotion, perception, and reason aren’t different things—or if they are, we haven’t figured out how to model that yet. Instead, they’re an interconnected web of thought and process. You can’t build an emotionless, rational, decision-making machine, because emotionality and rationality aren’t actually separate—and all those people who spent literally millennians arguing that they were, were relying on their emotions to tell them that emotions weren’t doing them any good.”
He paused for slightly too great a duration, in that way AIs will when they’re unsure of how long it might take a meatform to process what they’re saying.
I sighed. “Come on, Singer,” I urged. “Bring it home. I know you’ve got it in you.”
He issued a flatulent noise without missing a beat. “You were in a hurry to get somewhere?”
“No, just wondering when we were going to find out where we were going.”
“Tough crowd,” he answered. “But I guess in that case you aren’t in need of softening up. Okay, what I was wondering is this: Is your Koregoi not-a-parasite a sentient? And if so, what is it feeling right now? And what does it want?”
I thought about that. With my emotions and with my logic. For… a few minutes, I guess; my face must have been blank with shock as I worked through the implications.
“I wish you hadn’t said that,” I said.
“It might not be an accurate assessment of the situation.”
That was Singer for comforting. For the first time in a decan what I really wanted was a hug. I took what I could get, instead.
“Well, it is what it is. If it tries to send me smoke signals, I’ll worry about it then. Whatever is going to happen is already happening.” I put my head in my forehands. “Right this second, I’m sort of wishing I could order everyone to shut up.”
“It’s okay,” Connla said kindly. “We’re glad you can’t.”
“But we can program people to be responsible adults!” Singer said.
“And you don’t see a problem with that?”
“Programming an intelligence? It would be hypocritical of me, don’t you think?” Singer had a way of speaking when he was making a point that always made me think of slow, wide-eyed, gently sarcastic blinking.
“Not everybody agrees with their own programming,” I said. “Not everybody likes it. Some of us have gone to lengths to change it.”
“Some of you were raised in emotionally abusive cults,” Singer replied brightly.
“…Fair.” I massaged my temples and didn’t say, Some of you were programmed to have a specific personality core by developers of a different species, too.
Connla said, “But where’s the line between rightminding and brainwashing? Or, in the case of an AI, programming for adequate social controls versus creating slave intelligences?”
“It’s not late enough at night and I’m not drunk enough for this conversation,” I said.
“We can print you some intoxicants,” Connla said.
“Night is a null concept under these conditions,” Singer said.
I considered throwing a cat at him. If he had had a locus persona, I might have.
He continued, “It’s true. I was created by my team of parent AIs and human programmers from a menu of adaptations. They wanted me to be curious and outgoing and not take things at face value. To investigate and theorize.” I could almost hear the face he would have been pulling if he had a face to pull faces with. “They also had a remit from the sponsors, of course.”
Given the debt payments we were still making to keep Singer out of hock, I was pretty aware of that. But Singer figured out early that meat-minds require a fair amount of repetition, and he’s scrupulous about providing it. He’s still better than a lot of AIs, really, being more socially aware. Some of them exist on the tell-you-three-times rule, and let me tell you three times, the reminder algorithms they use on us poor meatheads aren’t that varied or subtle.
“They got their money’s worth,” I said, and through the shared ship sensorium, I felt Singer beaming.
“That reminds me,” he said. “Something doesn’t make sense to me about the not-a-parasite.”
“Only one thing?”
“What the hell was the booby trap doing there? And what was it for? It doesn’t make any sense.”
I had something clever on the tip of my tongue, but it never got said. Bantering with Singer and Connla was recreation on these long trips. But I faltered, and considered, and after a little while I said, “I hadn’t thought about that.”