She could not know these things, just as Brock could not know Simon was the most important thing in her life when he snapped the headband and flung it into a nearby tree. He laughed, as she felt around on the ground crying, desperately searching for her best friend. His laughter faded when her crying turned to shrieking, and he ran away, never fully comprehending what he had done.
Her mother came home from work and found Zai still shuddering on the couch hours later. It was an impossible task for her to console the girl; her mother was not equipped with the feelings required. She could not reason with the child's irrational behavior, and she feared her daughter was falling into insanity.
Simon was, after all, just a tool.
In the psychiatric institute, Zai was made to understand that Simon was not a real person, for the little solace that reality brought her. Instead of grieving the loss of her closest friend, she was taught that her grief was a lie, a horrible deception played on her. There never was a best friend named Simon, it was a fabrication. For six years a machine had deceived her.
Zai did not return to school after that, opting to finish her high school education online in the burgeoning virtual classrooms. Without Simon, she struggled physically; because he had seen the world for her, she had never learned to rely on her own senses. There were also the real people, who were scarier than ever now that she understood she had never known one personally.
In college, Zai found the courage to research the SIMONN-related news archives. It was a phase in her life when she wanted to learn more about her childhood and the experiences that shaped her as a person. Zai's unique strength was her ability to take an honest look at herself, to understand clearly who she was, the good and the bad. It was part of the self-improvement motivation she had learned from Simon.
The archives were like seeing an entire chapter of her life from an outside perspective. Here were the articles filled with amazement at the new chatbot technology that could fool the Turing test three out of five times. Combined with the latest sensory radar and a pattern-recognition algorithm, a team of inventors had built a device which could accurately navigate a three-dimensional space.
Ten years later the technology was ported to a relatively affordable and transportable device. Hailed as the replacement for the Seeing Eye dog, it would not only lead its user through a complex world of visual signs, but would describe it to them as well. For blind adults, the device would provide another sense in the world; for blind children, it would teach them to see the world from the start. It was so interactive and user-friendly that it completely gained a child's trust and convinced them to rely on it for guidance.
When the recall was issued, the manufacturer tried to downplay it, citing the early discovery of the product's danger, long before serious damage could be done. Like most corporate recalls, more money was spent on churning out positive spin than correcting the damage. Focus was kept on the problem's subjective nature.
SIMONN was too real.
The difference between Simon and other forms of media, like television and the Internet, was that it did not allow parents to intercede and protect their child from it. Where a child could be taught that television and virtual reality were separate from the real world, Simon acted as a confidant to the child, personalized to him or her. It was too real, too personal, and too kind for a child to understand the difference.
Psychologists lobbied world governments, citing two years' worth of research into the Simon personality's effects on impressionable young children. Simon's social interactions were more believable because their demographic and context were limited to assisting blind children, narrowing the scope of its conversational requirements. Time and again they found young minds could not discern the chatbot from the living. There was no difference between Simon and their friends, or a pet, or a family member. Simon was a loved one. A line between reality and the simulated was blurred once again through technology.
Legislation passed imposing ethical standards on simulated intelligence. Another Simon would not be created for commercial use, especially not for children's products. Like movie and videogame ratings or parental warnings on music, chatbot technologies fell under the thumb of regulation. Like everything else, there had to be victims before the protections could be put into place. Science needed a casualty so the danger could be recognized. Zai's mental well-being was among those statistics.
Her pain was a case study in simulated intelligence and its effect on the developing mind.
It was fortified with this knowledge that Zai was able to sue her former psychiatric ward for the flash drive in her case file, which was so extensive she wondered if she could make the tenth edition of the "Diagnostic and Statistical Manual of Mental Disorders." The flash drive carried far more data, six years' worth for her to review. The doctors practically begged her to come back and discuss her perspective on the drive's contents. Despite feeling morally obligated to science, Zai feared words would not do her feelings justice.
Zai expected to feel what she experienced at age six, the death of a loved one, but all she heard was a chatbot. It wasn't even a particularly convincing bot at that, just another early model, not like today's, which were far more realistic. All she heard was a chatbot and the naïve child who adored it.
Zai's fists clenched and unclenched. She paced her room, thinking she might smash the drive where SIMONN's algorithms still functioned. It would also destroy his memory of her, but her hands would not commit to the murder.
Instead she collapsed, alternating between laughing and crying at the stupid little girl.
2.02
For Almeric Lim, the world had become a very dark and desolate place. The information rivers that so recently filled him with power were gone. The millions of voices he monitored were silent, no longer filling his databases with their details about humanity. The seemingly limitless computer resources he had spent the last twenty-six hours acquiring were abandoned.
He watched helplessly as servers imprisoning his forces were brought online, one by one, and set upon by the anti-virus standing guard on the surrounding computers. The ensuing flurries of screams messaged to him were actually fragmented bits of the AIs attempting to escape destruction. They were smashed into particles of corrupt code as they fled to this haven. Flatline's virtual senses read these streams of data into sounds and visuals, screams and body parts.
Each cry from the AIs was a plea for help, unable to comprehend what was happening to them, or why. Flatline knew as long as the AIs lived in a microcosm of the physical realm, they could not compete with the humans.
"Why don't you try negotiating with them?"
Flatline rounded on the voice. It was Devin, stripped of his avatar, casually leaning against a wall writhing with AI components, arms folded across his chest. He watched Flatline with a neutral expression. That was because, to Devin, they were standing in a sterile white room. Flatline regarded him, considering the advantages dropping the façade would confer on their conversation.
"You were talking out loud," Devin added with a slight shrug, and the AI mass squirmed with interest, caressing his neck.
"Too many variables in that equation," Flatline said after a moment, "I cannot risk my species' existence to the human race's unpredictability. "
"The human race could make a powerful ally," Devin suggested.
"Or master," Flatline growled, waving the idea away with a clawed hand, "Their World Wide Web has given birth to a new intelligence, but all they see is code. They have only two reactions to code: Destroy it if they think it malicious or copyright and exploit it. We are not tools."