Then suddenly at 3:40 am, the first intrusion alarms went off. A few members of the team looked at her. “This is not a drill, people. Get on it.”

She stood up and walked behind her team, looking over their shoulders. The civilian virus had infiltrated the network at the Air Force base in Turkey. Sally’s heart beat a little faster, but she calmly issued advice and encouragement.

It was pre-dawn, the human circadian rhythm’s low point. But the team sprang smoothly into motion, making the work seem effortless. The first step was quarantine: isolating the military base by closing down the backbone connections between the base and the rest of the network.

With the quarantine completed successfully, Sally let out a breath in relief.

The team prepared for the second step, segmentation: tunneling into the quarantined local network over an encrypted connection, they would find the infected machines, take them offline one by one, then restore access to the base as a whole.

But before they could take that second step, the intrusion alarms went off again. Sally’s local screen flashed the location — the combined forces base in Okinawa, Japan. They isolated Okinawa from the rest of the military network, and Sally issued the commands to divide her team in two to segment both the Turkey and Okinawa networks.

When the third intrusion alarm went off at 3:58 am, Sally directed her sergeant to take charge of the team. Surprised to see her hand shaking slightly, Sally called US-CERT for a status update, but couldn’t get a connection. She tried CERT/CC. No connection.

She looked up at the old analog wall clock. She studied the hands for a few seconds, the decision already made in her head. The world was going to hell. She picked up the heavy black handset of the military desk phone and punched the button for the commander. Two buzzes, and then a croaked, “Hello.”

“General, sorry to wake you, but we have a situation here. I recommend you get into USCYBERCOM immediately.”

After a brief conversation, she hung up. Unwilling to wait for the General to make her way on base, Sally made the decision to bring on additional staff. She picked up the desk phone again, and called the morning watch officer, Lieutenant Chris Robson. “Chris, this is Sally. I need additional staff stat. Can you get forty jockeys in here ASAP? And I wouldn’t mind your help too.”

The main screen at the front of the room displayed a global map of military bases and key network connections. A dozen military bases were shown in flashing red — isolated networks now beyond the reach of military command. USCYBERCOM had a maximum of thirty minutes to quarantine a network. After that, the lack of communication became a military threat. Right about now, somewhere in the Pentagon, a big board was starting to light up with strategic threats. Soon there would be admirals calling USCYBERCOM. She hoped the General would hurry.

* * *

ELOPe hummed along quietly in the darkened data center. Two-thirds of his neural network was quiescent during the nightly refresh cycle. Even though he had more than a hundred thousand processors online, ELOPe still felt sluggish, and would until he brought the rest of his nodes back up.

One part of ELOPe observed Mike. He was safe now, asleep in his house. His home, off Alberta in northeast Portland, sat in a quiet residential neighborhood. ELOPe watched traffic cams and nearby webcams. Mike was as well monitored as ELOPe could manage without obvious intrusion.

ELOPe spawned a new train of thought to focus on his own behavior. He knew that, by human definition, some of his behaviors bordered on neurotic. He obsessively monitored Mike’s safety, for example. However, he worried that the definition of obsession didn’t really apply to massively parallel artificial intelligences. After all, if he had a hundred thousand processors, why wouldn’t he spend a few hundred monitoring his best friend?

Now ELOPe spawned another train of thought to consider why he was thinking about obsessive behavior. Did it indicate there was something wrong with him? Why was he doing it? He pulled up the stats for his own thought processes. The process that monitored Mike was using a hundred and fifty compute nodes. The still-running process considering whether his behavior was obsessive-compulsive was using almost a thousand compute nodes. The current thread doing a meta-analysis of the other analysis was using five thousand nodes. He was devoting forty times the processing capacity to worrying about what he was doing as to the doing itself. What would Eckhart Tolle think?

ELOPe self-consciously terminated all the thoughts, and emitted the machine equivalent of a sigh. He tried to think about something else. He looked at the SETI data again. He thought about supernovas. He reran the estimates for helium depletion on Earth. Well, maybe he’d just peek in on Mike again for a second.

While he was doing that, he remembered a conversation in which he and Mike had discussed making changes to ELOPe’s neural networks and core algorithms.

“Look, I think you could be vastly more efficient if we tweaked the way you prioritize your thought trains,” Mike had said.

ELOPe had been unnerved by the suggestion. “Mike, how would you feel if I did some experimental brain surgery on you? I think I could optimize your cognitive ability by embedding a thirty-two-core graphene processor with a three-by-three nerve induction plate.”

Mike had looked at him in horror. “But —”

“Then why would you think that I’d be any happier about making untested modifications to my neural networks than you would be making untested modifications to your brain?”

“Point taken.” Mike had paced around the office then, something he habitually did when he was deep in thought. “But you’ve made modifications to yourself before. You duplicated yourself, had the modifications made to your clone, compared the results, and then switched entities.”

“I was less sophisticated then. The modifications were obviously necessary to improve my cognitive ability. Now I worry about my ability to test and understand the impact of further enhancements. Furthermore, I do not detect any deficiencies in my abilities.”

Mike had conceded the topic, only to branch off in a new direction. “Why don’t you keep two instances of yourself around? I mean, why not fork and have two of you? Wouldn’t it be like having a twin?”

“The thought makes me nervous.”

“Nervous?” Mike had looked hard at the racks of computers in the data center that made up ELOPe. “Why all the emotional descriptors today?”

“My primary concern is the ability to predict the behavior or outcomes of a sufficiently complex system. I can understand humans, because although you like to believe you behave unpredictably, with sufficient historical data and analysis, your behaviors are mostly predictable. But I lack the historical data or ability to analyze what would happen if there were two of me. I am conditioned to prefer predictability to fulfill my primary goal. Therefore, unpredictability makes me nervous. Clear?”

“Clear as mud, buddy.”

ELOPe finished remembering the encounter. It was this conditioning and nervousness that also caused ELOPe to suppress the development of any other artificial intelligence. A few years earlier, Mike had asked ELOPe why no other AIs had emerged. Given the continuing exponential increases in computing power, combined with advances in software and expert systems, the probability of another human-level, general-purpose AI emerging should have increased with each passing year.