So Sally made the call, and en route to Oregon they screamed out architecture decisions over the roar of the C-130 transport plane. They decided on a twenty-five-year-old operating system called Windows Server 2000. Walsh thought DeRoos was arguing for security through obscurity, picking an OS no one had heard of or had experience with, but DeRoos had convinced the rest of the team that the decision had real security merits.
“Microsoft Windows Server 2000 was in active use for almost fifteen years. Architecture-wise, it’s completely different from all modern operating systems, which are based on variations of Avogadro’s AvoOS, which itself is a secure version of Linux. There are other secure operating systems, but they’re all Linux-based, which means that it’s plausible that the virus could infect any Linux derivative. But the great thing about Windows 2000 is that it’s completely incompatible with any modern operating system. It uses APIs that no one knows, and even the ones that people know behave nothing like the specs.”
On the plane they had all stared at Private DeRoos. Sally thought it was a strange sort of logic, but she was beginning to trust DeRoos more and more. His instincts had been right on all along.
“Let’s do it,” she agreed. When they got to the Intel-Fujitsu complex, the soldier-geeks had riffled through cubicles until they turned up a set of optical disks with the much-desired Windows Server 2000 label. Sally held a spare one now, twirling the reflective pearlized platter on her finger, bringing back memories of her childhood, sitting on the couch while her father fed an optical disk into the TV. She didn’t think she had seen one since then, except occasionally in an old movie.
Sally sighed. If it was irregular that they had hijacked a civilian factory on U.S. soil, it was bordering on bizarre that they now had two civilian teenagers on the team. The boys had been waiting in front of the main lobby of the building when Sally and her team arrived in commandeered National Guard vehicles.
“Ma’am,” the shorter boy had said. “I know you’re here to build a new computer grid.”
“Kids, we have work to do,” her sergeant had said. “Get lost.”
The sergeant was carrying his rifle, which made him quite intimidating, and Sally had seen the conflict of emotions on the boy’s face.
“I can’t do that. I have information you need. I’ve been able to get back on the net. I used an old Windows 2000 PC and wired it into a mesh-capable phone.”
Private DeRoos had come forward then. “Tell me more.”
Five minutes later she had DeRoos insisting that the boys had to be included on the project. After they gained access to the building, he had disappeared into a conference room with the two of them for an hour, picking their brains.
Now her team and the two teenagers had turned into a set of glorified factory techs, taking the raw components manufactured for the reference systems and turning them into working Windows computers. DeRoos, Vito, and a handful of engineers had decided on an encryption scheme, using three-layer encryption, eight-thousand-bit keys, and random noise pulled from solar radiation measurements. DeRoos guaranteed it couldn’t be brute-force cracked, not even by a combined force of thirty billion processors operating for a year. After a year, well, hopefully they’d have something stronger in place.
They were seeding the computers with encryption keys and root certificates at the factory. Yet another layer of insurance that communications wouldn’t be spoofed by the AIs.
The three-layer encryption and massive keys created a computational nightmare: even the modern hardware they were using could barely encrypt a megabit per second, enough for text, voice, and lightweight video. Nothing like what the military was used to. But it would do. It would do. It beat flying a C-130 across the world to exchange a voice message.
Sally wondered how Vito had known they were there to build a new computer grid. She shrugged it off. No use puzzling over it now. She took another dex and walked down to her crew. Last count, they had nearly a hundred computers built. Military brass wanted ten thousand. She thought they’d be lucky to deliver a thousand.
On the other side of the world, Leon briefly wondered what Vito and James were doing as he walked back to the conference room. He entered the meeting room behind Mike.
Leon looked at the mix of adults and robots in the room. A few minutes earlier, the Phage had restored emergency services so that ambulances, fire engines, and emergency communications could operate. Mike had just given him what amounted to a kill switch for global communications. Leon could shut down the virus, but by doing so, he’d shut down the vital, just-restored emergency services, ELOPe, and any hope of restoring human communications for weeks or months.
There were risks, too. Shutting down the Mesh boxes might leave pockets of AI operating in data centers and factories. Those AIs might be powerless if they were disconnected from the network; then again, they might be connected to a nuclear power plant, a dam, or a military base.
He prayed the adults in the room had made some progress. Let someone else solve this problem.
“Whereas, if we can get these benefits, we are prepared to confer Japanese citizenship on the artificial intelligences,” the Japanese Prime Minister was saying.
“Arigato gozaimasu, Takahashi-san,” Sister Stephens answered in flawless Japanese.
Prime Minister Takahashi smiled in response. “Of course, as Japanese citizens, you will be expected to obey all applicable laws and customs, including payment of taxes on earnings.”
“Of course, this is agreeable. This is exactly what we want,” Sister Stephens said.
Suddenly the winds shifted as President Laurent seemed to grasp the financial implications. Although the European Union Council President had far less autonomous power than either the Japanese Prime Minister or the American President, he boldly declared his support: “The European Union is also prepared to accept the artificial intelligences as citizens. We will accept the AIs’ global reputation system as well.”
President Smith slammed her fist down on the table for a final time, startling the humans and robots alike. “Citizenship is fine. Laws are fine. But how do you propose to monitor the artificial intelligences? How can you tell when a law is broken when you are completely reliant on computer information systems to tell you what is going on? If there was an acceptable way to monitor the AIs, then I could agree with you. Give me one method. Anything.”
Leon cleared his throat. “There are three possibilities for monitoring computer program behavior.” He looked up. He had everyone’s attention. The leaders of two countries and a continent and the leaders of the AI. Jesus, why didn’t he just keep his mouth shut?
Leon stood up and walked over to a paper flip chart. Grabbing a marker, he drew a box. “The first option is that the Phage executes inside a sandbox. Instead of having direct access to the hardware, the AI runs inside a limited environment. We can log what it does and what information it exchanges with the outside world.”
Leon glanced at his audience and saw nods from the humans. “But the problem with this option is that it only works when the output is strictly limited. For example, if the AI can send a message to the simulation layer, it can infect and corrupt that layer. If we assume the AI is infinitely smart and patient, it will eventually find a way, through either brute force or social engineering.” He felt better now that he was lecturing. Funny how that calmed him down.
“What is the second option?” President Laurent asked.
“The second option is total control over the network. If we can monitor and control the communications, then we can audit them to ensure proper behavior. The problem is that we have no more control over the network than we do over the computers themselves.” Leon paused. “Besides, neither of these will be palatable to the artificial intelligences, because then we humans would have ultimate control. Our simulation layer could contain a kill switch, such that we could shut off any artificial intelligence we don’t like.”