“That’s it?” Adrian asked, surprised. She smiled. “Yes, that’s it. The nanites are in your system, and the implant will be operational in a few hours. I recommend that you go to your quarters and read the instruction manual until it comes online. You are free for the rest of the day; I’m sure Laura will send you instructions for tomorrow.” Then she stood up, said her goodbyes, and left the room. Adrian went to one of the cabinets and retrieved his clothes. He put them on and walked to his quarters, trying to sense if anything was different. But he couldn’t feel a thing, so he finally gave up and picked up his datapad, accessed his mail, and retrieved the manual the doctor had sent him. He sat in the chair by his desk and started reading.
Operational manual for CX-01 Advanced Integrated Systems Implant
The next few pages concerned the manufacturer, so he scrolled down to the part that interested him.
—When implant construction is finished, it will run a self-diagnostic program intended to verify implant integrity. If a fault is found, the user will be notified; in this case, we urge the user to immediately contact any available medical personnel.
—In case of successful construction and integration, the user interface will notify the user that the implant has been successfully activated, and a series of questions will be asked of the user, measuring his/her responses in order to achieve maximum integration and create the required neural pathways.
Adrian frowned. The manual repeatedly stated that the user would be notified or asked questions, but the manner in which this was to occur wasn’t explained. He tried scrolling back to see if he had missed anything, but when he couldn’t find it, he decided to continue reading.
—After the initial setup is completed, a menu will be available to the user with a smart app software system that will help the user modify his/her implant interface.
—The option to activate the implant’s AI (artificial intelligence) will be available after the interface is set.
—The AI has two modes:
- Active
- Passive
—During active mode, the AI is able to access everything the user sees and hears, and also to monitor the interaction between the implant and the user.
—During passive mode, the AI is only able to manage data inside its own core, including control of the user’s nanites (also referred to as privacy mode).
Adrian continued reading for the next couple of hours; the manual described a multitude of situations and the recommended responses to them. But there was very little in the way of information about the AI, or about how the user was supposed to interact with it. After a while he grew tired, and since nothing was happening he decided to go to bed, so he got up and started toward it. He hadn’t taken more than two steps before letters appeared in his field of vision; he almost lost his footing before he realized that the implant had finally activated. He walked backwards, sat back down, and read the text.
-Diagnostic complete, all clear.
-Are you able to start the integration now? Yes/No (vocalize the answer)
Adrian swallowed and said out loud, “Yes.”
-Sequence initiated.
-What is the color of the sky during the day as most commonly seen from Earth? (Vocalize the answer)
Adrian frowned at the question, but then decided that the implant was probably measuring his responses as a baseline, so he answered, “Blue.” He spent the next twenty minutes answering all kinds of ridiculous questions, until finally that part of the test was finished.
-Are you able to start the audible portion of the sequence? Yes/No (vocalize the answer)
“Yes.”
Suddenly he heard a faint tone in his ears.
-Do you hear a sound? Yes/No (vocalize the answer)
“Yes,” he said, wondering how this part of the test would work. Suddenly the tone stopped, and he heard three more tones: one slightly fainter, one much louder but still tolerable, and then one so loud it hurt his ears. He immediately brought his hands up, but the tone was already gone by the time he covered his ears, though it wouldn’t have helped anyway. He had read in the manual that sound from the implant was transmitted directly to his brain; it was only simulated to seem as if it were coming from outside. His ears reacted to what his brain thought it was hearing.
-Rate the four sounds on a scale of 1 to 10, from weakest to strongest:
-First sound-
-Second sound-
-Third sound-
-Fourth sound-
Adrian rated them accordingly: he gave the first tone a 3, the second a 2, the third a 7, and the last a 10.
-Designate desired sound level for audible systems from 1 to 10.
Adrian said, “Five.” Then the text disappeared, and after a moment he heard a monotone voice speaking in his ears.
“Greetings, user. I am a virtual help program, created in order to aid you with the configuration of your implant. The required pathways have been created, and user commands no longer need to be vocalized. Do you wish to continue?”

Adrian jumped for a moment, looking around before he realized that the voice came from the implant itself. He leaned back in his chair and started conversing with the program through his thoughts, tweaking his interface. He set commands for various actions: first an activation command for his implant, so that if he “thought” at it he would get a response and could access its many submenus, which he chose to have appear as text on the left side of his vision; next he designated the command for clearing his “HUD.”

Adrian quickly grew excited and started playing with the many combinations available to him; it was just like a game where he had the option of managing his own HUD. He could have a graph monitoring his heart rate or blood pressure, a clock showing the time of day, a calendar. He even had the option of accessing the internet via his implant, or rather the Olympus equivalent, Olnet, which every member of Olympus had access to. It could be used to send messages or talk with any other member of Olympus, though now that he was on Mars there would be a delay with those on the Moon or Earth. That didn’t stop him from sending a message to his friend Sahib, who was now on Earth, bragging about his implant. He hadn’t been told that he was required to keep his implant a secret, only that the AI was classified, so he figured there was no harm.

Next he downloaded all the data from his datapad into his implant: pictures, videos, games, which he found were much trickier to play on the implant (until he discovered there were patches that updated the games specifically for implants), books, and multitudes of textbooks.
He found that he could now archive them all in his implant and, using a simple search engine, find whatever information he was looking for. After two hours of playing with the interface, he decided to clear his HUD of everything but a small closed-envelope icon in the corner of his vision that would alert him if he had any messages. When he told the help program that he was finished, it asked if he wanted to activate his AI. Adrian thought about it, and when he couldn’t see any reason why he shouldn’t, he gave his okay. At first he couldn’t sense anything different; then he remembered that the manual said he needed to initiate the AI’s active mode.