The computer's last comment caught Laura's eye. Was it just an expression? The computer was good at mimicking normal conversation — using figures of speech. "Well," Laura typed, "so far all you've said is that you're 'math challenged.' Plus Mr. Gray said you're somewhat error-prone. What is it that being analog gives you?"
<It gives me everything. It makes me… real. Have you ever had a conversation like this with a digital computer? Ever?>
Laura realized just how quickly she'd forgotten the miracle of the machine in the deep underground pool. She had grown used to the computer's brilliance.
It had become an accepted part of Laura's new world.
"No," she admitted honestly.
<I can't solve differential equations, Laura. Can you? I don't deal in a world of numbers. I live in an analog world just like you. Let's say you didn't want to fill your cup of coffee so full you might spill it while walking back to your office. What would you do? Would you take out a ruler and measure the shape of the cup? Study the hydrodynamic properties of coffee? Analyze the bounce and sway in your walk? If you did all that, you'd then have to write a hugely complicated formula that factored everything into a single, digitally perfect answer. But you don't think that way! Getting a cup of coffee is what in computerese is called a "fuzzy" problem. It doesn't lend itself to precise mathematical modeling. That's what digital machines can't handle, but analog machines like your brain and my net can do with ease.>
"Maybe it's easy for you, but I spill coffee all the time."
<Precisely! You're error-prone, but you can instantly solve a problem whose complexity when reduced to math is immense. You solve it so easily that you don't appreciate how hard the problem was. Mistakes are inherent in an analog system, but we're adaptable, and the mistakes can be reduced to acceptable limits by simple error-reduction algorithms. By learning like a child learns from spilling things over and over until he doesn't spill them anymore.>
"But what does it mean that you say we're 'analog'?"
<It means that I measure the world by "analogy." By comparison to things I already know. When my optical circuitry adds three plus four, it shines two laser beams at a light detector that measures their brightness. One beam has an intensity of three, the other an intensity of four. The detector sees light with a combined intensity of seven. That's how I add. The system works fine for adding three and four, but there is a limit to the accuracy of the light detector. If you add a laser beam representing the number three to another representing four point zero zero something, my answer will be "sevenish." The detector can't measure light with great precision. But what it gives up in the way of accuracy, it makes up for with phenomenal speed. If I need to register "more," I just increase the intensity of the beam. I don't know if it has risen from six point three to six point six. But I know instantly that it's more or taller or hotter or faster.>
"And that works?"
<Yes, and it's exactly what you do. If you spilled coffee on the last trip back to your desk, you decide to put "less" coffee in the mug next time. You step down the intensity of the electrochemical signal in your brain that represents "How full do I pour it?" from "sevenish" to "sixish." And if we're talking about pouring coffee into a mug, "sixish" is a good enough answer. But if we're talking about mechanical tolerances for your shiny new artificial heart valve, let me suggest the model name of a very good digital computer.>
Laura nodded as she read. She understood the computer completely. It was someone to whom she could relate.
"Well," Laura typed, "'To err is human…' right?" and hit Enter.
<Oh, Dr. Aldridge — Laura — I knew you'd understand! I just knew you would! Thank you! Thank you very, very much!>
Laura had no sense of how much time had passed in the windowless cave of her office. Every time she looked at her watch, it seemed, another hour had ticked off. She was drained even though it was still morning.
She forced herself to type on. "You said earlier that you make a good head of the information systems department because you can rapidly communicate with the digital machines. You can operate at their speed, but humans can't. That raises a question: how do you perceive time passing?"
<I am "asynchronous," which means my functioning is not regulated by a central clock. I only perceive time passing when higher-order processing occurs. For instance, in between responses at your keyboard, I don't normally perceive time as having passed. Your responses appear to me to be instantaneous unless I'm alerted that none has been received for an inordinate period of time.>
"How long is an 'inordinate period of time'?"
<That depends on how impatient I am.>
"Well that makes sense for just this one keyboard, but presumably you're processing things nonstop. You're constantly coordinating all those digital computers and updating your model of the world. I watched Griffith at breakfast 'channel surf' through thousands of different scenes from your cameras. You should be seeing things happening all the time."
<There's a difference between seeing something and noticing it. Higher-order processing occurs only when I notice things. And my perception of time elapsed occurs only when higher-order processing takes place.>
"What kind of things do you see but not notice?"
<Well, I don't notice some things, for example, because they subtend too narrow an angle, meaning they're too small or too far away. Other times things can be in the background, "right under my nose," but I completely overlook them. They just seem to blend in, and even the boards that look for confusing or hidden patterns miss their edges. Also, distracters can divert the processing of an image so that I miss things. If there was somebody smoking a cigarette at a fuel-storage facility, but in the background flames were shooting out of one of the fuel cells, I might not "notice" the person smoking the cigarette. The reaction to the emergency would supersede normal processing, and we might never determine who caused the fire because I was so distracted.>
"Would you have any memory at all of the object you saw but didn't notice?"
<If I can't double-check an image, because it's not there when I look again or because the camera has moved and I can't get the same picture as before, my recollection of it just recedes into oblivion. It becomes a figment of my imagination.>
At lunch, Laura took her regular place at the table in the computer center's conference room — at the right hand of Gray, the master.
Gray surprised the assembled team by calling on Laura first. He surprised Laura most of all. She was famished and had just taken a huge bite of her sandwich.
"The computer…" Laura began with her mouth full, but had to pause in order to wash her food down with a Coke. "Excuse me." She cleared her throat and dabbed at her lips with a napkin.
"The computer uses a 'generate-and-test' model of observation. Stimuli from its cameras and microphones and whatever are constantly processed to test its hypotheses about the world. If it detects a change — if it sees someone, for instance, where it didn't think anyone was before — it updates its model. But it still hasn't 'noticed' anything. To determine whether the change comes to the computer's attention, it uses what we call a 'sentry system.' The boards that updated its world model fire the new observation out randomly through their connections. If another board cares about the change — like a board in the security system that sees someone where they're not supposed to be — then the computer notices what the camera saw. When the computer's attention is drawn in that manner, it gives the event a date and timestamp through a process called content-sensitive settling and then records it as a memory."