
“OK, chill out, guys. If the two of you are fighting, you are not going to be thinking straight,” Christine said, also yelling to get their attention. “Look, what happened when you got back, Mike? Did you look into ELOPe?” She spoke calmly, placatingly.

“I tried, but I couldn’t get access to either the source code or the system logs,” Mike said, resigned. “I assumed David had locked down access so that no one would find what he had done. And I didn’t want to say anything to any of the rest of the team, because I didn’t want to raise suspicions. I was still trying to cover for you, David.” Mike glared at him.

“I’m sorry, Mike,” David said gently. “I shouldn’t have said that about you and your dad. I’m really sorry you flew halfway across the country, but I’m glad your father was fine.”

Mike hesitated a minute, then nodded slightly, accepting the apology. “Well, did you do anything to lock down the system?” he asked.

“No. In retrospect, it sounds like a great idea to have locked down access to ELOPe, but I didn’t do it.”

“Shit, then somehow ELOPe has removed my access to the servers and the code.”

“Ugh, guys, that seems impossible,” Christine said. “Even if you are right, Mike, and ELOPe is somehow originating emails on its own, it seems preposterous to think that ELOPe could social engineer you into leaving town. And how is ELOPe going to get your access rights removed? If all it can do is send emails, you can’t send an email to revoke someone’s access. I assume you guys have some kind of internal web application that handles access control. I think you’ve somehow become paranoid about David’s deceit being discovered, and now your imagination is running away with you.”

“No, Christine,” Mike said. “I’ve thought about this for days now, and it is possible. Let’s say ELOPe didn’t want to be turned off. It knows that I can turn it off. Now it has to figure out how to ensure I won’t do it. If it analyzes enough emails, it could figure out that people don’t do work when they leave town. If it can figure that out, then it can also determine that people leave town for family medical emergencies. If it had enough emails about medical emergencies, it could figure out that messages about family emergencies usually come from family members. My own email history would show who my parents are, and their email addresses, and that I’ve flown to visit them before. If it put all those things together in one long chain of deductions, it could figure out to fabricate an email from my mother saying that my father is sick. I know it sounds farfetched, but this is all within the design parameters of ELOPe.”

“Are you saying that this thing is reasoning and thinking like a human being?” Christine asked, shaking her head. “Because no matter how smart you guys are, I’m having a hard time believing that some code you wrote is suddenly developing a mind of its own.”

“It’s not thinking,” David said. “ELOPe is just analyzing emails, figuring out what language will optimize the success of the primary goal I entered, which was to maximize the success of the ELOPe project. It’s a straightforward process: goal, analysis, language optimization, in response to inputs. It can chain goals together. It is not independent thought, but it can have the appearance of independent thought.”

Mike raised his hand up. “Look, here’s an analogy I thought of while I was waiting for your plane. Imagine that you’ve got all the pieces of all the jigsaw puzzles in the world. Now imagine you have a computer that is patient enough to try every possible combination of every possible puzzle. Given enough time, it could make any arbitrary picture it wanted out of those pieces. And that’s what emails are to ELOPe—puzzle pieces. It looks at the millions of emails in its library of emails, figures out all the components of them, and then figures out new ways to piece them together.”

“ELOPe, the computer system that ran away with itself.” David laughed nervously. “Well, we got the name right.”

“So is it an artificial intelligence? Is it thinking for itself, or isn’t it?” Christine spoke softly, half to herself.

“I don’t know, hon,” David said. “I don’t see how it could be capable of free-form thinking, which is what most people would think of as an AI. But it is pretty sophisticated when it comes to goal analysis and synthesis. We couldn’t hardwire goals into ELOPe and have it meet the design objectives. We had to let it discover people’s goals. So we gave it the ability to contextually determine goals by parsing emails.”

“Then it tries to make sense of those goals in terms of other goals it understands,” Mike added. “We implemented two approaches to learning about new goals. First, it can see whether goals might be similar based on language analysis. For example, a ‘break from work’ is semantically similar to a ‘vacation’, and a simple dictionary lookup can figure that out. Second, it can guess whether one goal might be an extension of another. If it thinks one goal might be an extension of another, it will predict what people’s responses will be, and then test to see whether the predictions match actual historical responses. For example, if I simply said I wanted to have fun today, then ELOPe might be able to extrapolate that activities such as playing a game, miniature golf, or going to see a band are fun.”

David nodded. “So when I added code to create an overriding goal to maximize the success of the ELOPe program, it’s hard to know what it might consider. The more emails it analyzes, the broader the definition of ‘success’ it might have. Up until the last couple of weeks, it had never had such a large base of emails to analyze, nor such a large number of servers to do the analysis on.”

“From what you’re saying, the more emails it analyzes, not only do the possibilities for what constitutes success get broader, but the system also discovers more methods to accomplish those goals,” Christine said. “What it really sounds like you’ve built is an expert system for social engineering. You know what I mean by social engineering?”

Mike nodded his head yes, but David had a puzzled look on his face, and shook his head.

“Social engineering is the name given to techniques for tricking people into giving you information or making changes to information systems,” Christine said. “Social engineering was popularized by hackers in the nineteen eighties. And by hackers, I don’t mean the good guy hackers like Richard Stallman. I’m thinking of folks like the Kevins.”

Mike nodded his head again, but David looked even more puzzled, and turned around to look at his wife.

“Honey, how can you be married to me, and not know this stuff? You know I was a total online geek as a kid, yes?”

“What can I say?” David sighed. “Please go ahead.”

“Okay, look. The eighties and nineties were the heyday of hacking. Folks like Kevin Mitnick and Kevin Poulsen were able to get access to all kinds of computer systems, phone company records, credit card company records. I think it was Kevin Poulsen who said that it was easier to trick someone into giving you a password than to brute-force it. The classic example would be someone who was trying to get access to a company’s internal phone system. She might call the front desk of the company, and tell them, ‘Hi, I’m your AT&T rep. I’m stuck on a pole down the street troubleshooting your system. I need you to punch a few buttons on your end.’”

“And?” David asked.

“And the buttons the hacker would ask the operator to press might be a key sequence that would forward all incoming calls to an outside line. Then the hacker could impersonate an employee of the company from their home phone, so they could do even more social engineering. The point is, simply by knowing the lingo, giving plausible reasons, knowing what motivates people, a hacker can gain information or get people to do things by cleverly manipulating the human tendency to trust other humans. Since you’ve built a system that learns lingo, language nuances, and motivations, and can evaluate what will be most effective to the receiver, it is, by definition, an expert system for social engineering.”