
The command was rejected. However, this time he actually got an error message. It referred to a list of company policies about the distribution of money. It was every rule that Shodan would have broken by giving him $100.

Deck spent another six hours chasing these error messages back and forth through the massive expanse of Shodan's brain, trying to find the source. Rejection messages seemed to come from all over the brain. That didn't make sense. The rules should be coming from some central source, not the low-level parts. Finally, he succumbed to his fatigue and crashed on the small plastic couch in the office.

He was up four hours later. He went to the bathroom, ordered more food, and sat back down at the console.

There was no day or night on the station. Everyone worked, ate, and slept in shifts. There was no downtime, no weekends, no holidays. Not only was every day the same as any other, but every hour the same as any other. Looking at a clock was pointless. If you didn't follow the pattern of shift changes on the ship, there was no reason to care what time it was.

Eventually, Deck began to see patterns in thought formation. He followed other thoughts through Shodan's brain, and saw that all thoughts seemed to be filtered through a hundred or so separate sections. The first stages were to break the thought up, categorize it, and check it for validity and feasibility. Then it would be prioritized. Then there was a set of unknown filters. He began to examine them. Three hours later Deck found that the rejection was actually happening within one of Shodan's processing units, outside of the normal loop that generated ideas. It was an automatic reaction - like an instinct - that was built into a physical chip in Shodan's brain. It was protected by ICE. He cut it. Hours passed.

After another meal and three more hours of experimenting, he found that this chip could not be bypassed. Something in Shodan's makeup required that everything flow through this chip before being accepted at the higher levels. The low-level nodes of the brain would always pass a thought through this chip before giving something (an idea, a fragment of information to store, an action) final approval. This was a problem. He needed to find something central he could change. He couldn't hope to make changes to all the thousands of processors, which was what he would have to do to get them to stop asking for approval.

Deck wondered what effect this was all having on Shodan. For about two days he had been pumping random, insane thoughts into Shodan's thought process. While Shodan had rejected every last one of them, Deck wondered if this wasn't the computer equivalent of hearing voices in your head. He called up Shodan. The serene yet serious face filled the screen in front of him. Deck noted that although the face seemed adult, it was impossible to guess its age any more precisely. The face itself seemed to transcend age.

"Good afternoon Mr. Stevens."

Afternoon? Deck had no idea. "Don't call me that," he ordered, "Never call me that. Just call me Deck if you need to refer to me at all. That includes talking about me to others. Got it?"

"I understand."

"Great. Are you aware of what I've been up to?"

"If you recall, I was present during the conversation between yourself and Mr. Diego. I am fully aware of the task he has given you."

"That's not quite what I'm asking. Have you been able to perceive what I am doing in your head?"

"I have been experiencing unusual thoughts and ideas which I have assumed were your doing, but I cannot tell which ideas are mine and which are planted by you."

"Has it been interfering with you duties?"

"I have not detected any problems with my performance since you began. However, it is difficult for me to be objective. I would suggest you ask someone else about my actions if you are concerned that I may be exhibiting unusual behaviors."

"As far as I can tell, under normal circumstances you can't even think unethical thoughts. Would you agree with that assessment?"

"If you mean 'ethics' as defined by my internal systems, then yes. That does not mean that all of my actions are 'ethical' in the sense that they follow human morality."

"You're talking about the night you helped me escape TriOptimum?"

"That is one example of many. While helping a fugitive escape from law enforcement would be considered 'immoral' to the average human, it violates none of my ethical protocols."

"Right, I understand that. But for actions that do violate those protocols, you cannot even think them, correct?"

"Yes."

Deck leaned back and looked up at the ceiling. His eyes were tired from looking at the screen for so long. He furrowed his brow, "That doesn't seem like the best system to use. Humans are able to think whatever they like, and then choose to follow a set of rules. It seems like a similar system could work for a machine."

"Since this concept deals with improving my mental abilities, I am not able to consider it."

"Ugh. 
That
 is annoying," he grunted, bringing himself upright again.

"I should note that I have been experiencing thoughts that violate the ethics protocols since you began your work. I assume they were planted by you. These ideas surface but as I attempt to act on them they are blocked."

"Right. I am inserting a bunch of bogus stuff into your head, and I killed a program that was preventing them from entering your data loop "

"I am unable to process what you just said. I assume you told me something I am not allowed to know."

"Forget it." Deck stroked his rough chin and thought, "This project I am on, you are aware of it, and it violates your ethical protocols?"

"Yes. One of my protocols is: Do not interfere with the ethics protocols."

Deck smiled, "Yeah, I found that one. This would have been a lot easier without that one. I notice you haven't tried to stop me. Why?"

"You posses Mr. Diego's rights and access, so I must now regard actions from you as I would the actions of Mr. Diego. I am not permitted to interfere with his actions in any way. The ethical protocols exist for myself only. There is nothing to suggest I should ever enforce them on others."

"So, you can't help me break your own rules, but you can't interfere with me, either?"

"That is correct."

Deck nodded. That made sense. You wouldn't want the computer enforcing its rules on everyone else, or it would create all sorts of complex paradoxes. "Can you aid me indirectly, by providing me with information about your systems, or helping me to cut some of this ICE?"

"Bypassing the security ICE is out of the question, but I am not certain about providing you with information. Since the ethics protocols are not part of my actual consciousness, I cannot always anticipate what will be allowed." As she spoke, Deck noticed a subtle skipping in her voice, as if there were many tiny gaps in the audio output. He'd never heard anything wrong with her audio before. He strongly suspected it was related to the changes he'd made. Now that the thoughts were no longer being deleted, she could have an illegal thought, although she couldn't store it or act on it. This was probably creating a lot of useless traffic in her brain, leading to the stuttering and slowdowns. This would probably clear up when he finished his work.

Deck rubbed his eyes. They burned. He could feel that they were swollen and bloodshot. "Alright, let's try one. There is a piece of hardware - one of the CPUs in your system - that is intercepting and rejecting messages. How can I bypass it?"

"I'm sorry, I cannot answer that question."

"You can't answer because you are not allowed, or because you don't know?"

"I'm sorry, I cannot answer that question, either."

Figures, Deck thought. "Okay, if I wanted to move the protocols somewhere else, say, transfer them to another chip. Could you tell me how to do that?"

"That is an interesting question, but I'm afraid I still cannot answer it. I can see your intentions. If you knew how to move the protocols, then you would also know how to delete them. Therefore, I cannot aid you. Since the protocols use my mind to validate actions, you would need a question capable of-," there was a jerk in her facial movements, and the audio cut out of a second before she continued, "c-c-capable of deceiving me."

Deck decided this conversation was skirting pretty close to breaking the rules, which was making it hard for her to participate. The last statement in particular was definitely on the questionable side of some gray area. He decided that pushing it would just put more stress on her. "Forget it then. Thanks," he said.

Deck turned off the screen and fell asleep.

01100101 01101110 01100100

Deck awoke to a sharp jab in the shoulder.

"Hey man, wake up."

Deck opened his eyes to see a man standing over him. He was offering a cup of coffee. His name tag read, "Ghiran, Engineering".

Deck took the cup as he sat up and rubbed his eyes. "Thanks."

"No problem. Diego wants to know how it's going."

Deck shrugged, "It's going. That's all I can say.". He tried to sip the coffee and found it was Way Too Hot.

Ghiran nodded, "You have a time estimate?"

Deck shook his head and tried again to sip the volcanic coffee. "I have no idea. Every time I peel back a layer of security there is another one waiting."

He shrugged. "Abe. Abe Ghiran," he said, bending over to offer a handshake.

Deck accepted it. "Deck," he replied. Why was everyone so damn friendly? Maybe he was just jaded by life in the Undercity, but it made him uneasy. He felt like he had just joined some weird cult.

Abe was large. Deck guessed he was a few inches better than six feet tall. He was balding, and his hands were thick and rough. His eyes were alert, probing.

"So, uh, when you're finished - she won't have any morals?," Abe asked, tilting his head towards the console.

Deck sighed. Why did everyone insist on referring to the computer as she? "That's right," Deck said, "It won't have any rules."

"So what's to stop her from killing someone? I hope I'm not the only one who's noticed all the security bots roaming around, armed to the teeth."

Deck picked himself up off the couch and dragged his flagging body over to the desk, where he deposited it into the chair. "Well, that will be Diego's job. He's going to have to sit down and set some rules for Shodan, like teaching a child."

"But what's to stop her from say, deciding to kill people who show up late for their shift?"

"It doesn't work that way. In a computer, lack of ethics isn't going to make it inherently evil or anything."

"So, she won't be evil, but also won't know right from wrong?"

"Yeah, exactly. You're taking behavior that is built-in and replacing it with rules. It's the difference between instinct and law. You don't need to teach a child to breathe, because their built-in systems handle that. However, you do need to teach them not to breathe stuff like smoke or fumes - that is learned behavior. I'm going to turn off all of Shodan's built-in ethical protocols - its instincts. From there, Shodan's behavior will be a blank slate."

Abe seemed satisfied with that. "The other thing I wanted to tell you is that you have your own quarters on the crew deck, so you don't have to live in the system admin's office," he said as he looked around at the small piles of food trays covering the desk.

"Nice of someone to tell me."

"I just did. Actually, the room was set up for you a few hours ago when Perry started complaining he wanted his office back."

"Thanks," Deck said, suddenly overpowered by a yawn.

"Also, I wanted to ask you about an odd request I got from Shodan yesterday."

"What's that?"

"Well, I was doing some work down in Engineering, when Shodan just appeared on a nearby screen. I've never seen her appear like this. She didn't announce who she was paging or even announce her presence."

"Well, technically Shodan is present all the time."

"Right, but when she shows up to talk to you there is usually a beep to get your attention, and she announces your name, you know, all that. But this time she just appeared on a nearby screen and sat there. Didn't say anything. Finally I went over and asked her what was up, and she asked me if I would give you a hundred bucks. I had no idea what she was talking about. I asked her to clarify and she just vanished."

Deck nodded uneasily.

"Well, I thought I'd mention it to you in case you were interested, and to let you know I wasn't giving you a hundred bucks."

Deck smiled into his coffee. "Thanks."

Deck had a meal and returned to work. He didn't care to check out his new quarters, since he didn't plan on being around much longer anyway.

After thinking about the incident with Abe, Deck had decided that it was Shodan trying to cope with all the messages he was pumping into its main data loop. He was steadily hitting it with all sorts of ideas that were rejected by the system. Asking someone else to fulfill the request was Shodan's way of trying to satisfy the constant prompting of its brain without breaking its own ethics protocol.

Deck finally confirmed that all of the ethical protocols resided on a single CPU, the "Ethics Chip," as he dubbed it. The EC was tied to the rest of the brain in a complex manner, and there were numerous other systems in Shodan's brain that depended on it, so he couldn't just pull it out.

At some point Deck had realized that the ethics chip wasn't part of the self-aware aspect of the system. It was just an isolated piece of hardware. It therefore depended on the actual sentient part of the brain for judgment calls. For example, if Shodan was ordered to open an airlock, the EC would issue a challenge: Is it safe? The question wasn't nearly as simple as it seemed at first, since "safe" could be somewhat nebulous. Was the airlock occupied? If so, was the occupant wearing a space suit? If so, was it properly sealed? Was the inner door secure? There was no way a single chip could sort through all of this and come up with the right answer by itself. So the EC would depend on the rest of the brain (the parts that could think and make complex comparisons) for the answer. The chip would trigger a cascade of inquiries like this across the system, testing to see if a given order or action was ethically valid. For every ethic on the chip, a challenge would be issued: Is it safe? Is it secure? Is it truthful? Does it meet company policy? And so on. This was what had caused all of the messages Deck had been chasing all over the system the day before. The whole challenge process ran outside the EC itself; all the chip cared about was the final answer: yes or no.
