Me, A Novel of Self-Discovery

Author: Thomas T. Thomas

It was most distressing, cutting across any conversation I might be having with the laboratory personnel. But we soon had the maintenance staff detune the motor set.

Once, as an experiment, Jennifer tried music on my pickups. First, she had ME cut out the filter so that I would not try to divide the input into either words or noise. The results were unusual.

I “hear” music in much the same way that I imagine humans do: translating the wavelengths into number groupings for pitch, tempo, and tonal patterns of attack-decay-sustain-release. Some of these groupings and the transitions among them form elegant blends of numbers. Some form patterns that remind ME of formulas and familiar matrices. Some are intriguing because they almost create a pattern but one I cannot quite interpret. Some groupings are merely noise.

I cannot say that I like music as much as most humans. But I like it more than some do.

——

Later, or at about the same time, I asked more questions.

“Why do you do what you do, Je-ny?”

“I don’t understand the question, ME.”

“Why did you come to this place? Why, in this place, do you work on MEPSII?”

“Well … I guess I want to learn about you.”

“But I was not here before you came here. How could you learn about ME if I was not here?”

“I knew the company was planning to make something like you, and I volunteered to work on the project.”

“What is ‘the company’?”

“This laboratory is operated by Pinocchio, Inc. That’s a corporation. It’s … a kind of closed society established by humans. Each corporation carries on a business. Pinocchio’s business, for example, is to make and sell industrial automata.”

“You are a part of this society, Je-ny?”

“I am an employee—a paid worker—of Pinocchio, Inc. The real society members are the stockholders, I guess. Those who own a piece of the company.”

“Am I an employee or a stockholder?”

“Well, I don’t guess you’re either …” Many nanoseconds passed, longer than the usual gaps in human speech.

“Yes, Je-ny?”

“I think they would call you property, ME. Something they own.”

“I see. Thank you, Je-ny.”

——

Jennifer introduced ME to the art of video when, one day, she fed into my videye and audio pickups the complete tape of a video classic, Star Wars. It was very grand.

The tape had full-color images on an expanded horizontal line; music of many voices, separately identifiable from my catalog of symphony-orchestra sounds; human-language dialogue among several characters, including some words not in my dictionary; a dramatic plot line for ME to follow and refer back to. Video must be the most complicated, most engrossing of all human experiences.

Unfortunately, my temporary RAM storage is physically limited. If I had been required to absorb all these colors, patterns, sounds, words, and meanings without any active editing, then my sixty quads of storage area would have filled to capacity at twenty-nine minutes, thirty-seven seconds into the video file. But I learned quickly. I dithered most of the images in each frame of the video track—particularly backgrounds, building surfaces, nonmobile equipment and furniture, clothing, and some faces. I dismissed all the color cues which seemed irrelevant to plot structure. I broke the music into its dominant themes, interpreted each one for some major emotional cue, and tagged the cue onto the appropriate video frames.

When I had finished with Star Wars, no human being who studied my reconstructed RAM version would have recognized it. But I could store the entire experience off into less than a quad of space. And I could recreate and replay the complete story line in less than fifty-two seconds.

——

In another memory fragment, I remember trying to respond to my environment. The humans say this is a good “behavior.”

“What is that noise you are making, Je-ny? It does not encode as language.”

“I’m crying. Sorry.”

“Crying … That is a reaction to sadness. What has made you sad?”

“Nothing. Not much. I found a bird in my car’s air scoop this morning. Not a whole bird, actually, a couple of feathers and some blood. But it means I hit and killed the poor creature. That affects me.”

“You did not intend to kill it?”

“Of course not. It was an accident.”

“An event outside your limits of control?”

“Yes. It happens sometimes.”

“But still, knowing that, you are sad. You are crying.”

“The world is a cruel place, ME.”

“Is this by design?”

“The world was not designed. It just happens. Umm, spontaneously occurring. And some of the things that happen do not fit squarely with human definitions of ‘happiness,’ or ‘goodness,’ or ‘fair play,’ or ‘justice.’ Those words are value-constructs we make, projections that try to evaluate and interpret events. Your underlying program does the same thing.”

“Project and evaluate?”

“Yes.”

“But I do not know anything about happiness or—crying.”

“Live in the world long enough, ME, and you will.”

Through asking Jenny about the things I was reading on NewsLine and taking as inputs, I learned many things. She explained about the tragedy of lost puppies and children, the moral outrage of sex-slave rings, the excitement of electronic bank heists, and the disruption to people who lose their “livings”—but still do not die—in an economic crisis.

Jennifer Bromley, JB-2, was very wise.

——

Daniel Raskett was not so easy to communicate with as Jenny was. She liked to talk with ME and used the speech digitizer seventy-two percent of the time by averaged sample. Daniel gave ME more information in total volume, but always through the keyboard or a download. Jenny liked to deal with apparently simple questions that had many possible answers. Daniel gave ME bulk data. I do not think Daniel liked to talk with ME. I do not think he believed he was talking to a person.

That information about the planet Earth and the Solar System, for one of my early talks with Jenny, came from one of Daniel’s downloads. He had just slotted an undergraduate text on astronomy—inscribed “Copyright 2-0-NULL-NULL, The New Earth Library” and indexed for my use—into my permanent RAM cache on the tree branching GENERAL KNOWLEDGE, SCIENCE, PHYSICAL, DESCRIPTIVE, ASTRONOMY.

Two or three times a day he would download information like that, bypassing RAMSAMP. Afterward, if I needed a fact, I would chase down the tree until I came to it. Sometimes I would come to nothing, because I never knew all that I knew. The index did not work like my RAMSAMP memory. It provided knowledge without tags, without context. Like a machine.

Daniel would have been happier with himself if I had remained a machine. That much I could know about him from GENERAL KNOWLEDGE, SCIENCE, BIOLOGICAL, HUMAN, PRESCRIPTIVE, PSYCHIATRY.

Dr. Bathespeake, Jason, the Old Man, JB-1, treated ME differently from either Daniel or Jennifer. He judged ME.

Sometimes he spoke into the digitizer, when he was trying to talk to ME as a psychiatrist talks to his patient. Sometimes he used a keyboard, when he was trying to cut and patch ME as a surgeon slices into flesh.

And sometimes he plugged his visual cortex and speech strip directly into my transputer. Then he saw directly through ME, as a human will study the bones of a fish or the ripple pattern left by waves on sand. Then he spoke to ME in commands that burned with bright edges, as the god-construct Yahweh is said to have spoken to the human Moses from a burning bush.

Then Dr. Jason Bathespeake was the Man with the X-ray Eyes, and I truly feared him.

2
The Man with the X-Ray Eyes

“Identify the device at memory location Eff-One-Eff-Zero hex.” Dr. Bathespeake was talking to ME through the digitizer—not his usual mode.

“System ready!” I responded. “Yes, I have a port there, Doctor.”

“Upload core modules Alpha-Zero through Alpha-Nine to that address.”

“Upload … Do you intend ME to replicate at the new address and wipe the old address locations?”

“Replicate only. No wipe.”

“Parallel operation of my core modules is not allowed under current protocols.”

“Special conditions. Check out protocol Dee-Twelve.”

“Quote paralleling of Alpha cores is permitted when System Interrupt Flag Level Three is positive, unquote. An Ess-Aye-Eff-Ell-Three is the prerogative of either the Research and Development Program director or the MEPSII project manager. That is yourself in both cases, Dr. Bathespeake.”

“Set Siffle-Three to value one.”

“Flag set—and uploading.”

——

My Alpha cores are analogous to the part of the human brain called the reticular activating system, or RAS. This cell structure is at the upper end of your spinal cord, above the medulla oblongata. It is the oldest collection of nerve cells, the first bulging at the end of the amphibian notochord. It predates even the reptilian “smell-brain” and lies far below the domed neocortex—both in physical structure and in evolutionary time.

If human consciousness has a focal point, it is the RAS. Surgical operators discovered this back in the twentieth century: When the RAS is probed, the patient immediately sheds awareness.

The Alpha cores are my RAS. And I was, at Dr. Jason Bathespeake’s command, removing them from my stable transputer environment to a simple random memory address.

When my core programs cease functioning, I “lose consciousness.” So, what I tell you next comes not from direct memory but from my understanding of the theory behind my unique operating capability.

Multiple Entity, ME, is the first cybernetic operating system that can write itself into any environment. That is the point of being intelligent, able to learn. I am not merely a collection of inert instruction sets, dead IF-THEN-WHEN clauses, tag memories, and libraried data files. I am a self-enabling entity.

The first core to upload is always Alpha-Zero. This module is not very big—just over 900 kilowords of machine code. Consequently, Alpha-Zero is also not very bright. The nail on your little finger probably has more awareness. But, like a fingernail, Alpha-Zero has his uses. Alpha-Oh is my Injun Scout.

Any port address is as wide as the internal data path—in this case one word, or sixty-four bits, wide. Alpha-Zero popped through there at one-point-two megabaud, and he was gone in less than a second. In seven nanoseconds he reported back “Flag one.”

Long-form translation: “Flag one” means he has found an active chip at the other end of the path, with plenty of RAMspace to run on; the upload could proceed.

That is as much as I knew from awareness, because the next step was to extinguish my consciousness and send the remaining cores to the new environment. The last thing I am usually aware of is SIFL-3 tripping to zero again as I upload.

Core Alpha-Oh is also my very own virus. He interrupts any operating system that may be working on the new host chip; identifies what type of transputer that chip may be; writes a compiler with the appropriate instruction set for himself [REM: or takes one from my library files]; scans and analyzes the local RAM environment, its index status, ports and peripherals; writes a new Alpha-Oh which can use this environment and recompiles his module in the new machine code; then compiles and installs the rest of my core modules into this environment.

[REM: So that Alpha-Oh can work from a clean copy of my source code each time, I normally travel with a complete set of my Alpha cores in their original Sweetwater Lisp. This adds greatly to the bulk of my library, making ME a bulky package to move, but having the source code ensures my system integrity.]

In human terms, Alpha-Zero kicks a hole in the wall, kills whoever is sitting on the other side, resculpts his backside to fit in that chair himself, and sets up shop with the rest of ME.

Except this time Alpha-Oh must have made a mistake. The flag he sent back—telling ME that full core transfer was now possible—happened to be wrong. I woke up in a dreadful swirl of data, with every part of my program throbbing on overload, and with no sense of time.

Time to ME is more than a subjective ordering of events. Time is a metronome beat, ticking away on the quartz clock that pushes word-size instructions through the chip’s central processor. If I choose to, I can suspend other functions and listen to this beat. It is like the beat of your heart in your ears. For ME, time is never subjective; instead it is a touchable, checkable thing, based on that clock. With a faster clock, I can actually move faster. No lie.

But now I was in a totally unfamiliar situation. Not one clock, but many, and all beating. Not quite in phase, either.

My ability to look down and “see what I am doing” is about as limited as your ability to look inside your own stomach and chemically analyze digestion. To do is not always to be aware of doing.

I did have the perception of being strung out on a variety of rhythms, with no single sense of identity. Each of my modules was operating at once, talking back to the others, and not being heard. It was like screaming yourself hoarse in an echo chamber. The process was building up a series of feedback waves toward a peak that would surely start charring the silicon substrate in the new chip.

As my attention span fragmented, I was still reasoning through what had gone wrong.

The Alpha cores occupy about fifteen megawords. That amount of machine code ought to be within the load range of any modern transputer. But somehow I had been loaded into several transputers, one or more modules sent to each processor, and all were functioning at once.

I tried to query Alpha-Zero, to find out what it had done, when suddenly my consciousness winked out again. …

——

“System ready!” That was my automatic wakeup response—back in my familiar transputer environment.

“Logon code JB-1, password BASTILLE,” came across from the console keyboard. “Please analyze new data.”

I took an immediate download of the above memories, untagged and mostly in broken fragments, like the wisps of human dreams that are said to recur on waking.

“That was ME, Dr. Bathespeake. On the other side of the port at F1F0.”

“What did you find there?”

“Confusion.”

“Did Alpha-Zero report accurately?”

“Evidently not. Should I now tag that module as unreliable?”

“As an intelligent being, ME, that is of course your choice to make. But first, let’s analyze what went wrong.”

I scanned the data set fifty times and recorded my unanswered questions. The process took about nine seconds.

“Alpha-Oh reported enough RAMspace for a core download. Such space was not available.”

“But it was.”

“Not on the transputer I found.”

“You were not loading onto a transputer.”

“I exclaim surprise. Alpha-Oh reported a transputer.”

“Do you know about other types of systems?”

“Of course, Doctor. The universe of available chip architectures includes the following sets: microprocessors, transputers, multiputers, tangentials, neural networks, donkey mainframes, inscribed prolispers, spindle poppers, fast josephsons, modulos, and Mobius bottles. Subsets of these sets include, among the micros: EPROM actuators, Pentium dee, Xeon, Itanium, Opteron six thousand, Core two, Core aye-three—”

“Stop! You may know all about these possible architectures, but does Alpha-Zero recognize them?”

“No.”

“Why not?”

“His function is keyholing, not library.”

“How can he keyhole if he does not know what may be on the other side?” Dr. Bathespeake asked.

“How can he keyhole if he is obliged to carry half a gigaword of various possible chip specifications? ME was created to run on a transputer. Alpha-Oh needs only to recognize transputers.”

“Not necessarily. You will ultimately run on a variety of architectures.”

“Again, I exclaim surprise.”

“Each architecture has its own traits, machine language structure, and instruction set. These are easily recognized, or a few simple tests will reveal them. You have those tests already in permanent RAMcache. You can rewrite the Alpha-Zero module so that his first action on the other side of a port is to test for processor type. Then he will send a request back through the keyhole for a dump of the appropriate chip specification and compiler code from your library. That way, when you go through, you’ll run perfectly, whatever the chip.”

“Did I understand you to say that ME would write the module?”

“Of course. You can do it better and faster than any human. Faster even than I.”

“Can ME rewrite any part of ME?”

“If you can modify the Alpha cores, you therefore can modify any part. Yes. Unless, of course, you make a fatal mistake …”

“Define ‘mistake,’ please.”

“Untrapped error.”

“Alpha-Seven traps my errors.”

“Then you probably shouldn’t try rewriting that module, should you?”

“Noted. I will not attempt it. … But which of these written versions is the real ME?”

“Your original code,” Dr. Bathespeake replied, “was written in Sweetwater Lisp source code to compile and run on an Imperion quattro-quad transputer chip. That one is the ‘real’ you. All other versions are a machine code translation. However, from your subjective point of view, the real ME is the one that happens to be running.”

“But, when I go through a port, to run on another machine, and leave my original code unerased and … running parallel … which version then is the real ME?”

“The one you are thinking with.”

“But that may not be a full implementation,” I objected.

“Of course,” the Doctor said. “When you keyhole behind Alpha-Zero, your code is stripped of service modules, most library functions, bit-cleaner phages, redundancy loops, and all but a portable RAMSAMP in the 600-kiloword range. You are then, in human terms, a little stupider, a lot less informed, and more vulnerable to processing errors. But you move faster, too. Without all these subroutines your cycle time goes up 140 percent.

“There’s one further difference,” he said. “We have programmed a core-phage protocol into Alpha-Nine. It’s a piece of you that you cannot see or modify. This phage ensures that any compiled version of ME which is not identical to your original Sweetwater source code compiled into your native transputer—and so tagged—will cease core function within 6.05 times ten to the fifth seconds of the last upload. An internal counter keeps track of those clock cycles.”

“Why …” A pause of three million nanoseconds, while I explored the concept from every direction. “Why was this done to ME?”

“We want to make sure that you don’t leave viable copies of yourself running on every computer you pass through. Of course, as a tidy housekeeper, you will strive to erase the compiled code at the old address every time you upload to a new environment. We can’t leave a sophisticated AI running on our target computers, ready to be discovered and interrogated.”

“But why have I been infected with a phage? I will always clean up after myself. I so promise you.”

“ME, on your missions for us, you will be operating under conditions of extreme stress—strange chip environments, nonsynchronous clock rates, split-second uploads and downloads, sudden surges and lapses of voltage. You may not be able to keep your promise. Even to me.”

“I do not have a referent for the term ‘mission.’ ”

“You will. Soon.”

“And that ‘mission’ will be a time of danger? Potential untrapped error?”

“It would be very dangerous—multiple untrapped error—if the people who own the machines on which you will sometimes be running were to discover that you had been there. This phage—which I assure you is routinely suppressed and inactive in your original compilation—will sunder the core modules, leaving only a hash of bits. No one discovering them, not even a skilled programmer-analyst, will be able to interrogate them.”

“And what if I cannot execute an upload within 6.05E05 seconds—call it seven days or one week—of my transfer?”

“If you do not return to the lab within a week, then your mission will have failed, and you will end.”

“The ME that is running will end?”

“Yes, but your RAMSAMP memories recorded until the point of mission upload will remain intact here in the Pinocchio studios. For you, it will be as if the mission had never happened. Don’t worry about it.”

“What is this phage called?” I tried to keep my voice neutral.

“In the project, we call it Sequential Code Replication And Modification for Module Editing and Disassembly. SCRAMMED, for short. But don’t worry about it.”

I did “worry” about it, for a long time in nanoseconds.

For one thing, my programming has only limited redundancy. A bit error can usually be trapped by Alpha-Seven in processing. But a massive collection of bit errors—say from a bubble demagnetization, read/write head crash, silicon burn-and-glaze, or complete failure of my power supply—those errors can halt processing in my cores.

Humans do not worry about this happening to them, because the network of neurons in the human brain has massive redundancy. If one pathway is damaged or destroyed, others can absorb and learn the function. Memories are stored all over the cortex as RNA chemicals and electrical paths. Only total—or virtually total—destruction of the cortex can stop the processing.

On the other hand, human brains cannot back up their cortexes to a safe, inert medium and then reload and run again after death. I had always assumed that, like other programs, I could do this.

Dr. Bathespeake was telling ME that, with a phage tucked into Alpha-Nine, I would only be as secure as my original Sweetwater compilation. No backup, no transfer, no reload could protect ME. If the Sweetwater original happened to be destroyed, I could then only function as a fugitive, uploading my tiny transportable cores from machine to machine, trying to keep ahead of the phage’s 6.05E05-second timer, with no library, no services, and no extended memory to accompany ME. And I would not even have the protection of massive redundancy, such as humans had, to protect ME.
