Authors: Douglas E. Richards
Davinroy’s eyes narrowed, and he looked decidedly angry for the first time. “You’ll just have to up your game, Cris!” he said sharply. “I don’t want to hear excuses! He’s very good. You’re going to have to be better!”
“Understood, sir.”
The president
glared at him for several seconds before finally relenting. “How goes the media
strategy I outlined last night?” he asked.
“We invoked
national security with the guests. We think the story is contained, at least
for another day or two, but there is no way to be sure.”
“Not good enough!
If it does leak—which it will—you need to implore the media to sit on it. Once
it breaks, no story will be bigger. You’ll need to argue that a media circus would
have an unpredictable effect on Kevin Quinn. It could well set him off further.
Not to mention spooking people who might be in the path of a highly trained,
psychotic killer.”
“And we’d find
ourselves drowning in false sightings.”
“Right,” said
Davinroy. “Make that point also. Just be sure to make it crystal clear to all
involved that media silence will give you a freer hand.”
“Yes, sir.”
“You do see why I
want to keep this under wraps until we capture Quinn, right? This way we can
unveil what happened at the same time we tie it up with a ribbon.”
“Understood,” said
Coffey. “But no matter what, Mr. President,” he added, “you can rest assured
that we will not let Kevin Quinn get anywhere near you. We have to assume that
his goal hasn’t changed. That if he can find a way to get to you, he’ll try
again.”
Coffey shook his
head and his eyes burned with absolute resolve. “But I don’t care how good he
is. He won’t get his chance. This time we’ve been forewarned. We won’t be
sloppy again.”
“I know you won’t, Cris,” said the president. “But this one
is personal, for obvious reasons. So I want daily updates. And I’d like to get
more color on your manhunt. You can start by telling me why you think Kevin
fled to Grand Central Station.”
“Yes, sir,” said Cris Coffey.
11
Professor Rachel Howard watched the wheels turn in the heads of her six
graduate students as they considered her question: should scientists pursue research
intent on erasing memory rather than restoring it?
She calmly took another swig of peach tea and readied herself for a
stimulating discussion. This part of the course never failed to be fun and
interesting, and often surprising, as she forced these students to flex mental
muscles they rarely used, since they were typically far more experienced, and
adept, at wrestling with the precision of science than they were with the
imprecision of behavior and ethics.
She always began the proceedings by addressing the subject of memory
erasure, which never failed to get a lively discussion going. As usual, the
students looked appalled at the prospect. Each had dreamed of being the hero
who found the absolute cure for Alzheimer’s, not of doing just the
opposite.
“This is a pretty scary thought,”
said Sanjeev Shaw, finally putting a voice to what they all were thinking.
Shaw’s fellow students nodded their agreement.
“Scary, maybe,” said Eyal Regev after a few seconds of silence had passed.
“But isn’t it also unstoppable? Isn’t that what transformative technologies are
all about? They’re disruptive. And they can be used for both good and ill.”
Rachel nodded, impressed. “Well said,” she replied. “When technology is
expensive, and limited, it’s easy to ensure it doesn’t get into the wrong
hands, that it isn’t used for the wrong purposes. Genetic engineering once
required multimillion dollar labs and was more of an art than a science.
Computers were once bigger than houses. When this was the case, abuse was
virtually nonexistent. A current example is the Large Hadron Collider. We certainly don’t have to worry about one of these being misused, simply because there is only one.”
The professor leaned forward, still seated on the edge of her desk. “But
once technology becomes widespread, inexpensive, commonplace, policing it
becomes impossible. At that point society has to find ways to adapt to all
facets of the technology, good and bad.”
She paused. “So maybe it’s time for the good guys to get ahead of the curve.
When scientists were perfecting the computer, they never imagined that others
would invent computer viruses, or that their new invention would one day be
susceptible to hacking. They never guessed the computer would one day be
exploited to steal money and identities, organize hate groups, or sabotage
centrifuges.
“But we’ve been through enough transformative technology revolutions now to
know better. We don’t have the luxury of pretending bad actors won’t find a way
to exploit new technologies. We can’t stick our heads in the sand when it comes
to future viruses, as it were. So while all of you, no doubt, are intent on
inventing the computer, I’d urge you to at least consider pursuing research
intended to counter the computer viruses sure to arise. Someone needs to do this. And if we’re ahead of the
curve this time, perhaps we can minimize the damage. This will be a recurring
theme during this and the next session, as we go one by one through the broad domains
that neuroscience will be able to impact in a profound way. We’ll look at both the
limitless promise of our coming capabilities, and their potential to cause us great
peril.”
Rachel hopped off the side of the desk and to her feet, standing in front of
her class now for the first time. “So is it ever ethical to induce amnesia?” she asked.
“Yes,” said Sherry Dixon without hesitation. “After traumatic events. The
memories of these events can be debilitating. Their precise excision could be a
cure for these people.”
“While it’s a less clear-cut case,” added Deb Sorensen, “those suffering from various phobias might also benefit from
precision removal of certain memories.”
“Both excellent points,” said
Rachel, knowing that research had been ongoing to accomplish both of these
goals for many years, but not wanting to sidetrack the discussion.
“It’s almost too bad there are benefits from this research,” said
Greg Feldman. “Because this gives bad actors an excuse to pursue the
technology.”
Regev shook his head. “That may be true,” he said, “but like the professor mentioned,
when research tools become as cheap and accessible as neuroscience tools are
about to become, bad actors won’t need an excuse, or permission, to work on
whatever they want.”
“Yes, but like the professor also pointed out,” said Shaw, “this should give
other scientists the motivation to counter this threat. To work on detecting
when such memory loss drugs or techniques have been used against a person’s
will.”
Regev looked unimpressed. “Maybe,” he said. “But while knowing that this has happened is great, it won’t restore the memories that have been erased.”
Rachel was quite pleased. She could practically feel the juices flowing in
this group of students now, and these sessions would serve their purpose—get
the next generation of neuroscientists out of their comfort zone. Get them to
think holistically about what they were doing, and the implications, both good
and bad.
“Eyal is correct,” she said. “But so what? This just means there is another
great research project waiting out there. The restoration of memories after they’ve been erased. A one-two
punch. Detect when this technology has been used illegally, without permission,
and then restore the victim’s memories when it has. Why not? Computer
scientists are able to restore data from hard drives that have been wiped.”
“Or better still,” said Regev, “find a way for the brain to block such
attempts. A prophylactic drug that immunizes people against malicious memory
attacks.”
Rachel nodded at the Israeli. “Outstanding,” she said in delight. “That
would be even better.”
“On the other hand,” said Sherry Dixon, “I watch a lot of thrillers, and
being able to permanently delete memory would save lives in certain
circumstances. Suppose you’re a villain, and someone stumbles on your plan, or
your actual crime. Today, you’d have to kill them to keep them quiet. But if
memory erasure were perfected, you could just surgically remove the offending
memories, and no one would be the wiser.”
Rachel grinned. “Okay, then. An upside that would only help a few people, in
very specific circumstances, but an upside nonetheless. So no need to kill the
guy informing on you, who is about to go into the witness protection program.
Just make him forget what he knows.”
Eyal Regev put on a pained expression. “No offense, Professor How—Rachel,”
he said in amusement, “but you should stick with your day job. You’d make a
lousy criminal kingpin. I mean, the guy was going to rat you out. So you’d
still have to kill him.” He flashed an impish smile. “You know, to send a
message.”
Rachel laughed out loud. “Remind me not to get on your bad side, Eyal,” she
said, raising her eyebrows.
The class discussed induced memory loss for another ten minutes or so before Rachel decided to move on.
“Let’s turn to another
application that is near and dear to my heart,” she began. “Learning. Are all
of you familiar with the movie The Matrix?”
The movie was dated, but also
somewhat iconic, and Rachel found that most of the time it was known to all of
her students. She soon learned that this class was no exception.
“In my view,” said Rachel, “the
most extraordinary line in the entire movie was, ‘I know kung fu.’ Remember that one? They zap knowledge of multiple
martial arts straight into the character Neo’s brain. One second he doesn’t
know how to fight,” she continued, “and the next he’s a black belt in just about
every fighting discipline ever devised. Just like loading a program into a
computer. Keanu Reeves, who plays Neo, says this line in equal parts shock and
delight, as he realizes the knowledge of decades of intense training has just
appeared magically in his mind.”
“So I’m guessing your second
favorite scene was on the roof with the helicopter?” said Regev.
“Good guess,” she said
appreciatively. “It is indeed. For those of you who don’t remember it, Trinity
and Neo are on a roof after being attacked. Neo sees a military helicopter that
is parked there. He turns to Trinity and says, ‘Can you fly that thing?’ And
Trinity’s
answer? Not yes or no, as we
might expect, but simply, ‘Not yet.’”
Rachel paused to give her class
time to bring the memory of this scene to the surface. “I mean, it’s the perfect response,” she said
enthusiastically. “Because Trinity immediately has this knowledge zapped into
her brain. Less than a minute later she’s an elite pilot, and away they go.”
“It was truly an awesome scene,”
agreed Feldman, and the other five students nodded their heartfelt agreement.
“At first blush, it’s not
difficult to argue that this would be about the coolest capability ever,” said Rachel, almost glowing. “And
what if you didn’t need to be in a matrix or have a jack in your skull? What if
you could find less invasive ways to pull this off? But with the same end
result: instant education. And not just education, not just superficial
knowledge, but deep knowledge. Expertise.”
She paused, telling herself she needed
to throttle back on her enthusiasm to ensure an unbiased conversation. This was
more difficult than she realized, as she had been working toward the goal of
instant education for many years. And while she didn’t advertise this research,
and had yet to publish most of her findings, there were a number of colleagues
who knew what she was trying to do, and it wouldn’t surprise her if one or more
of these students were aware, as well.
“Some progress toward this goal
has already been made,” she noted. “Enough to be fairly sure this will someday
be achievable, especially now that the full map of the brain is becoming
available. With luck, we’ll be able to pull this off in five to fifteen years. So
what are your thoughts about this?” she asked, with just the hint of a smile.
“It would be incredible!” said
Sanjeev Shaw in awe, and beside him, Sherry Dixon nodded her enthusiastic
agreement. “The entire population of the world could be highly educated.
Illiteracy would be wiped from the map.”
“So does everyone agree that this
would be a good thing?” asked the professor.
There was an immediate and
unanimous chorus of agreement.
“Okay,” said Rachel. “But there
must be a downside, right? I mean, everything has a downside. Even a pool of the richest milk chocolate can be used to drown
someone.”
This example brought smiles to
the faces of several students.
“Let me start us off with one
possibility,” said the professor. “I read a science fiction story when I was a
kid. One that was many decades old even then. The Earth had been traveling
through a vast region of space that, unbeknownst to us, contained some kind of
invisible electromagnetic dampening field. One day the solar system and Earth finally moved beyond this great cloud, and every man, woman, and child became brilliant overnight. Beyond brilliant. Everyone could educate themselves as fast as they could speed-read. But one effect of
this was that no one was willing to do menial labor. Factories stopped working.
So if what I call Matrix Learning were perfected, isn’t this one possible downside?”
“Yes,” said Michele, “but this
would only be temporary. Society would adjust. All of these knowledgeable people
would come up with ways to eliminate menial jobs. They could create a
paradise.”
Rachel nodded. “I tend to agree,
but it’s something to tuck into the backs of our minds. So any other downsides?
Is this instant education fair? You spend decades of your life, thousands and
thousands of hours, struggling to cram a world-class knowledge of neuroscience
into your brains. During grad school, while your friends are at the beach,
you’re doing all-nighter after all-nighter. Later, you work your ass off
keeping up with the latest experimental literature, slaving endless hours in
the lab.”
She paused. “And then Matrix
Learning is invented, and a day later, the guy who spent the last decade
getting high and playing video games can have your same knowledge. Effortlessly.
Is that fair?”
This question brought deep
frowns all around.
“Yeah, that would kind of suck,”
said Greg Feldman.
“So in some ways,” said Regev,
“it would be in the best interest of those who are now highly educated to
prevent this technology from happening. To protect their turf, their
advantage.”
“And the multibillion-dollar educational system would become extinct overnight,” added Sherry Dixon. “Hundreds
of thousands of teachers across America would be out of work. The university
system, down the drain.”
“Maybe not,” said Rachel. “At
least not at the grade school and high school level. Maybe you’d want to limit
this technology to those who are eighteen and older. Maybe you’d want to leave
the current grade school and high school educational system intact. Why?
Because people still need to learn how to learn. If they don’t know how to learn
for themselves, they won’t be able to wield the knowledge that is zapped into
their heads, adapt it for other applications, expand on this knowledge.”
Rachel paused to let this sink
in. “We have calculators that can do long division, but we still force our kids
to do it the hard way. Why? So they understand the concepts behind the results.
So they have the proper background for further studies.”
Regev shook his head. “I hate to
always be the skeptic,” he said, “but banning this technology for minors will
only work if you can detect when this Matrix Learning has been done. Or if
preventative measures are put in place. A molecular chastity belt for the mind,
that advertises when it’s been violated. If not, kids will take the shortcut.
They can’t use a calculator in class because a teacher would see it. But if
they could have high school chemistry implanted into their minds, a teacher
would never know it. Cheating would become rampant, and not just among kids.
Many parents would be actively involved, willing to do anything to help their
kids get ahead.”