Cessini raised his hand, and then spoke. “My dad said he’s going to take me and that girl sitting down there to the fair this summer to win stuffed animals and eat some cotton candy,” he said. Meg glanced up, but then hunched back down over her game. “I let her play with my tablet. My dad made it for me.”
Daniel rested his hand on Cessini’s arm.
“Interesting, young man,” the host said. “But we’ll get to you in a minute.” Then the host snapped his fingers and pointed. “Wait a minute, Professor. You dodged my question. Free will or fate?”
“There is no free will to choose,” the professor said. “Our fundamental behaviors guide us whether we realize it or not. We’re pre-programmed over thousands, millions of years.”
“It’s far more recent than that—” Robin said as the man next to her sprang forward first.
“I am convinced—” the man interrupted.
“Reverend?” the host allowed.
“I am convinced societal fear is due to the decay of religious unity,” the reverend said. “And I wholeheartedly disagree with Andy’s evolutionary hypothesis. Individuals who practice their faith most devoutly are those least afraid of dying. But an entire generation of technology is taking us further away from our faith. And without faith, we have more fear. More fear raises feelings of loss of control, and with that, the loss of free will. But ironically, this loss of free will one feels in the absence of religion more than reinforces my theological view that a higher being is the one in control. Free will belongs only to God.”
“If you’re suggesting we shut off the flow of technology and everything will be fine,” Robin said, “then that’s a dream that simply won’t work. Speaking both as a mother and cognitive neuroscientist, I can tell you molecular processes are directly linked to behavior. We don’t have to go back thousands or millions of years. It’s happening now. I’ve studied fear-conditioning behaviors in the real world, the processes that bring about long-term potentiation, or LTP. LTP is the induction of synaptic plasticity by the electrical or chemical stimulation of the lateral amygdala neural circuits.”
“Whoa, you just blew my lateral amygdala circuits,” the host said. “Anyone ever tell you that you talk like a computer?”
“I’ll vouch for that,” Daniel said, leaning in.
“To translate,” Robin said, “fear is learned.”
“A chicken pecks on a kernel of corn and gets shocked, so it decides to eat lettuce instead,” the host said.
“Exactly. It’s called Hebbian synaptic plasticity. For modeling, in the lab, we’ve already completed large-mammal brain emulation. Soon, we’ll be announcing the completed scan and modeling of a human brain in its entirety, beneath the connectome’s one hundred trillion neuron pathways, to the level of the synaptome. We’ll scan every property down to the individual receptors and small molecules in the synapse, every signal state, including phosphorylation and methylation of the proteins. Given that model, we’ll be able to measure the very subjects of our discussion: free will, fate, and fear. But for now, no, it’s not free will. It’s chemical fate.”
Cessini swiveled to find what Daniel was looking for on the ceiling. The lights, the tracks? There was nothing different up there, but then, Daniel’s eyes were actually closed, and he was grinning, listening. When Daniel opened his eyes, he looked happily at Robin. She shifted in her chair and ran her fingers across the top of her ear to tuck back her hair. Pressed tight to her lobe was a tiny red earring in the shape of a key. She didn’t wear much jewelry, but it made her look really pretty.
“Back in grad school, I studied the problem with imagination,” Robin said. “How the mind goes immobile in the face of a constant reminder of death. DigiSci was searching for the body’s longevity switch, and found the mind’s counterpart instead, a death switch, if you will, a trigger. The hypothesis was, by activating that trigger, the person would enliven with a sense of imperative, a ‘live now’ mentality to enjoy life. But we observed the opposite effect. It depressed the hell out of the mice,” she said with a laugh.
“It’s nice to see you have a sense of humor,” the host said.
“I do,” she said. “The mice seemed tormented knowing death was imminent, even suicidal to get it over with. Dizziness, hallucination. And thankfully, DigiSci abandoned that line of research. But some of the early, most promising concepts were refined and repurposed into the development of the early VaXin series of sprays—” She stopped and squinted, put her hand to her forehead. “No. Actually, no. I’m mistaken.”
“I don’t follow,” the host said. “Which sprays?”
Robin put her hands down into her lap and lowered her head. She spoke again, but with more reserve. “I agree there’s a definite fear of loss of control. A loss of control to government, to technology, to corporate intrusions. So, laws are passed to lessen the impact. One of which is that all sims and chatbots must self-identify when initiating a session or when directly questioned.”
A digital stamp appeared and rotated at the bottom right corner of the screen for the HACM Lab US at the University of Washington in Seattle, sister lab to the Human & Cognitive Machines lab, HACM Lab AU in Tasmania.
“Snubbing that law only promotes further unease,” Robin said. “People like to know they’re not being interrogated and with whom their ideas are being shared. They like to think they have a choice, even though they might not.”
“Understood,” the host said.
“Reverend, you’ll appreciate this,” Daniel said. “My father gave me the name Daniel after the man who was called to interpret the dreams of the king.”
“I do appreciate the reference,” the reverend said, “but who’s the king in your analogy?”
“I don’t think I’ve met him yet,” Daniel said. “But you never know. Fate works in mysterious ways.”
“Daniel, by way of introduction, you’re here tonight as an invited guest of Robin Blackwell, an alumna of the university,” the host said.
“Thank you. Robin was kind enough to invite me and my son as guests of your panel, thinking we might have a unique perspective to offer.”
“Thanks to the hand of fate just a few years ago,” Robin said, “our paths crossed over our children. We met at a doctor’s appointment. If one of us wasn’t early or late, we might never have met.”
“It was me. I was late,” Daniel said.
“Congratulations,” the host said. “It looks like fate is the unanimous winner so far.”
“No, just a minute,” Daniel said. “I think the most direct answer to your question on why you missed your plane is algorithms. It’s not because of what happened millions or thousands of years ago or even what we’ve learned in our lifetimes. It’s because the algorithms in our brain are processing far faster than we are even aware. All possible decisions are pre-calculated in the microseconds before your body responds, or you even know why you’ve made such a decision in the first place. Your brain decided on a 0 or a 1 before you even knew why you picked a door on the left versus a door on the right. Understand the 0s and 1s of the brain, and you can play the mind like the keys of a piano. Tune an off key. Replay entire days. Reduce the spikes that are too painful to bear. So given the advantage of microsecond speed, I’d have to say fate is the winner. That is, according to my first attempt with the question.”
“I’ve reviewed your bio,” the host said to Daniel. “You have no formal education. Bringing you here to this stage, Robin is the biggest success you’ve had to date, is she not?”
“I am self-taught, yes,” Daniel said, “and you’re pretty transparent, you know? You’re mixing your context references. But to answer your question, yes, Robin’s invitation ‘to date,’ as in bringing me here with her to speak, is my biggest success.”
The host ticked a smile. “Maybe you mixed your references. Being that this is your biggest success ‘to date,’ as in ‘up until now.’ So, tell me in your own words, why are you here?”
“Did you know he still listens to really old music?” Cessini said. “And he still makes his own parts by hand sometimes instead of printing them.”
“I’m nostalgic for the old days,” Daniel said as Robin snickered. “But I’m learning. Teaching myself to code. I’ve got some great ideas on a new kind of test, I think. An inversion test.” Then he glanced up at the host on the screen. “You’d love it. I’m also thinking of self-publishing a paper on algorithm compression, maybe kernelling. I don’t have that fully fleshed out yet, but I think it could transform robotics. I don’t know where it’ll take me. But as a dad, I interpret fate and free will every day.” He nodded like he figured out something great. “So, I guess you could say by interpreting Cessini’s world that, yeah, maybe that makes him the king.”
“That’s a wonderful segue,” the host said. “And as a father myself, so you don’t think I’m a complete faux pas, I have a founders’ relationship with the prestigious Journal of Advanced Design and Computational Dynamics for Intelligent Systems. If one of your papers pans out, submit it to me. I’ll see what I can do.”
“Thank you,” Daniel said, humbled. The host seemed sincere.
“You’re very welcome. So, introduce your son so he can grant us his unique perspective on our topic of free will versus fate.”
Daniel leaned forward with his elbows up on the table. He pinpointed his focus with his fingers scratching his brows. “Throughout all of history, we have had a symbiotic relationship with water. In order to live in this world, one must learn not to be reactive to water. Seventy percent of our bodies are made of water. Technology is ubiquitous, like water. Since we don’t genetically fear water, we shouldn’t genetically fear technology.”
“And I don’t,” Cessini said.
“Don’t what?” the host asked.
“Fear technology,” Cessini said. “I like it. But I’m reactive to water. Genetically, we think. Aquagenic urticaria.”
“Precisely,” Daniel said as he mapped out mental notes on the table with his hands. “By evolution, Professor, he shouldn’t be reactive to water, but he is. By psychology, Robin, he shouldn’t be conditioned to like water, but he does. Maybe that learned fear will come next, Reverend. We’re all hoping not. But for now, he’s moving forward, not fearful. I even just got him a wave machine that he wanted, and made a sort of bellows lamp in the shape of a squid to go with it that he absolutely loves.”
“Your reactivity to water is not your imagination poking beneath the surface?” the host asked Cessini. “A fear of water induced by some previous event?”
Cessini looked up and shook his head. Daniel said, “No.”
The host stopped short of a follow-up question as a pixelated logo of “DNWR” appeared at the lower right corner of his podium screen. “This has turned into quite the serious discussion,” the host said. “And what are you going to be when you grow up?”
“I dunno,” Cessini said as he swiveled in his chair, then said with bright, lucid eyes, “Maybe I’ll walk on the sky and save everybody before a giant spaceship comes crashing, plhssss, and explodes all over the planet.” He bumped and crashed his fist across the desk.
“Like a superhuman?” the host asked.
“Yeah.” Cessini grinned with a bob of his head. “Something like that. My dad and I can make anything happen. I’m also going to invent a fireman’s hose without water.”
“That already exists as foam,” the host said.
“But mine’s going to shoot nano-tech cells that catch and slurp up the fire back into the hose.”
Daniel folded his hands on the table as he leaned in toward the host. “Technology doesn’t kill the imagination of children. It lets it fly. We might not have free will according to all of our answers so far. But I can only talk about what I know. We’re here in the now, and technology is here to stay, whether we like it or not. Robin is right. There is no source you can just go and shut off. And this gets to the point of exactly why we’re here.”
Daniel closed his eyes, and everyone waited in silence. “Everyone has a digital life assistant in the dash of their car, the devices they carry, the particles in the clothes they wear. All this frees the human mind to run with its own ambition and dreams. Human potential ignites with its own power, and that power feeds back around for the creation of more and greater computer code. The unseen, wonderful beauty of code. Think about it. It’s beautiful, magical, structured, and clean. I go to sleep dreaming about an unseen world of colors and codes. I wake up falling in love. If that dream of true emotion is by any hope some small measure of my doing, then I declare my second answer to your question firmly on the side of free will. My freedom to choose. And for that, I know, Cessini will be right at my side.”
Cessini swiveled in his interview chair, proud as a son could be.
The host turned again to Cessini. “Well, young man, you stick with a father like yours, and I have no fear you’ll get everything your heart desires,” he said. “But unfortunately for me, my fear is that when you get older, you’ll have forgotten all about this interview and the quality time you’ve spent here with me.”
Cessini shrugged his shoulders and spun his chair by force of a single hand on the desk. “No, I won’t.” He pushed for a faster twirl. “We only forget the bad and remember the good.”
“Well, thank you,” the host said. “I’ll take that to mean you think I’m a good host.”
“Yeah, I do,” Cessini said as he skidded two hands on the desk and smiled back as he stopped. “I know I’ll remember this time.”
“The long-term effects of a lot of things are unknown,” Daniel said. “Technology, good or bad, can unleash the power of the mind or take us down the next evolutionary step of fear. We need to choose wisely. The world needs a canary, don’t you think? To keep us on the right path. Long term . . . my bet is on Cessini and the triumph of free will. But then again, it’s a fight with whatever strength you put into it. And as already said, fate has a pretty big head start.”
Daniel winked at Cessini adoringly, a connection only the two of them could share. No camera could have caught it. A memory imprinted on a soul. A packet. A moment he could remember.
“Well, Daniel,” the host said, “it appears you may very well have found your king.”
“He’s my protégé. My number one engineer.”