Freudianism and Marxism—and with them, the entire belief in social conditioning—were demolished so swiftly, so suddenly, that neuroscience has surged in, as if into an intellectual vacuum. Nor do you have to be a scientist to detect the rush.
Anyone with a child in school knows the signs all too well. I am intrigued by the faith parents now invest—the craze began about 1990—in psychologists who diagnose their children as suffering from a defect known as attention deficit disorder, or ADD. Of course, I have no way of knowing whether this “disorder” is an actual, physical, neurological condition or not, but neither does anybody else in this early stage of neuroscience. The symptoms of this supposed malady are always the same. The child or, rather, the boy—forty-nine out of fifty cases are boys—fidgets around in school, slides off his chair, doesn’t pay attention, distracts his classmates during class, and performs poorly. In an earlier era he would have been pressured to pay attention, work harder, show some self-discipline. To parents caught up in the new intellectual climate of the 1990s, that approach seems cruel, because my little boy’s problem is … he’s wired wrong! The poor little tyke—the fix has been in since birth!
Invariably the parents complain, “All he wants to do is sit in front of the television set and watch cartoons and play Sega Genesis.” For how long? “How long? For hours at a time.” Hours at a time; as even any young neuroscientist will tell you, that boy may have a problem, but it is not an attention deficit.
Nevertheless, all across America we have the spectacle of an entire generation of little boys, by the tens of thousands, being dosed up on ADD’s magic bullet of choice, Ritalin, the CIBA-Geneva Corporation’s brand name for the stimulant methylphenidate. I first encountered Ritalin in 1966, when I was in San Francisco doing research for a book on the psychedelic or hippie movement. A certain species of the genus hippie was known as the Speed Freak, and a certain strain of Speed Freak was known as the Ritalin Head. The Ritalin Heads loved Ritalin. You’d see them in the throes of absolute Ritalin raptures … Not a wiggle, not a peep … They would sit engrossed in anything at all … a manhole cover, their own palm wrinkles … indefinitely … through shoulda-been mealtime after mealtime … through raging insomnias … Pure methylphenidate nirvana … From 1990 to 1995, CIBA-Geneva’s sales of Ritalin rose 600 percent; and not because of the appetites of subsets of the species Speed Freak in San Francisco, either. It was because an entire generation of American boys, from the best private schools of the Northeast to the worst sludge-trap public schools of Los Angeles and San Diego, was now strung out on methylphenidate, diligently doled out to them every day by their connection, the school nurse. America is a wonderful country! I mean it! No honest writer would challenge that statement! The human comedy never runs out of material! It never lets you down!
Meantime, the notion of a self—a self who exercises self-discipline, postpones gratification, curbs the sexual appetite, stops short of aggression and criminal behavior—a self who can become more intelligent and lift itself to the very peaks of life by its own bootstraps through study, practice, perseverance, and refusal to give up in the face of great odds—this old-fashioned notion (what’s a bootstrap, for God’s sake?) of success through enterprise and true grit is already slipping away, slipping away … slipping away … The peculiarly American faith in the power of the individual to transform himself from a helpless cypher into a giant among men, a faith that ran from Emerson (“Self-Reliance”) to Horatio Alger’s Luck and Pluck stories to Dale Carnegie’s How to Win Friends and Influence People to Norman Vincent Peale’s The Power of Positive Thinking to Og Mandino’s The Greatest Salesman in the World—that faith is now as moribund as the god for whom Nietzsche wrote an obituary in 1882. It lives on today only in the decrepit form of the “motivational talk,” as lecture agents refer to it, given by retired football stars such as Fran Tarkenton to audiences of businessmen, most of them woulda-been athletes (like the author of this article), about how life is like a football game. “It’s late in the fourth period and you’re down by thirteen points and the Cowboys got you hemmed in on your own one-yard line and it’s third and twenty-three. Whaddaya do? …”
Sorry, Fran, but it’s third and twenty-three and the genetic fix is in, and the new message is now being pumped out into the popular press and onto television at a stupefying rate. Who are the pumps? They are a new breed who call themselves “evolutionary psychologists.” You can be sure that twenty years ago the same people would have been calling themselves Freudians; but today they are genetic determinists, and the press has a voracious appetite for whatever they come up with.
The most popular study currently—it is still being featured on television news shows—is David Lykken and Auke Tellegen’s study at the University of Minnesota of two thousand twins that shows, according to these two evolutionary psychologists, that an individual’s happiness is largely genetic. Some people are hardwired to be happy and some are not. Success (or failure) in matters of love, money, reputation, or power is transient stuff; you soon settle back down (or up) to the level of happiness you were born with genetically.
Fortune devoted a long takeout, elaborately illustrated, to a study by evolutionary psychologists at Britain’s University of Saint Andrews showing that you judge the facial beauty or handsomeness of people you meet not by any social standards of the age you live in but by criteria hardwired in your brain from the moment you were born. Or, to put it another way, beauty is not in the eye of the beholder but embedded in his genes. In fact, today, in the year 2000, if your appetite for newspapers, magazines, and television is big enough, you will quickly get the impression that there is nothing in your life, including the fat content of your body, that is not genetically predetermined. If I may mention just a few things the evolutionary psychologists have illuminated for me recently:
One widely publicized study found that women are attracted to rich or powerful men because they are genetically hardwired to sense that alpha males will be able to take better care of their offspring. So if her current husband catches her with somebody better than he is, she can say in all sincerity, “I’m just a lifeguard in the gene pool, honey.” Personally, I find that reassuring. I used to be a cynic. I thought the reason so many beautiful women married ugly rich men was that they were schemers, connivers, golddiggers. Another study found that the male of the human species is genetically hardwired to be polygamous, i.e., unfaithful to his legal mate, so that he will cast his seed as widely as humanly possible. Well … men can read, too! “Don’t blame me, honey. Four hundred thousand years of evolution made me do it.” Another study showed that most murders are the result of genetically hardwired compulsions. Well … convicts can read, too, and hoping for parole, they report to the prison psychiatrist: “Something came over me … and then the knife went in.”
Another showed that teenage girls, being in the prime of their fecundity, are genetically hardwired to be promiscuous and are as helpless to stop themselves as minks or rabbits. Some public school systems haven’t had to be told twice. They provide not only condoms but also special elementary, junior high, and high schools where teenage mothers can park their offspring in nursery rooms while they learn to read print and do sums.
Where does that leave “self-control”? In quotation marks, like so many other old-fashioned notions—once people believe that this ghost in the machine, “the self,” does not even exist, and brain imaging proves it, once and for all.
So far, neuroscientific theory is based largely on indirect evidence, from studies of animals or of how a normal brain changes when it is invaded (by accidents, disease, radical surgery, or experimental needles). Darwin II himself, Edward O. Wilson, has only limited direct knowledge of the human brain. He is a zoologist, not a neurologist, and his theories are extrapolations from the exhaustive work he has done in his specialty, the study of insects. The French surgeon Paul Broca discovered Broca’s area, one of the two speech centers of the left hemisphere of the brain, only after one of his patients suffered a stroke. Even the PET scan and the PET reporter gene/PET reporter probe are technically medical invasions, since they require the injection of chemicals or viruses into the body. But they offer glimpses of what the noninvasive imaging of the future will probably look like. A neuroradiologist can read a list of topics out loud to a person being given a PET scan, topics pertaining to sports, music, business, history, whatever, and when he finally hits one the person is interested in, a particular area of the cerebral cortex actually lights up on the screen. Eventually, as brain imaging is refined, the picture may become as clear and complete as those see-through exhibitions, at auto shows, of the inner workings of the internal combustion engine. At that point it may become obvious to everyone that all we are looking at is a piece of machinery, an analog chemical computer, that processes information from the environment. “All,” since you can look and look and you will not find any ghostly self inside, or any mind, or any soul.
Thereupon, in the year 2010 or 2030, some new Nietzsche will step forward to announce: “The self is dead”—except that being prone to the poetic, like Nietzsche the First, he will probably say: “The soul is dead.” He will say that he is merely bringing the news, the news of the greatest event of the millennium: “The soul, that last refuge of values, is dead, because educated people no longer believe it exists.” Unless the assurances of the Wilsons and the Dennetts and the Dawkinses also start rippling out, the madhouse that will ensue may make the phrase “the total eclipse of all values” seem tame.
If I were a college student today, I don’t think I could resist going into neuroscience. Here we have the two most fascinating riddles of the twenty-first century: the riddle of the human mind and the riddle of what happens to the human mind when it comes to know itself absolutely. In any case, we live in an age in which it is impossible and pointless to avert your eyes from the truth.
Ironically, said Nietzsche, this unflinching eye for truth, this zest for skepticism, is the legacy of Christianity (for complicated reasons that needn’t detain us here). Then he added one final and perhaps ultimate piece of irony in a fragmentary passage in a notebook shortly before he lost his mind (to the late nineteenth century’s great venereal scourge, syphilis). He predicted that eventually modern science would turn its juggernaut of skepticism upon itself, question the validity of its own foundations, tear them apart, and self-destruct. I thought about that in the summer of 1994, when a group of mathematicians and computer scientists held a conference at the Santa Fe Institute on “Limits to Scientific Knowledge.” The consensus was that since the human mind is, after all, an entirely physical apparatus, a form of computer, the product of a particular genetic history, it is finite in its capabilities. Being finite, hardwired, it will probably never have the power to comprehend human existence in any complete way. It would be as if a group of dogs were to call a conference to try to understand The Dog. They could try as hard as they wanted, but they wouldn’t get very far. Dogs can communicate only about forty notions, all of them primitive, and they can’t record anything. The project would be doomed from the start. The human brain is far superior to the dog’s, but it is limited nonetheless. So any hope of human beings arriving at some final, complete, self-enclosed theory of human existence is doomed, too.
This, science’s Ultimate Skepticism, has been spreading ever since then. Over the past two years even Darwinism, a sacred tenet among American scientists for the past seventy years, has been beset by … doubts. Scientists—not religiosi—notably the mathematician David Berlinski (“The Deniable Darwin,” Commentary, June 1996) and the biochemist Michael Behe (Darwin’s Black Box, 1996) have begun attacking Darwinism as a mere theory, not a scientific discovery, a theory woefully unsupported by fossil evidence and featuring, at the core of its logic, sheer mush. (Dennett and Dawkins, for whom Darwin is the Only Begotten, the Messiah, are already screaming. They’re beside themselves, utterly apoplectic. Wilson, the giant, keeping his cool, has remained above the battle.) Noam Chomsky has made things worse by pointing out that there is nothing even in the highest apes remotely comparable to human speech, which is in turn the basis of recorded memory and, therefore, everything from skyscrapers and missions to the moon to Islam and little matters such as the theory of evolution. He says it’s not that there is a missing link; there is nothing to link up with. By 1990 the physicist Petr Beckmann of the University of Colorado had already begun going after Einstein. He greatly admired Einstein for his famous equation of matter and energy, E=mc², but called his theory of relativity mostly absurd and grotesquely untestable. Beckmann died in 1993. His Fool Killer’s cudgel has been taken up by Howard Hayden of the University of Connecticut, who has many admirers among the upcoming generation of Ultimately Skeptical young physicists. The scorn the new breed heaps upon quantum mechanics (“has no real-world applications” … “depends entirely on goofball equations”), Unified Field Theory (“Nobel worm bait”), and the Big Bang Theory (“creationism for nerds”) has become withering. If only Nietzsche were alive! He would have relished every minute of it!
Recently I happened to be talking to a prominent California geologist, and she told me: “When I first went into geology, we all thought that in science you create a solid layer of findings, through experiment and careful investigation, and then you add a second layer, like a second layer of bricks, all very carefully, and so on. Occasionally some adventurous scientist stacks the bricks up in towers, and these towers turn out to be insubstantial and they get torn down, and you proceed again with the careful layers. But we now realize that the very first layers aren’t even resting on solid ground. They are balanced on bubbles, on concepts that are full of air, and those bubbles are being burst today, one after the other.”