He actually said as much a quarter of a century ago in the opening paragraph of Sociobiology’s incendiary Chapter 27. The humanities and social sciences, he said, would “shrink to specialized branches of biology.” Such venerable genres as history, biography, and the novel would become “the research protocols,” i.e., preliminary reports of the study of human evolution. Anthropology and sociology would disappear as separate disciplines and be subsumed by “the sociobiology of a single primate species,” Homo sapiens. There was so much else in Chapter 27 to outrage the conventional wisdom of the Goulds and the Lewontins of the academic world that they didn’t pay much attention to this convergence of all human disciplines and literary pursuits.
But in 1998 Wilson spelled it out at length and so clearly that no one inside or outside of academia could fail to get the point. He published an entire book on the subject, Consilience, which immediately became a bestseller despite the theoretical nature of the material. The term “consilience” was an obsolete word referring to the coming together, the confluence, of different branches of knowledge.
The ruckus Consilience kicked up spread far beyond the fields of biology and evolutionism. Consilience was a stick in the eye of every novelist, every historian, every biographer, every social scientist—every intellectual of any stripe, come to think of it. They were all about to be downsized, if not terminated, in a vast intellectual merger. The counterattack began. Jeremy Bernstein, writing in Commentary, drew first blood with a review titled “E. O. Wilson’s Theory of Everything.” It began: “It is not uncommon for people approaching the outer shores of middle age to go slightly dotty.” Oh Lord, another theory of everything from the dotty professor. This became an intellectual drumbeat—“just another theory of everything”—and Wilson saw himself tried and hanged on a charge of hubris.
As for me, despite the prospect of becoming a mere research protocol drudge for evolutionism, I am willing to wait for the evidence. I am skeptical, but like Wilson, I am willing to wait. If Wilson is right, what interests me is not so much what happens when all knowledge flows together as what people will do with it once every nanometer and every action and reaction of the human brain has been calibrated and made manifest in predictable statistical formulas. I can’t help thinking of our children of the dawn, the art students we last saw in the Suntory Museum, Osaka, Japan. Not only will they be able to morph illustrations on the digital computer, they will also be able to predict, with breathtaking accuracy, the effect that certain types of illustrations will have on certain types of brains. But, of course, the illustrators’ targets will be able to dial up the same formulas and information and diagnose the effect that any illustration, any commercial, any speech, any flirtation, any bill, any coo has been crafted to produce. Life will become one incessant, colossal round of the match game or liar’s poker or one-finger-two-finger or rock-paper-scissors. Something tells me, mere research protocol drudge though I may be, that I will love it all, cherish it, press it to my bosom. For I already have a working title, The Human Comedy, and I promise you, you will laugh your head off … your head and that damnable, unfathomable chemical analog computer inside of it, too.
Being a bit behind the curve, I had only just heard of the digital revolution when Louis Rossetto, co-founder of Wired magazine, wearing a shirt with no collar and his hair as long as Felix Mendelssohn’s, looking every inch the young California visionary, gave a speech before the Cato Institute announcing the dawn of the twenty-first century’s digital civilization. As his text, he chose Teilhard de Chardin’s prediction fifty years ago that radio, television, and computers would create a “noosphere,” an electronic membrane covering the earth and wiring all humanity together in a single nervous system. Geographic locations, national boundaries, the old notions of markets and political processes—all would become irrelevant. With the Internet spreading over the globe at an astonishing pace, said Rossetto, that marvelous modem-driven moment is almost at hand.
Could be. But something tells me that within ten years, by 2010, the entire digital universe is going to seem like pretty mundane stuff compared to a new technology that right now is but a mere glow radiating
from a tiny number of American and Cuban (yes, Cuban) hospitals and laboratories. It is called brain imaging, and anyone who cares to get up early and catch a truly blinding twenty-first-century dawn will want to keep an eye on it.
Brain imaging refers to techniques for watching the human brain as it functions, in real time. The most advanced forms currently are three-dimensional electroencephalography using mathematical models; the more familiar PET scan (positron-emission tomography); the new fMRI (functional magnetic resonance imaging), which shows brain bloodflow patterns, and MRS (magnetic resonance spectroscopy), which measures biochemical changes in the brain; and the even newer PET reporter gene/PET reporter probe, which is, in fact, so new that it still has that length of heavy lumber for a name. Used so far only in animals and a few desperately sick children, the PET reporter gene/PET reporter probe pinpoints and follows the activity of specific genes. On a scanner screen you can actually see the genes light up inside the brain.
By the standards of the year 2000, these are sophisticated devices. Ten years from now, however, they may seem primitive compared to the stunning new windows into the brain that will have been developed.
Brain imaging was invented for medical diagnosis. But its far greater importance is that it may very well confirm, in ways too precise to be disputed, current neuroscientific theories about “the mind,” “the self,” “the soul,” and “free will.” Granted, all those skeptical quotation marks are enough to put anybody on the qui vive right away, but Ultimate Skepticism is part of the brilliance of the dawn I have promised.
Neuroscience, the science of the brain and the central nervous system, is on the threshold of a unified theory that will have an impact as powerful as that of Darwinism a hundred years ago. Already there is a new Darwin, or perhaps I should say an updated Darwin, since no one ever believed more religiously in Darwin the First than does he: Edward O. Wilson.
As we have seen, Wilson has created and named the new field of sociobiology, and he has compressed its underlying premise into a single sentence. Every human brain, he says, is born not as a blank tablet (a tabula rasa) waiting to be filled in by experience but as “an exposed negative waiting to be slipped into developer fluid.” (See page 81, above.) You can develop the negative well or you can develop it poorly, but either way you are going to get precious little that is not already imprinted on the film. The print is the individual’s genetic history, over thousands of years of evolution, and there is not much anybody can do about it. Furthermore, says Wilson, genetics determine not only things such as temperament, role preferences, emotional responses, and levels of aggression but also many of our most revered moral choices, which are not choices at all in any free-will sense but tendencies imprinted in the hypothalamus and limbic regions of the brain, a concept expanded upon in 1993 in a much-talked-about book, The Moral Sense, by James Q. Wilson (no kin to Edward O.).
This, the neuroscientific view of life, has become the strategic high ground in the academic world, and the battle for it has already spread well beyond the scientific disciplines and, for that matter, out into the general public. Both liberals and conservatives without a scientific bone in their bodies are busy trying to seize the terrain. The gay rights movement, for example, has fastened onto a study, published in July 1993 by the highly respected Dean Hamer of the National Institutes of Health, announcing the discovery of “the gay gene.” Obviously, if homosexuality is a genetically determined trait, like left-handedness or hazel eyes, then laws and sanctions against it are attempts to legislate against Nature. Conservatives, meantime, have fastened upon studies indicating that men’s and women’s brains are wired so differently, thanks to the long haul of evolution, that feminist attempts to open up traditionally male roles to women are the same thing: a doomed violation of Nature.
Wilson himself has wound up in deep water on this score; or cold water, if one need edit. In his personal life Wilson is a conventional liberal, PC, as the saying goes—he is, after all, a member of the Harvard
faculty—concerned about environmental issues and all the usual things. But he has said that “forcing similar role identities” on both men and women “flies in the face of thousands of years in which mammals demonstrated a strong tendency for sexual division of labor. Since this division of labor is persistent from hunter-gatherer through agricultural and industrial societies, it suggests a genetic origin. We do not know when this trait evolved in human evolution or how resistant it is to the continuing and justified pressures for human rights.”
“Resistant” was Darwin II, the neuroscientist, speaking. “Justified” was the PC Harvard liberal. He was not PC or liberal enough. As we have already seen, protesters invaded the annual meeting of the American Association for the Advancement of Science, where Wilson was appearing, dumped a pitcher of ice water, cubes and all, over his head, and began chanting, “You’re all wet! You’re all wet!” The most prominent feminist in America, Gloria Steinem, went on television and, in an interview with John Stossel of ABC, insisted that studies of genetic differences between male and female nervous systems should cease forthwith.
But that turned out to be mild stuff in the current political panic over neuroscience. In February 1992, Frederick K. Goodwin, a renowned psychiatrist, head of the federal Alcohol, Drug Abuse, and Mental Health Administration, and a certified yokel in the field of public relations, made the mistake of describing, at a public meeting in Washington, the National Institute of Mental Health’s ten-year-old Violence Initiative. This was an experimental program whose hypothesis was that, as among monkeys in the jungle—Goodwin was noted for his monkey studies—much of the criminal mayhem in the United States was caused by a relatively few young males who were genetically predisposed to it; who were hardwired for violent crime, in short. Out in the jungle, among mankind’s closest animal relatives, the chimpanzees, it seemed that a handful of genetically twisted young males were the ones who committed practically all the wanton murders of other males and the physical abuse of females. What if the same were true among human beings? What if, in any given community, it turned out to be a handful of young males with toxic DNA who were pushing statistics for violent crime up to such high levels? The Violence Initiative envisioned identifying these individuals in childhood, somehow, some way, someday, and treating them therapeutically with drugs. The notion that crime-ridden urban America was a “jungle,” said Goodwin, was perhaps more than just a tired old metaphor.
That did it. That may have been the stupidest single word uttered by an American public official in the year 1992. The outcry was immediate. Senator Edward Kennedy of Massachusetts and Representative John Dingell of Michigan (who, it became obvious later, suffered from hydrophobia when it came to science projects) not only condemned Goodwin’s remarks as racist but also delivered their scientific verdict: Research among primates “is a preposterous basis” for analyzing anything as complex as “the crime and violence that plagues our country today.” (This came as surprising news to NASA scientists who had first trained and sent a chimpanzee called Ham up on top of a Redstone rocket into suborbital space flight and then trained and sent another one, called Enos, which is Hebrew for “man,” up on an Atlas rocket and around the earth in orbital space flight and had thereby accurately and completely predicted the physical, psychological, and task-motor responses of the human astronauts Alan Shepard and John Glenn, who repeated the chimpanzees’ flights and tasks months later.) The Violence Initiative was compared to Nazi eugenic proposals for the extermination of undesirables. Dingell’s Michigan colleague, Representative John Conyers, then chairman of the Government Operations Committee and senior member of the Congressional Black Caucus, demanded Goodwin’s resignation—and got it two days later, whereupon the government, with the Department of Health and Human Services now doing the talking, denied that the Violence Initiative had ever existed. It disappeared down the memory hole, to use Orwell’s term.
A conference of criminologists and other academics interested in the neuroscientific studies done so far for the Violence Initiative—a conference underwritten in part by a grant from the National Institutes of Health—had been scheduled for May 1993 at the University of Maryland. Down went the conference, too; the NIH drowned it like a kitten. A University of Maryland legal scholar named David Wasserman tried to reassemble the troops on the Q.T., as it were, in a hall all but hidden from human purview in a hamlet called Queenstown in the foggy, boggy boondocks of Queen Anne’s County on Maryland’s Eastern Shore. (The Clinton administration tucked Elián González away in this same county while waiting for the Cuban-American vote to chill before the Feds handed the boy over to Fidel Castro.) The NIH, proving it was a hard learner, quietly provided $133,000 for the event, but only after Wasserman promised to fireproof the proceedings by also inviting scholars who rejected the notion of a possible genetic genesis of crime and scheduling a cold-shower session dwelling on the evils of the eugenics movement of the early twentieth century. No use, boys! An army of protesters found the poor cringing devils anyway and stormed into the auditorium chanting, “Maryland conference, you can’t hide—we know you’re pushing genocide!” It took two hours for them to get bored enough to leave, and the conference ended in a complete puddle, with the specially recruited fireproofing PC faction issuing a statement that said: “Scientists as well as historians and sociologists must not allow themselves to provide academic respectability for racist pseudoscience.” Today, at the NIH, the term Violence Initiative is a synonym for taboo. The present moment resembles that moment in the Middle Ages when the Catholic Church forbade the dissection of human bodies, for fear that what was discovered inside might cast doubt on the Christian doctrine that God created man in his own image.
Even more radioactive is the matter of intelligence, as measured by IQ tests. Privately—not many care to speak out—the vast majority of neuroscientists believe the genetic component of an individual’s intelligence is remarkably high. Your intelligence can be improved upon, by skilled and devoted mentors, or it can be held back by a poor upbringing—i.e., the negative can be well developed or poorly developed—but your genes are what really make the difference. The recent ruckus over Charles Murray and Richard Herrnstein’s The Bell Curve is probably just the beginning of the bitterness the subject is going to create.
Not long ago, according to two neuroscientists I interviewed, a firm called Neurometrics sought out investors and tried to market an amazing but simple invention known as the IQ Cap. The idea was to provide a way of testing intelligence that would be free of “cultural bias,” one that would not force anyone to deal with words or concepts that might be familiar to people from one culture but not to people from another. The IQ Cap recorded only brain waves; and a computer, not a potentially biased human test-giver, analyzed the results. It was based on the work of neuroscientists such as E. Roy John, who is now one of the major pioneers of electroencephalographic brain imaging; Duilio Giannitrapani, author of The Electrophysiology of Intellectual Functions; and David Robinson, author of The Wechsler Adult Intelligence Scale and Personality Assessment: Toward a Biologically Based Theory of Intelligence and Cognition and many other monographs famous among neuroscientists. I spoke to one researcher who had devised an IQ Cap himself by replicating an experiment described by Giannitrapani in The Electrophysiology of Intellectual Functions. It was not a complicated process. You attached sixteen electrodes to the scalp of the person you wanted to test. You had to muss up his hair a little, but you didn’t have to cut it, much less shave it. Then you had him stare at a marker on a blank wall. This particular researcher used a raspberry-red thumbtack. Then you pushed a toggle switch. In sixteen seconds the Cap’s computer box gave you an accurate prediction (within one-half of a standard deviation) of what the subject would score on all eleven subtests of the Wechsler Adult Intelligence Scale or, in the case of children, the Wechsler Intelligence Scale for Children—all from sixteen seconds’ worth of brain waves. There was nothing culturally biased about the test whatsoever. What could be cultural about staring at a thumbtack on a wall? The savings in time and money were breathtaking. The conventional IQ test took two hours to complete; and the overhead, in terms of paying test-givers, test-scorers, test-preparers, and the rent, was $100 an hour at the very least. The IQ Cap required about fifteen minutes and sixteen seconds—it took about fifteen minutes to put the electrodes on the scalp—and about a tenth of a penny’s worth of electricity. Neurometrics’s investors were rubbing their hands and licking their chops. They were about to make a killing.