Authors: Sebastian Seung
The Serenity Prayer, quoted at the beginning of Chapter 2, echoes the sentiments of an older rhyme:
For every ailment under the sun
There is a remedy, or there is none;
If there be one, try to find it;
If there be none, never mind it.
That kind of mixed message is also on display in the self-help section of your local bookstore. Browse for a few minutes and you'll come across many books that don't tell you how to change; instead, they teach resignation. If you're persuaded that you can't possibly change your spouse, you may stop nagging and learn to be happy with your marriage. If you believe that your weight is genetically determined, you may cease dieting and enjoy eating once again. On the other end of the spectrum, diet books like I Can Make You Thin and Master Your Metabolism are titled to inspire optimism about losing weight. In his guide to self-help books, What You Can Change and What You Can't, the psychologist Martin Seligman lays out the empirical evidence for pessimism. Only 5 or 10 percent of people actually achieve long-term weight loss by dieting. That's a depressingly low number.
So is change really possible? The twin studies showed that genes may influence human behavior but do not completely determine it. Nevertheless, another type of determinism has emerged, this one based on the brain, and almost as pessimistic. “Johnny's just that way; he's wired differently,” you hear people say. Such connectome determinism denies the possibility of significant personal change after childhood. The idea is that connectomes may start out malleable but become fixed by adulthood, in line with the old Jesuit saying, “Give me the child until he is seven and I'll give you the man.”
The most obvious implication of connectome determinism is that changing people should be easiest in the first years of life. The construction of a brain is a long and complex process. Surely it's more effective to intervene during the early stages of construction, rather than later on. While a house is being built, it's relatively easy to deviate from the architect's original blueprint. But as anyone who has remodeled a house knows, it's much harder to make major changes after the house is finished. If you've tried to learn a foreign language as an adult, you may have found it a struggle. Even if you were successful, you probably didn't end up sounding like a native speaker. Since children seem to learn second languages effortlessly, their brains appear to be more malleable. But does this idea really generalize to mental abilities other than language?
In 1997, then-First Lady Hillary Clinton hosted a conference at the White House entitled “What New Research on the Brain Tells Us about Our Youngest Children.” Enthusiasts of the “zero-to-three movement” gathered to hear claims that neuroscience had proven the effectiveness of intervening during the first three years of life. At the conference was the actor and director Rob Reiner, who started the I Am Your Child Foundation, also in 1997. He was beginning to create a series of educational videos for parents about the principles of childrearing. The inaugural title was “The First Years Last Forever,” which sounded ominously deterministic.
Actually, neuroscience has been unable to confirm or deny such claims, because it's been difficult to identify exactly what changes in the brain cause learning. Could the zero-to-three movement base its claims of determinism on the neo-phrenological theory that learning is caused by synapse creation? (Let's ignore the considerable evidence against this theory, for the sake of argument.) The answer would be yes if synapse creation were impossible in adults. But William Greenough and other researchers showed that connection number still increases even when adult rats are placed in enriched cages. The rate was slower than in young rats, but still substantial. And remember the MRI studies of the cortex in people learning to juggle? Thickening occurred in the elderly as well as young adults.
Finally, watching synapses through a microscope has shown that reconnection still continues in the brains of adult rats, as mentioned previously. Neuroscientists have not demonstrated a drop in reconnection with age as dramatic as the decrease in language-learning ability. Therefore, the first form of connectome determinism, “reconnection denial,” does not seem tenable.
A second form has emerged, however: “rewiring denial.” The “wires” of the brain are laid down in early life, as neurons extend axons and dendrites. Retraction of branches also occurs during development. Using microscopy, researchers have been able to capture videos of these remarkable processes.
Often the tip of an axon makes a synapse onto a dendrite, gripping it as if the synapse were a hand. The creation of such a synapse appears to stimulate the axon to grow further, but if the synapse is eliminated, the axon loses its hold and retracts. In general, it seems that axonal branches can't be stable unless they make synapses. Although growth and retraction are highly dynamic in the young brain, rewiring deniers believe that they grind to a halt in the adult brain. The wires can be reconnected in new ways by synapses, and synapses can be reweighted by changing their strengths, but the wires themselves are fixed.
Rewiring is hotly debated because of its suspected role in remapping, the dramatic changes in function observed after brain injury or amputation. To understand the importance of rewiring, we need to revisit a more fundamental question: What defines the function of a brain region?
The whole notion of a brain region with a well-defined function implicitly depends on an empirical fact. Through measurements of neural spiking, it has been shown that neurons near each other in the brain (neighboring cell bodies) tend to have similar functions. One can imagine a different kind of brain in which neurons are chaotically scattered without any regard for their functions. It wouldn't make sense to divide such a brain into regions.
But why do the neurons in a region have similar functions? One reason is that most connections in the brain are between nearby neurons.
This means neurons in a region “listen” mainly to each other, so we'd expect them to have similar functions, much as we'd expect less diversity of opinions among a group of people who mainly keep to themselves. This is part of the story, but not all of it.
The brain also contains some connections between distant neurons. In effect, neurons in the same region “listen” to neurons in other regions as well as each other. Couldn't these faraway sources of input lead to diversity? Indeed they could if they were distributed all over the brain, but in fact they are typically confined to a limited number of regions. Returning to the social analogy, you could imagine a brain region as a group of people who listen to the outside world a bit, but only by reading the same newspapers and watching the same television shows. These external influences are so narrow that they don't lead to diversity either.
Why are long-range connections constrained in this way? The answer has to do with the organization of brain wiring. Most pairs of regions lack axons running between them, so their neurons have no way of connecting with each other. In other words, any given region is wired to a limited set of source and target regions. This set has been called a “connectional fingerprint,” as it appears to be unique for each region. The fingerprint is often highly informative about the region's function. For example, the reason that Brodmann area 3 mediates bodily sensations, a function I mentioned earlier, is that this area is wired to pathways bringing touch, temperature, and pain
signals from the spinal cord. Similarly, the reason that Brodmann area 4 controls movements of the body is that this area sends many axons to the spinal cord, which in turn is wired to the muscles of the body.
These examples suggest that a region's function depends greatly on its wiring with other regions. If that's true, altering the wiring could change the function. Remarkably, this principle has been demonstrated by “rewiring” a nominally auditory area of the cortex to serve the function of vision. The first step was taken in 1973 by Gerald Schneider,
who discovered an ingenious method to reroute axons growing in the brains of newborn hamsters. By damaging certain brain regions, he diverted retinal axons from their normal target in a visual pathway to an alternative destination in an auditory pathway. This had the effect of sending visual signals to a cortical area that is normally auditory.
The functional consequences of this rewiring were investigated in the 1990s by Mriganka Sur and his collaborators. After repeating Schneider's procedure in ferrets, they showed that neurons in the auditory cortex now responded to visual stimulation. Furthermore, the ferrets could still see even after the visual cortex
was disabled, presumably by using their auditory cortex. Both pieces of evidence implied that the auditory cortex had changed its function to be visual. Similar “cross-modal” plasticity has also been observed in humans. For example, in those who are blind from an early age, the visual cortex is activated when they read Braille
with their fingertips.
Such findings are consistent with Lashley's doctrine of equipotentiality, but they suggest an important qualification: A cortical area indeed has the potential to learn any function, but only if the necessary wiring with other brain regions exists. If every area in the cortex were wired to every other area (and to all other regions outside the cortex), then equipotentiality might hold without any provisos. Wouldn't the brain be far more versatile and resilient if its wiring were “all to all”? Maybe so, but it would also swell to gigantic proportions. All those wires take up space, as well as consume energy. The brain has evidently evolved to economize, which is why the wiring between regions is selective.
The Schneider and Sur experiments induced young brains to wire up differently. What about the adult brain? If the wiring between regions becomes fixed in adulthood, that would constrain the potential for change. Conversely, if the adult brain could rewire, it would have more potential to recover from injury or disease. This is why researchers so badly want to know whether rewiring is possible in adulthood, and to find therapies that promote it.
In 1970, a thirteen-year-old girl came to the attention of social workers in Los Angeles. She was mute, disturbed, and severely underdeveloped. Genie (a pseudonym) had been a victim of terrible abuse. She had spent her entire life in isolation, tied up or otherwise confined to a single room by her father. Her case aroused great public attention and sympathy. Doctors and researchers hoped that she could recover from her traumatic childhood, and they resolved to help her learn language and other social behaviors.
Coincidentally, 1970 also saw the premiere of François Truffaut's film L'Enfant Sauvage, about the Wild Boy of Aveyron. Victor was discovered around 1800 wandering naked and alone in the woods of France. Efforts were made to “civilize” him, but he never learned to speak more than a few words. History has recorded other examples of so-called feral children, who grew up lacking exposure to human love and affection. No feral child was ever able to learn language.
Cases like Victor's suggested the existence of a critical period for the learning of language and social behaviors. Deprived of the opportunity to learn during the critical period, feral children could not learn these behaviors later on. In metaphorical terms, the door to learning hangs open during the critical period; then it swings shut and locks. While this interpretation is plausible, too little is known about feral children for it to be scientifically rigorous.
When Genie was found, researchers hoped that her case might overturn the theory of the critical period. They resolved to study Genie and rehabilitate her at the same time. She made some encouraging progress in learning language, but eventually funding for the research dried up. Then Genie's life took a tragic turn
as she passed through a series of foster homes and seemed to regress.
Around the time the research ended, scientific papers reported that Genie was still learning new words but was struggling with syntax. According to later popular accounts, the researchers became discouraged, predicting that she would never learn real sentence structure.
Whether Genie would have progressed further will never be known. She provided some evidence for a critical period in language learning, but it is difficult to draw firm scientific conclusions, however heartbreaking and gripping her case may be.
Optometrists encounter less harrowing forms of deprivation all the time. Weak vision in one eye often goes unnoticed if the other eye provides clear sight. Wearing eyeglasses or having a cataract removed easily corrects the problem with the eye. Nevertheless, the patient may still not see clearly with the corrected eye, or be stereo-blind, because there is still something wrong with the brain. (At a movie theater you've probably tried 3D glasses, which give a sensation of depth by presenting slightly different images to the two eyes. Those who can't perceive 3D in this way are said to be stereo-blind.) The condition, known as amblyopia to specialists, is nicknamed “lazy eye,” but the disorder involves the brain as well as the eye.
Amblyopia suggests that we are not simply born with the ability to see; we must also learn from experience, and there is a critical period for this process. If the brain is deprived of normal visual stimulation from one eye during this limited time window, it does not develop normally. The effect is irreversible in adulthood. Children, however, recover normal vision if amblyopia is detected and treated early; their brains are still malleable. On the flip side, if an adult develops poor vision in a single eye, it has no lasting effect on the brain. Correcting the eye produces full recovery.
Amblyopia seems to support the claim made in the title of Rob Reiner's video, The First Years Last Forever. Early intervention is crucial, as the zero-to-three movement contends. Amblyopia treatments suggest that the brain becomes less malleable after the critical period. But can that be shown directly by neuroscience? How exactly do poor vision and corrected vision change the brain during the critical period, and why don't these changes happen later on?