The tendency to miss the added information provided by the flower worn by the newcomer is illustrative of a phenomenon called blocking. It’s a phenomenon discovered by Leon Kamin, a psychologist at Princeton University.²⁶
The term “blocking” refers to the fact that new conditioned stimuli don’t promote learning when they’re added to other conditioned stimuli that have already been associated with an unconditioned stimulus. The new conditioned stimuli are blocked.
Blocking, like so many phenomena described in this book, can be understood in terms of inner agents competing and cooperating with one another. When conditioned stimuli are just being learned, there’s plenty of room for new connections (cooperation). Later, when those connections have become entrenched, the network is less plastic; competition rears its head. New information is harder to integrate because potential niches have been spoken for. The agents within those niches have become so well adapted that they strongly resist challenges to their domain. Agents attempting to enter the niches (speaking metaphorically, of course) have little chance of doing so. They are blocked.²⁷
Blocking can be viewed as a phenomenon of late learning, a phenomenon that suggests little openness to new information. Early learning and mid-course learning reveal greater openness to new information, as captured by another principle of conditioning, the Rescorla-Wagner model. According to this model, the strength of a conditioned response grows over time, but the rate of strengthening decreases as learning continues (Figure 11). The model was proposed by Robert Rescorla of the University of Pennsylvania and Allan Wagner of Yale University.²⁸
FIGURE 11. Strength of associations for conditioned stimuli as a function of number of learning trials according to the Rescorla-Wagner model.
How can the Rescorla-Wagner model be explained? Suppose there is some number of neural connections that need to be formed for a classical conditioning task to be fully acquired. Early in learning, the chance of forming a connection is high because there are many available slots, but as learning continues, the chance of forming a connection gets smaller because fewer slots are open. If each new connection strengthens the response by some amount, the increase in response strength should decrease as fewer connections can be made.
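The shape the model predicts is easy to see in a few lines of arithmetic. Here is a minimal sketch of the standard Rescorla-Wagner update rule, in which the change in associative strength on each trial is a fixed fraction of the distance still remaining to the asymptote; the particular parameter values are arbitrary choices for illustration.

```python
# A minimal simulation of the Rescorla-Wagner learning rule.
# On each trial, associative strength V moves a fixed fraction of the way
# toward the asymptote LAMBDA, so the increments shrink as learning proceeds.

ALPHA_BETA = 0.3   # combined salience/learning-rate parameter (illustrative value)
LAMBDA = 1.0       # maximum associative strength the unconditioned stimulus supports

def rescorla_wagner(n_trials: int) -> list[float]:
    """Return the associative strength after each of n_trials conditioning trials."""
    strength = 0.0
    history = []
    for _ in range(n_trials):
        strength += ALPHA_BETA * (LAMBDA - strength)  # gain is proportional to what is left to learn
        history.append(strength)
    return history

if __name__ == "__main__":
    for trial, v in enumerate(rescorla_wagner(10), start=1):
        print(f"trial {trial:2d}: V = {v:.3f}")
```

Running the sketch prints a strength that jumps quickly at first and then creeps toward the asymptote, the negatively accelerated curve of Figure 11. It also hints at why blocking occurs: once the strength is already near the asymptote, there is almost nothing left for a newly added stimulus to gain.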
The Rescorla-Wagner model is visually described by a curve that rises steeply at first and then levels off (again see Figure 11). This same shape arises in other learning contexts, a fact that may be taken to suggest that the same dynamic underlies them as well. Not surprisingly, I think that dynamic is the one captured by the jungle principle.
Consider another context in which learning more and more is associated with gaining less and less. The context is skill learning. There, it turns out, the speed of performing a task increases a lot in the early phase of practice but then increases by smaller and smaller amounts as practice continues (Figure 12). Said another way, focusing not on the speed of performing a task but on its inverse, the time for task performance: the time to perform a task decreases a lot at first but then decreases less and less as practice goes on.
FIGURE 12. Performance speed (left panel) and, equivalently, performance time (right panel) as a function of amount of practice.
This relation was made famous in a study of the time taken by factory workers to roll and place cigars in cigar boxes.²⁹
The time to complete the cigar-rolling task dropped precipitously at first but then decreased at a lower and lower rate as the workers continued to perform the task over and over again.
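The steep-then-shallow improvement seen in such data is commonly summarized as a power function of the amount of practice, with completion time approaching a floor that further practice cannot remove. The sketch below illustrates only that general functional form; the parameter values are invented for the example, not taken from the cigar study.

```python
# Illustrative power-law-of-practice curve: task time falls steeply at first,
# then by smaller and smaller amounts as practice accumulates.
# The parameter values are made up for illustration, not drawn from any study.

A = 2.0    # asymptotic time that cannot be practiced away (seconds)
B = 8.0    # practicable portion of the initial time (seconds)
C = 0.5    # learning-rate exponent

def task_time(trial: int) -> float:
    """Predicted completion time on a given practice trial (1-indexed)."""
    return A + B * trial ** (-C)

if __name__ == "__main__":
    for n in (1, 2, 5, 10, 100, 1000, 10000):
        print(f"after {n:>6} trials: {task_time(n):5.2f} s  "
              f"(speed = {1 / task_time(n):.3f} tasks/s)")
```

Reading the printed numbers as times gives the falling, flattening curve in the right panel of Figure 12; taking their reciprocals, as the last column does, gives the rising, flattening speed curve in the left panel.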
This same pattern has been observed in many other tasks, including elementary button-pressing tasks,³⁰ reading inverted and reversed text,³¹ justifying mathematical proofs,³² and writing books by a prolific author (Isaac Asimov).³³
The curves posited in the Power Law of Learning and the Rescorla-Wagner model look similar. Given the resemblance of the curves, it’s tempting to think they might be rooted in the same neural changes. I think they are. The idea, consistent with the jungle principle, is that early in learning, the chance of forming a connection is high since there are many available slots, but as learning continues, the chance of forming a connection gets smaller, owing to the smaller number of slots still available. If each new connection strengthens the response, the increase in response strength is large at first but gets smaller as fewer connections can be made. Similarly, if speed of performance is related to response strength, then performance speed increases a lot at first but then increases at a declining rate as practice goes on, for the same reason.³⁴,³⁵
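A toy simulation makes the slot idea concrete. In the sketch below, each practice trial tries to claim one randomly chosen connection slot from a fixed pool; the pool size and trial count are arbitrary, and the model is only a cartoon of the jungle principle, not an established account.

```python
import random

# Toy simulation of the "limited slots" idea: each practice trial tries to form
# one new connection in a randomly chosen slot. Early on most slots are empty,
# so connections form often; later most slots are taken, so gains slow down.

N_SLOTS = 50        # total connections the skill could ever recruit (illustrative)
N_TRIALS = 200      # number of practice trials to simulate

def simulate(seed: int = 0) -> list[int]:
    """Return the number of formed connections after each trial."""
    rng = random.Random(seed)
    slots = [False] * N_SLOTS
    filled_history = []
    for _ in range(N_TRIALS):
        candidate = rng.randrange(N_SLOTS)   # try to claim one slot at random
        slots[candidate] = True              # no effect if it was already claimed
        filled_history.append(sum(slots))
    return filled_history

if __name__ == "__main__":
    history = simulate()
    for trial in (1, 5, 10, 25, 50, 100, 200):
        print(f"trial {trial:3d}: {history[trial - 1]:2d}/{N_SLOTS} connections formed")
```

Early trials almost always find an empty slot, so the count of formed connections climbs rapidly; later trials mostly land on slots already taken, so the curve flattens, reproducing the negatively accelerated shape shared by Figures 11 and 12.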
In the last section, I expressed interest in the fact that the form of the relation between response strength and amount of practice is similar to the relation between response speed and amount of practice. In both cases, the curves are steep at first and then flatten out. In this section, I focus on the fact that forgetting behaves in much the same way.
Recall from earlier in this chapter that the longer ago someone graduated from high school, the smaller the number of classmates he or she could recall. The function relating number of names recalled to time since graduation plummets at first and then levels off. Again, there is rapid initial change followed by smaller and smaller change. This pattern has been observed so often that psychologists speak of it as “the forgetting curve” (Figure 13).
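One simple way to describe such a curve is as a rapid decay toward a long-term residue. The sketch below assumes an exponential form and made-up parameter values purely for illustration; it is not fitted to the high-school reunion data.

```python
import math

# Illustrative forgetting curve: retention drops quickly right after learning,
# then levels off. The exponential form and its parameters are only one common
# way to approximate the curve's shape; the numbers here are invented.

INITIAL_RETENTION = 1.0   # proportion retained immediately after learning
FLOOR = 0.2               # assumed long-term residue that resists further loss
DECAY_TIME = 3.0          # characteristic decay time (in, say, years)

def retention(t: float) -> float:
    """Proportion of material still retrievable t time units after learning."""
    return FLOOR + (INITIAL_RETENTION - FLOOR) * math.exp(-t / DECAY_TIME)

if __name__ == "__main__":
    for years in (0, 1, 2, 5, 10, 25, 50):
        print(f"{years:2d} years out: {retention(years):.0%} retained, "
              f"{1 - retention(years):.0%} forgotten")
```

The amount forgotten is simply the complement of the amount retained, which is why the two panels of Figure 13 convey the same information.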
Why does the forgetting curve have the form it does? Might it be that the sheer passage of time accounts for it? This is doubtful, because time itself doesn’t account for degradation. In physics, with its famous radioactive decay, it is particles fleeing the nuclear coop, not time per se, that account for time-related decay. In biology, where rotting is a fact of life (or its aftermath), bacterial infestation does the dirty work; time per se doesn’t. In both domains, things fall apart because of processes occurring in time.
In psychology, experiments have likewise shown that time per se does not account for forgetting. In one experiment, students at the University of Michigan were asked to learn a list of words.³⁶
After being exposed to the words, some participants spent several seconds listening for a tone, tapping a key whenever they heard it. Other participants, after being exposed to the same words, spent the same amount of time listening for a syllable that sounded like one or more of the words in the to-be-remembered list. The syllable-monitoring participants were likewise supposed to tap a key when they heard the target. When both groups were later asked to recall the words, those in the tone-monitoring condition did much better than those in the syllable-monitoring condition. The same amount of time passed in both cases, so it was not the sheer passage of time that predicted forgetting. Rather, it was what happened during the retention interval.
FIGURE 13. Amount forgotten (left panel) and, equivalently, amount retained (right panel) as a function of time since learning.
The same result was reached in another study, conducted in the 1920s.³⁷
Participants were presented with a list of words, either at the beginning of a normal, busy day or before going to sleep. The subjects retained far more words if they learned the words before going to sleep than before going to work. The amount of time between study and test was the same in the two conditions, so once again it wasn’t the sheer passage of time that predicted forgetting. What occurred during the retention interval determined how much was retained. When fewer things happened—in sleep rather than in daily activity—more information could be recalled later.
If the passage of time doesn’t account for forgetting, what does? Radioactive decay and bacterial infestation aren’t plausible answers. The best answer, I believe, is interference.³⁸
Here, competing material crowds out what’s been learned. The more competitive the material, the greater its crowding effect. So in the first study mentioned above, listening for a syllable like one in the list created a more competitive environment for the to-be-remembered words. In the second study, pursuing one’s daily activities created a more competitive environment for the words to be remembered than did sleeping.³⁹
A great deal of additional evidence supports an interference account of forgetting. At the same time, no evidence that I’m aware of contradicts the interference account.⁴⁰
I won’t rehearse all the research here. I’ll just mention one other demonstration of interference that attests to its importance. It is described in the next section. What makes the study noteworthy is that it provides an exception to the rule that recognition is easier than recall.
When you try to recall something as opposed to recognizing it, you’ve got much more to do. Recall requires generating a response—saying the item’s name or typing it or signing it or singing it. By contrast, recognition requires generating a less specific response—saying “Yes, I saw it,” or “No, I didn’t,” for example. As mentioned earlier, recognition is generally better than recall, but can actually be worse when interference is intense.
My favorite demonstration of this outcome is a study in which college students were asked to learn paired associates such as “river-bank.” Later, the students were tested with a cued-recall procedure in one condition or with a recognition procedure in the other. In the cued-recall procedure they were given the first word in a pair and were asked to come up with the second word. When they were cued with “river,” most subjects managed to recall “bank.” In the recognition condition, the students were shown two words and were asked to indicate whether they recognized either one. When the word pair was “river-bank,” they reached high levels of recognition, but when the word pair was “piggy-bank,” recognition failed miserably. Though the word “bank” was in the original list, the meaning of “bank” was different when paired with “piggy” than when paired with “river.” Cuing the wrong sense of “bank” caused participants to miss it. Evidently, the original memory of “bank” was obstructed by the new one. The interference was so strong that recognition failed in this instance.⁴¹
The “river-bank/piggy-bank” study is a clever demonstration of the extent to which conditions of test need to match conditions of study for memory performance to thrive. If conditions of test differ radically from conditions of learning, retrieval can suffer.⁴²