Autopilot

Author: Andrew Smart

But it turns out that the brain is not just sitting there waiting for the next stimulation. Rather, the brain is perpetually and spontaneously active: maintaining, interpreting, responding, and predicting. In fact, the brain uses more energy for spontaneous, intrinsic activity than for accomplishing specific tasks such as multiplying eight and seven, or filling in the cells of a spreadsheet. According to the renowned neuroscientist György Buzsáki, professor at the Rutgers Center for Molecular and Behavioral Neuroscience, most of the brain's activity is generated from within. External inputs usually cause only minor perturbations of the brain's internally controlled program. Make no mistake: external perturbations are critical for the brain to develop normally. No brain can develop in isolation; the brain needs to be “calibrated” to the external world through experience. Nevertheless, the brain as a complex system keeps itself in balance through self-generated patterns. As I mentioned, the concepts behind these insights into brain function come from fields outside psychology and neuroscience, such as complex systems science and physics. We are just beginning to understand what the brain's spontaneous activity really means. We explore the resting brain and its role in creativity in more detail in Chapter 2 and Chapter 6.

What emerges, though, is the idea that perceptions, memories, associations, and thoughts may need a resting mind in order to make their way through our brain and form new connections. Eastern traditions have been aware of this through meditative practices for thousands of years. In Buddhism, monks train to calm their minds. Western society, by contrast, has instilled in us a belief that every moment of every day must be filled with activity. Indeed, it is almost a moral obligation in the US to be as busy as possible. I will try to show that for certain things the brain likes to do, such as coming up with creative “outside of the box” solutions, you may need to be doing very little.

When your brain is bombarded with stimuli like emails, phone calls, text messages, Facebook updates, errands, driving around, talking to your boss, checking your to-do list, etc., it is kept busy responding to what neuroscientist Scott Makeig, director of the Swartz Center for Computational Neuroscience in La Jolla, California, calls “the challenge of the moment.” Clearly, it is very important to be able to respond to the moment. Sometimes our survival depends on the ability to successfully meet this challenge. However, if that moment becomes every minute of every day of every month of every year, your brain has no time left over to make novel connections between seemingly unrelated things, find patterns, and have new ideas. In other words, no time to be creative.

Thinkers such as Bertrand Russell, Rainer Maria Rilke, and Oscar Wilde may have been tapping into something that is only now being revealed by modern neuroscience. Each of these thinkers, and many others, asserted throughout their lives that a person could only reach his or her potential through leisure. That may sound paradoxical; after all, we are taught from a very young age some variation of “the Devil finds work for idle hands.” But given the view of our brains that is emerging from modern neuroscience, it may be no accident that as our working hours increase, our mental well-being and physical health decline.

The human brain is unique in the animal kingdom for its ability to come up with novel solutions to problems. Animals, especially non-human primates, are certainly creative. However, they are only creative within the narrow limits of their own cognitive and perceptual worlds. Humans have invented technology to extend our perception to invisible parts of the electromagnetic spectrum, and soon we may even be able to extend our memory and cognition using neurotechnology. Many neuroscientists argue that humans are unique in the degree to which we are conscious. Humans are the only species that have created a communication system that allows us to create art and acquire complex bodies of knowledge.

We are now using our brains to try to understand our brains. Another unique thing about humans is that we can afford to be lazy because of our technology and culture. We might think that an elephant seal lounging around on a California beach is being lazy. However, nothing could be further from the truth. The seal is preserving precious body fat and energy for when he has to hunt in frigid water or avoid sharks.

How did we become convinced that idleness is evil? Idleness has always been feared in the United States. The Puritans believed that hard work was the only way to serve God. Going back to 16th-century Europe, where Puritanism has its roots, Luther and Calvin both believed that constant work was ordained by God, and they commanded each person to choose a job and work at it “as a sentry post so that he may not heedlessly wander about.” Forced labor was even encouraged for the poor and unemployed as a way to keep them on “the path of righteous living.” During Luther's lifetime, Europe was urbanizing and its population expanding rapidly. This led to overcrowded cities, high unemployment, and inflation. There was an explosion in the number of urban poor in places like London, Venice, and Amsterdam. Unable to grasp macroeconomics, zealots like Luther saw the new urban poor as “indifferent idlers” who should be punished with toil for their original sin of laziness.

We can trace the roots of our current obsession with work and effectiveness to Luther's misperception that poverty is caused by laziness rather than by complex socio-economic circumstances.³ Idleness came to be seen as an evil. If only Luther had been trained as a sociologist, we might have more than two weeks of vacation every year.

The consequences of Luther's rabid anti-idleness philosophy, especially in the United States, are seen in our absurdly short vacations and our compulsive work ethic. (Not that the United States is alone in this obsession; the Japanese have even coined the term “karoshi,” which means “death from overwork.”)

The increase in working hours is also striking given the recent explosion of time management, “get-everything-done-right-now” books and seminars on the market. On Amazon, I counted over ninety-five thousand books on time management. You would need to be very skilled at time management to read all of the time management books on Amazon. Assuming the average length of a book is two hundred pages, that's nineteen million pages of time management material to read. You would have to read about three and a half time management books a day for seventy-two years to get through them all.
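
For anyone who wants to check that back-of-envelope arithmetic, here is a minimal sketch in Python, assuming the figures quoted above (roughly 95,000 books at an average of 200 pages each):

    # Back-of-envelope check of the time-management reading math.
    # Assumed figures from the text: ~95,000 books, ~200 pages each.
    books = 95_000
    pages_per_book = 200
    years = 72

    total_pages = books * pages_per_book      # 19,000,000 pages
    books_per_day = books / (years * 365)     # roughly 3.6 books a day

    print(f"{total_pages:,} pages in total")
    print(f"about {books_per_day:.1f} books a day for {years} years")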

If these books are really effective at making us more effective, then why are we working more hours? Why does study after study show that we are more stressed, have worse family relationships, weigh more, and are less happy because we are working too much? Does it seem odd that as the time management industry sells more books, the number of hours we work increases? To quote Bertrand Russell, “can anything more insane be imagined?”

Could it be that we just aren't getting the message? Do we need even more time management books and Six Sigma courses? That is certainly what the evangelical time management industry wants us to believe. Is it the case that if we could just get more done, we could have more time off?

On the contrary, I believe there is a fundamental contradiction underlying the relationship between our culture of time management and the number of hours that professionals work. The more effective we become, the more we are pressured to do. It is an endless cycle. It stems from our belief that time cannot be wasted under any circumstances. However, wasted time is not an absolute value like mass. You can only waste time relative to some context or goal. While you are reading this book, you are wasting time relative to your goal of getting to the store before you have to pick up your kids. In fact, from some perspective, you are always wasting time.

A scientific view of the brain is incompatible with the Lutheran or Christian view of man, and this scientific view is also incompatible with our work ethic. The much-vaunted work ethic is, like slavery, a systematic cultural invention that resulted from a commonly held, but mistaken, idea about human beings. We look back at the slavery system now and think it ridiculous and appalling. It is clear to us now how wrongheaded the very idea of slavery was. One day, we may look back at our work ethic in much the same way. Once we correct certain errors in our beliefs about our brains, our overworked society will appear to future generations as ridiculous and appalling.

In the early 1990s, Steve Sampson, an anthropology professor of mine, was recruited as a consultant for a Danish computer company. The Danish company had been hired by a company in Romania to modernize its operations. The Danes installed computers and set up an IT department. Everything seemed to function as planned, but then a problem arose. After the computer system was activated and the employees were trained, people started leaving work at lunch time. Puzzled, the Danish managers asked why the Romanians were leaving halfway through the work day. The Romanians explained that the computers enabled them to do a whole day's work in half a day, so when they were finished with their work they went home. My professor, an anthropologist, was brought in to help solve the minor crisis that ensued. The Danes were baffled that the Romanians did not want to do twice as much work now that they had computers, and the Romanians thought the Danes were crazy for expecting them to do twice as much work just because they could do it faster. The example illustrates a cultural gap, but it also shows that technologies like the PC, which are ostensibly supposed to give us more free time, tend instead to reduce or eliminate our leisure time.

Many of us read the summaries of scientific health studies that appear in popular magazines or the New York Times. Some of us try to implement the suggestions that researchers make about how to eat healthier, how to exercise, how to avoid cognitive decline as we age, how to educate our children, how to sleep better, how to avoid getting diabetes, how to avoid knee problems from running, etc. This book should be read similarly, as a how-to book about how to do nothing. Obviously, the “how-to” part is easy. The “why” part will take some explanation. Idleness may be a loathsome monster, but it's a monster you should get to know.

From an evolutionary perspective, going back a couple of million years to when Homo sapiens-like species were beginning to evolve more advanced cultures, one thing that distinguished us from the apes was the ability to plan for the future.

For example, apes are known to be proficient tool users, but they only seem to use the tools in their immediate vicinity. Chimpanzees often use nearby twigs to lure ants out of a colony. But no chimpanzees have been seen to carry a twig for miles, knowing that they might get hungry later and there might be an ant colony along the way.

The first hominid species actually started carrying tools to places where they knew the objects would be useful (as opposed to just using tools found in the immediate area). This indicates that their brains had developed the capacity to represent the fact that at some point in the future they might want to eat, even though at that moment they were not hungry. So rather than being driven by their current state, i.e., hunger, early humans began to prepare for future states.

This necessarily requires more memory to represent the past and the future. The ability to plan for future states of hunger, cold, or thirst, as opposed to just reacting to immediate desires, is perhaps what began the rapid cultural advance of human beings.

It is interesting to muse about when the concept of work coalesced in human culture. Presumably it would have been after the evolution of language. It is doubtful chimpanzees have any concept of work, but they are very social and there is some evidence that they can plan for the future to a very limited degree.

Our hominid line broke with chimps about five to seven million years ago, and something beginning to resemble human culture emerged about 1.8 million years ago. Language is more recent still. So when did “work,” as something onerous and obligatory, replace simply being active in response to external or internal stimuli? Some higher-order conscious reflection must be necessary to be able to say that you are working, as opposed to doing nothing or just trying to satisfy your hunger.

The other side of the idleness-is-good-for-the-brain coin is that our brains come with design limitations. In much the same way that James Cameron could not have made Avatar on one normal computer, an individual human brain can handle only so much information.

Our brains took millions of years to evolve, in environments very different from, for example, the modern office. Humans only began reading and writing about five thousand years ago. This is why it is still such a struggle for us to learn how to read. We lack genetically specified neuronal structures for reading, and our brains have to recycle other brain structures when we learn to read. Speaking, on the other hand, evolved much earlier, and we normally do not have to struggle to learn how to speak. There are stages of language acquisition that unfold whenever a healthy brain develops in a language community, e.g., English, Spanish, or Chinese.

We have specialized brain structures that are attuned to speech perception and speech production. By the time we reach adolescence, we have mastered our native language without any special instruction. In contrast, many otherwise healthy people with normally functioning brains reach adulthood unable to read.

I point this out because our modern way of life and our work ethic are much more recent cultural inventions than reading. Swedish neuroscientist Torkel Klingberg calls this “The Stone Age brain meeting the Information Age.” For example, we do not have genetically specified brain structures for multitasking, and studies now show that multitasking makes you worse at each thing you are simultaneously attempting to do.

In a famous series of studies, Stanford professor of communication Clifford Nass wanted to find out what gives multitaskers their proclaimed abilities. Professor Nass marveled at colleagues and friends who claimed to be expert multitaskers: people who chat with three people at once while answering emails and surfing the web.
