
Ever since Darwin, science’s star has been on the rise. Matters of religion, of ultimate reality as it were, have increasingly come to be seen as amenable to investigation and thus within science’s sweeping theoretic purview. This increase in scientific authority has been accepted even by religion itself. Most denominations of Protestantism, for example, now officially accept evolution, as does the Catholic Church, which also endorses modern cosmology’s idea of the Genesis-reminiscent big bang as well, of course, as Earth’s being round and in orbit around a star that is one of many billions. It seems surreal to contemplate that an early teacher of Copernicus’s heliocentric theory—author of De l’infinito, universo e mondi (On the infinite, the universe, and worlds)—could be lured to Rome with the offer of a job and then turned over to the Inquisition, which kept him chained up at the Castel Sant’Angelo for eight years awaiting his trial. Upon receiving his death sentence, Giordano Bruno told the church that it pronounced the sentence with more fear than he received it; his jaw was clamped shut, an iron spike was driven through his tongue, and another spike was inserted into his palate. After being forced through the streets of Rome, the former Dominican friar was burned naked at the stake on February 17, 1600. Some report that the public threatened to boycott such entertainments if heretics were so silenced, as the nonelective spike surgery prevented the crowd from hearing their screams.

We may have come a long way, but we’re not there yet. As science becomes more specialized, as scientists become more powerful, and as their work and theories become more embedded in state apparatuses of corporate and government command and control, often requiring funding through universities run as businesses, human tendencies toward dogmatism and hierarchy, if not burning at the stake, reappear at another level. Part of the rise of Protestantism was its protest against entrenched authority. The full beauty and truth of the universe cannot be found in a canonical text or by speaking to an imaginary person in the sky. Science is both an intellectual adventure and a spiritual experience. As the Protestants dispensed with priests to show that the individual could have a personal relationship with God, so science shows that anybody on the planet, employing the nondenominational method of science, can have a personal understanding of the cosmos. Science offers that potential, but the primary texts, if not in Latin, become daunting, mathematically and terminologically prohibitive to the average Joe. Still, we want to know, and we turn to science writers, to journalists, and to scientists themselves to inform us of the latest and greatest ideas about our global and cosmic reality. Many people still turn avidly to the daily newspaper, although much of it is only gossip. As Stewart Brand has said, science is the only news. Who, what, when, how, why are we? What could be more thrilling? Yet who, as Plato asked in The Republic, will watch the watchers? The spirit of science is one of questioning authority, putting ideas to the test, and rejecting them if they come up short. But as science assumes the status of cultural orthodoxy once reserved for priests and pontiffs, the stakes, so to speak, become greater, and the individual becomes increasingly ostracized from knowledge formation. The problem for science writers is especially acute, for, although they are the go-to sources for laypeople, they are not themselves qualified to question the new intellectual authorities. Their job has been described as Aaronic: like Aaron speaking for Moses, they convey what the authorities say, making it plain but not altering it. It is not their place to challenge, but only to spread the word.

Plato’s watchers thus are to be found largely in the ranks of scientists themselves or, in academia, among philosophers of science, who tend to be marginalized within science. Among the canonical but questionable doctrines of modern science are the idea that brain cells give rise to thought, the idea that genes are read-only and cannot be changed by proteins, the idea that Earth has an iron-nickel inner core, the idea that evolution proceeds by the gradual accumulation of random beneficial mutations, and the idea that life defies a universal tendency toward disorder. Of course not all scientists believe (or even know of) all these ideas, and many people contest some of them in accord with their own understanding, personal beliefs, or, despite the rise of science, religious faith. Then there are also marginalized “new age” and political beliefs, for example, alternative therapies for cancer, proposals for a non-HIV etiology of AIDS, and “conspiratorial” questions raised about governmental versions of terrorist events. In all these cases the stakes for the spirit of science as independent critical thinking are raised.

TRANSFORMATIVE FAILURES AND MODELS

“Failure is more beautiful than success,” poignantly wrote the novelist John Fante (1909–1983). And failure—a core process of learning, because it provides the negative feedback needed to adjust our ideas—is unfairly stigmatized. Learning itself is a matter of mistakes, as is, arguably, the evolutionary process, in which new forms of living organization, the successful variants, exist and thrive only against a background of less successful forms. So, too, from the profusion of neuronal and synaptic connections, much richer in the newborn infant than in the adult, unused pathways are weeded out in what the Nobel laureate Gerald Edelman has termed neural Darwinism. These unused connections, which in essence atrophy, also represent a kind of mistake-making algorithm, the neurophysiological substrate and context of perception.

The British economist Tim Harford, in his Adapt: Why Success Always Starts with Failure, has emphasized the social cost of the stigmatization of error, despite error’s status as the core context for progressive learning, both personal and social. Harford contrasts the relatively conservative ethos of funding practices at the NIH with the more mistake-welcoming institutional culture of the Howard Hughes Medical Institute, which makes more mistakes but is measurably more innovative in terms of patents and medical breakthroughs. The willingness to fund creative, quirky, eccentric, experimental, and maverick people who sometimes flout guidelines, color outside the lines, follow hunches, or make counterintuitive bets sometimes pays off big. But within institutional practices tending toward ossification and groupthink, such outliers are rarely nurtured; it may be impossible to differentiate genius from madness, and their flouting of convention can be considered irritating, inappropriate, or even threatening. The paleontologist Martin Brasier recounts how, as a ship naturalist, his contributions were so great relative to his colleagues’ that he was told to take it easy, the message being that he was making others look bad by comparison. In some institutional cultures, academics are warned not to evaluate prospective candidates too highly for similar reasons. (The great fictional exploration of this theme is Kurt Vonnegut’s story “Harrison Bergeron,” about a talented boy forced to fit in by wearing a helmet that banged his head, bringing his cognitive powers into line with the rest of the socius.) This tendency to reduce everyone to the common denominator can be seen as a detriment to the production of variety, the “mistakes” on which learning and evolution thrive. I remember the former editor of the Scientific American book review column, the nuclear physicist and astrophysicist Philip Morrison, quipping, “An expert is someone who makes all possible mistakes.” Morrison, who worked on the top-secret wartime U.S. Manhattan Project to build an atom bomb, physically helping to load Fat Man onto the plane that dropped it on Nagasaki, later regretted his participation in that history-changing event as a mistake.

Science of course is arguably the greatest example of cultural learning based on mistakes. Unlike philosophy and religion, it formally integrates mistake making into its methodological structure. Although cultural historians and those in the discipline known as science studies dispute the official account of the scientific method—put forward hypotheses, test them against the facts, and change them if they don’t match—as oversimplified or an idealized caricature, there is certainly some truth to it.

The Viennese philosopher Karl Popper is well known for his description of the scientific method as being about falsification: a bona fide scientific theory must allow itself to be proven wrong; otherwise it is an untestable ideology and not real science. What is less well known is the biographical component of Popper’s influential theory: as a youth he was a committed communist, and when his friends analyzed the death of a close friend and comrade according to Marxist theory, Popper realized Marxism wasn’t science.

This close friend was killed in a political demonstration in 1919, after some of Popper’s friends came to the aid of fellow communists attempting to escape from a Vienna police station. Although unarmed, several of the young socialist and communist workers were shot and killed. Popper felt horrified, guilty, and partly responsible. This was the same year that scientists in Britain experimentally verified Albert Einstein’s relativity theory by measuring the bending of starlight as it passed by our sun during a total eclipse. Marxism, too, was presented as science, but was it?

“Marxist theory,” wrote Popper, “demands that the class struggle be intensified, in order to speed up the coming of socialism. Its thesis is that although the revolution may claim some victims, capitalism is claiming more victims than the whole socialist revolution. That was the Marxist theory—part of the so-called ‘scientific socialism.’ I now asked myself whether such a calculation could ever be supported by ‘science.’ The whole experience, and especially this question, produced in me a life-long revulsion of feeling.”13
Unlike Einstein’s experimentally verifiable physics, there was nothing in Marxist ideology, no fact, experiment, or event, that could persuade the young Marxists that they were wrong. Popper, only seventeen at the time, came to believe that their friend had died in vain, for an unprovable belief system, not a verifiable science.

This sounds good in theory, but in practice the cutoff point between science and pseudoscience or ideology is more difficult to determine. Thomas Kuhn, in The Structure of Scientific Revolutions, argues that the progress of science is more dramatic and discontinuous. A single or even a few counterfactual observations will not tend to bring down a vigorous scientific theory in one fell swoop. Rather, what tends to happen, according to Kuhn’s analysis, is that anomalies, perplexities, and inconsistencies gradually accumulate and are tolerated until the old theory collapses. Such paradigm shifts (a term that has entered the vernacular) occur not through simple falsification but through a combination of sociocultural and factual factors. Kuhn acknowledges his debt to Ludwik Fleck, author of Genesis and Development of a Scientific Fact. Fleck shows how what are considered objective scientific facts often depend on the prevailing “thought-style” of a like-minded scientific community. Thus, in medicine (Fleck’s specialty), some symptoms attributed to syphilis in fact resulted from the toxic mercury used to treat it. Popper’s falsification idea is good in theory, but in practice scientists tend to interpret anomalies in terms of prevailing theories, ignore discrepancies as errors, or invent ad hoc subtheories to account for conflicting evidence that is too plentiful to ignore. A fair account of the scientific reception of a new theory distinguishes three phases. In the first, the theory is dismissed as flat-out wrong. In the second, it is considered correct but trivial. Finally, it is considered both true and important—but those who initially objected to it now claim they knew it all along. Unless they are dead—nature’s way of shaking up the dogma Etch A Sketch?—in which case they no longer have to be convinced.

Distressing as it is to those of us who would like to distinguish cleanly between truth and nontruth, science and pseudoscience, Fleck’s analysis of hierarchical human thought-collectives seems to be largely on point. Aware of the lability of the status of facts framed within human groups, he might have enjoyed the word factish, put forth by the French philosopher of science Bruno Latour. As I borrowed it earlier to discuss the new fact(ishe)s of life, a factish is not a fact of nature that exists as if it were isolated from human thoughts and perceptions; rather, it is something we believe to be real and independent but that, looked at more closely, can be seen to depend on a specific historical scientific community and its thought-styles. One of Isabelle Stengers’s examples is the neutrino, which has never been observed directly but whose traces on measurement devices, within a certain context of physical investigations, have sufficed to confirm its existence as objectively real.

Epistemology is the branch of philosophy that covers how we know what we know, and it is itself of course buffeted by many questions. The late geochemist Robert Garrels once told me, “Nobody believes anybody else’s models; they are just a convenient place to store our data.” This is a surprisingly pithy and revealing formulation, one that reminds us of the epistemological status of models—not as truths per se but as reserves of ever more abundant data. Garrels’s homespun philosophy of science also highlights the suspect ascendancy of models in modern science. Models are meant to represent reality in the sense of simulating aspects of it, ideally in predictable ways.

What models are not is epistemologically ambitious. In other words, they have, in a sense, lost contact with science’s guiding ideal, which is to find the truth, as Bohm says, “whether we like it or not.” Instead, models aim to make reasonable, mathematically grounded simulacra of the physical interactions of objects. What they’ve given up, perhaps following the lead of quantum mechanics’ Copenhagen interpretation, which suggests it’s useless to seek a visualizable reality beyond mathematics’ representation of subatomic particle behavior, is the attempt to find out “how things really are.”

EARTH-CHANGING VIEWS

The maverick nuclear physicist J. Marvin Herndon laments this descent into a world of scientific modeling, which, satisfied with reasonably coherent abstractions, has lost touch with science’s original mandate to discover, rather than just represent, reality. And Herndon points out, adding a somber warning note to Garrels’s insider’s view of scientific epistemology, that the ascendancy of modeling has pernicious consequences. Nor is it just modeling. The entire character of science has been changed by governmental support, corporate connectedness, and groupthink-nurturing institutionalization. In a sense science has always depended on patronage, but its sources were once less centralized: neither Niels Bohr, supported by the Carlsberg brewery, nor Einstein, working in a patent office, received government money to make their great discoveries.
