Also by Chuck Klosterman
Fargo Rock City: A Heavy Metal Odyssey in Rural Nörth Daköta
Sex, Drugs, and Cocoa Puffs: A Low Culture Manifesto
Killing Yourself to Live: 85% of a True Story
Chuck Klosterman IV: A Decade of Curious People and Dangerous Ideas
Eating the Dinosaur
I Wear the Black Hat: Grappling with Villains (Real and Imagined)
This is not a collection of essays.
It might look like a collection of essays, and, at times, it might feel like a collection of essays. But that is not the intention.
Obviously, you can read this book however you choose. I can't demand people read this book in sequential order, nor can I stop anyone from skipping around and reading random chapters in whatever insane pattern they desire. You can read it backward, if that's your preference. But it will make more sense if you don't.
This is not a collection of essays.
An imprint of Penguin Random House LLC
375 Hudson Street
New York, New York 10014
Copyright © 2016 by Chuck Klosterman
Penguin supports copyright. Copyright fuels creativity, encourages diverse voices, promotes free speech, and creates a vibrant culture. Thank you for buying an authorized edition of this book and for complying with copyright laws by not reproducing, scanning, or distributing any part of it in any form without permission. You are supporting writers and allowing Penguin to continue to publish books for every reader.
Blue Rider Press is a registered trademark and its colophon is a trademark of Penguin Random House LLC
eBook ISBN 9780399184147
For Silas and
If what I say now seems to you to be very reasonable, then I'll have failed completely.
–Arthur C. Clarke, speaking in the year 1964, attempting to explain what the world might be like in the year 2000
I've spent most of my life being wrong.
Not about everything. Just about most things.
I mean, sometimes I get stuff right. I married the right person. I've never purchased life insurance as an investment. The first time undrafted free agent Tony Romo led a touchdown drive against the Giants on Monday Night Football, I told my roommate, “I think this guy will have a decent career.” At a New Year's Eve party in 2008, I predicted Michael Jackson would unexpectedly die within the next twelve months, an anecdote I shall casually recount at every New Year's party I'll ever attend for the rest of my life. But these are the exceptions. It is far, far easier for me to catalog the various things I've been wrong about: My insistence that I would never own a cell phone. The time I wagered $100 (against $1) that Barack Obama would never become president (or even receive the Democratic nomination). My three-week obsession over the looming Y2K crisis, prompting me to hide bundles of cash, bottled water, and Oreo cookies throughout my one-bedroom apartment. At this point, my wrongness doesn't even surprise me. I almost anticipate it. Whenever people tell me I'm wrong about something, I might disagree with them in conversation, but, in my mind, I assume their accusation is justified, even when I'm relatively certain they're wrong, too.
Yet these failures are small potatoes.
These micro-moments of wrongness are personal: I assumed the answer to something was “A,” but the true answer was “B” or “C” or “D.” Reasonable parties can disagree on the unknowable, and the passage of time slowly proves one party to be slightly more reasonable than the other. The stakes are low. If I'm wrong about something specific, it's (usually) my own fault, and someone else is (usually, but not totally) right.
But what about the things we're wrong about?
What about ideas that are so accepted and internalized that we're not even in a position to question their fallibility? These are ideas so ingrained in the collective consciousness that it seems foolhardy to even wonder if they're potentially untrue. Sometimes these seem like questions only a child would ask, since children aren't paralyzed by the pressures of consensus and common sense. It's a dissonance that creates the most unavoidable of intellectual paradoxes: When you ask smart people if they believe there are major ideas currently accepted by the culture at large that will eventually be proven false, they will say, “Well, of course. There must be. That phenomenon has been experienced by every generation who's ever lived, since the dawn of human history.” Yet offer those same people a laundry list of contemporary ideas that might fit that description, and they'll be tempted to reject them all.
It is impossible to examine questions we refuse to ask. These are the big potatoes.
Like most people,
I like to think of myself as a skeptical person. But I'm pretty much in the tank for gravity. It's the force most recognized as perfunctorily central to everything we understand about everything else. If an otherwise well-executed argument contradicts the principles of gravity, the argument is inevitably altered to make sure that it does not. The fact that I'm not a physicist makes my adherence to gravity especially unyielding, since I don't know anything about gravity that wasn't told to me by someone else. My confidence in gravity is absolute, and I believe this will be true until the day I die (and if someone subsequently throws my dead body out of a window, I believe my corpse's rate of acceleration will be 9.8 m/s²).
And I'm probably wrong.
Maybe not completely, but partially. And maybe not today, but eventually.
“There is a very, very good chance that our understanding of gravity will not be the same in five hundred years. In fact, that's the one arena where I would think that most of our contemporary evidence is circumstantial, and that the way we think about gravity will be very different.” These are the words of Brian Greene, a theoretical physicist at Columbia University who writes books with titles like Icarus at the Edge of Time. He's the kind of physicist famous enough to guest star on a CBS sitcom, assuming that sitcom is The Big Bang Theory. “For two hundred years, Isaac Newton had gravity down. There was almost no change in our thinking until 1907. And then from 1907 to 1915, Einstein radically changes our understanding of gravity: No longer is gravity just a force, but a warping of space and time. And now we realize quantum mechanics must have an impact on how we describe gravity within very short distances. So there's all this work that really starts to pick up in the 1980s, with all these new ideas about how gravity would work in the microscopic realm. And then string theory comes along, trying to understand how gravity behaves on a small scale, and that gives us a description, which we don't know to be right or wrong, that equates to a quantum theory of gravity. Now, that requires extra dimensions of space. So the understanding of gravity starts to have radical implications for our understanding of reality. And now there are folks, inspired by these findings, who are trying to rethink gravity itself. They suspect gravity might not even be a fundamental force, but an emergent force. So I do think, and I think many would agree, that gravity is the least stable of our ideas, and the most ripe for a major shift.”
If that sounds confusing, don't worryâI was confused when Greene explained it to me as I sat in his office (and he explained it to me twice). There are essential components to physics and math that I will never understand in any functional way, no matter what
I read or how much time I invest. A post-gravity world is beyond my comprehension. But the concept of a post-gravity world helps me think about something else: It helps me understand the pre-gravity era. And I don't mean the days before Newton published the Principia in 1687, or even that period from the late 1500s when Galileo was (allegedly) dropping balls off the Leaning Tower of Pisa and inadvertently inspiring the Indigo Girls. By the time those events occurred, the notion of gravity was already drifting through the scientific ether. Nobody had pinned it down, but the mathematical intelligentsia knew Earth was rotating around the sun in an elliptical orbit (and that something was making this happen). That was around four hundred years ago. I'm more fixated on how life was another four hundred years before that. Here was a period when the best understanding of why objects did not spontaneously float was some version of what Aristotle had argued more than a thousand years prior: He believed all objects craved their “natural place,” and that this place was the geocentric center of the universe, and that the geocentric center of the universe was Earth. In other words, Aristotle believed that a dropped rock fell to the earth because rocks belonged on earth and wanted to be there.
So let's consider the magnitude of this shift: Aristotle, arguably the greatest philosopher who ever lived, writes the book and defines his argument. His view exists unchallenged for almost two thousand years. Newton (history's most meaningful mathematician, even to this day) eventually watches an apocryphal apple fall from an apocryphal tree and inverts the entire human understanding of why the world works as it does. Had this been explained to those people in the fourteenth century with no understanding of science (in other words, pretty much everyone else alive in the fourteenth century), Newton's explanation would have seemed way, way crazier than what they currently believed: Instead of claiming that Earth's existence defined reality and that there was something essentialist about why rocks acted like rocks, Newton was advocating an invisible, imperceptible force field that somehow anchored the moon in place.
We now know (“know”) that Newton's concept was correct. Humankind had been collectively, objectively wrong for roughly twenty centuries. Which provokes three semi-related questions:
There's a popular website
that sells books (and if you purchased this particular book, consumer research suggests there's a 41 percent chance you ordered it from this particular site). Book sales constitute only about 7 percent of this website's total sales, but books are the principal commodity this enterprise is known for. Part of what makes the site successful is its user-generated content; consumers are given the opportunity to write reviews of their various purchases, even if they never actually consumed the book they're critiquing. Which is amazing, particularly if you want to read negative, one-star reviews of Herman Melville's Moby-Dick.
“Pompous, overbearing, self-indulgent, and insufferable. This is the worst book I've ever read,” wrote one dissatisfied customer in 2014. “Weak narrative, poor structure, incomplete plot threads, ¾ of the chapters are extraneous, and the author often confuses himself with the protagonist. One chapter is devoted to the fact that whales don't have noses. Another is on the color white.” Interestingly, the only other purchase this person elected to review was a Hewlett-Packard printer that can also send faxes, which he awarded two stars.
I can't dispute this person's distaste for Moby-Dick. I'm sure he did hate reading it. But his choice to state this opinion in public, almost entirely devoid of critical context (unless you count his take on the HP printer), is more meaningful than the opinion itself. Publicly attacking Moby-Dick is shorthand for arguing that what we're socialized to believe about art is fundamentally questionable. Taste is subjective, but some subjective opinions are casually expressed the same way we articulate principles of math or science. There isn't an ongoing cultural debate over the merits of Moby-Dick: It's not merely an epic novel, but a transformative literary innovation that helps define how novels are supposed to be viewed. Any discussion about the clichéd concept of “the Great American Novel” begins with this book. The work itself is not above criticism, but no individual criticism has any impact; at this point, attacking Moby-Dick only reflects the contrarianism of the critic. We all start from the supposition that Moby-Dick is accepted as self-evidently awesome, including (and perhaps especially) those who disagree with that assertion.
So how did this happen?
Melville publishes Moby-Dick in 1851, basing his narrative on the real-life 1839 account of a murderous sperm whale nicknamed “Mocha Dick.” The initial British edition is around nine hundred pages. Melville, a moderately successful author at the time of the novel's release, assumes this book will immediately be seen as a masterwork. This is his premeditated intention throughout the writing process. But the reviews are mixed, and some are contemptuous (“it repels the reader” is the key takeaway from one of the very first London reviews). It sells poorly; at the time of Melville's death, total sales hover below five thousand copies. The failure ruins Melville's life: He becomes an alcoholic and a poet, and eventually a customs inspector. When he dies destitute in 1891, one has to assume his perspective on Moby-Dick is something along the lines of “Well, I guess that didn't work. Maybe I should have spent fewer pages explaining how to tie complicated knots.” For the next thirty years, nothing about the reception of this book changes. But then World War I happens, and, somehow, for reasons that can't be totally explained, modernists living in postwar America start to view literature through a different lens. There is a Melville revival. The concept of what a novel is supposed to accomplish shifts in his direction and amplifies with each passing generation, eventually prompting people (like the 2005 director of Columbia University's American studies program) to classify Moby-Dick as “the most ambitious book ever conceived by an American writer.” Pundits and cranks can disagree with that assertion, but no one cares if they do. Melville's place in history is secure, almost as if he were an explorer or an inventor: When the prehistoric remains of a previously unknown predatory whale were discovered in Peru in 2010, the massive creature was eventually named Livyatan melvillei. A century after his death, Melville gets his own extinct super-whale named after him, in tribute to a book that commercially tanked. That's an interesting kind of career.
Now, there's certainly a difference between collective, objective wrongness (e.g., misunderstanding gravity for twenty centuries) and collective, subjective wrongness (e.g., not caring about Moby-Dick for seventy-five years). The machinations of the transitions are completely different. Yet both scenarios hint at a practical reality and a modern problem. The practical reality is that any present-tense version of the world is unstable. What we currently consider to be true, both objectively and subjectively, is habitually provisional. But the modern problem is that reevaluating what we consider “true” is becoming increasingly difficult. Superficially, it's become easier for any one person to dispute the status quo: Everyone has a viable platform to criticize Moby-Dick (or, I suppose, a mediocre HP printer). If there's a rogue physicist in Winnipeg who doesn't believe in gravity, he can self-publish a book that outlines his argument and potentially attract a larger audience than Moby-Dick found during its first hundred years of existence. But increasing the capacity for the reconsideration of ideas is not the same as actually changing those ideas (or even allowing them to change by their own momentum).