
But understanding these steady progressions is the key to understanding the rapid shifts in our knowledge. These steady progressions are the underlying slow temperature changes that result in the fast phase transition that we see when looking at everything from a different scale. Of course, it’s not always that easy. Figuring out the right underlying change to measure requires a bit of ingenuity. But doing this can help explain the many rapid shifts in our knowledge that are all around us.

.   .   .

EVER since the first planet outside our solar system was discovered in 1995, a discovery anticipated by Carl Sagan in Cosmos, we have been waiting for another large-scale shift in our awareness of our place in the universe. We have been searching for a planet that is truly like Earth, or one that could at least potentially be habitable. As one of the leaders in the field told me, this is ultimately what many of the astronomers in the area of extrasolar planetary research are after. Of course, there are many other topics to study, such as the true distribution of the types and sizes of planets that exist in our stellar neighborhood. But a paramount goal has been to find a planet truly similar to our own.

Finding one that actually harbors life will be a huge milestone for humanity, and it will be one of those massive phase transitions in how we view the world around us.

But just like landing on the moon, a path to such a discovery proceeds by increments. While we are fairly certain that life can exist in far stranger places than we can possibly imagine, good initial places to look are planets that are similar to Earth. And short of being able to examine the atmospheres of these planets for such hallmarks of life as oxygen, we use simpler proxies. These proxies take the form of mass and temperature. If a planet can have something like liquid water due to its surface temperature, and is about the same size as Earth (meaning our style of life could have evolved there), this planet is deemed potentially habitable, at least as a first pass.

So while detecting the first potentially habitable planet outside our solar system is a stunningly discontinuous jump in our knowledge, can it be predicted? This is what Gregory Laughlin, an astronomer at the University of California, Santa Cruz, and I tried to do in the summer of 2010. Greg is an expert on extrasolar planets, has discovered numerous ones himself, and writes the premier blog for the field, Systemic, at oklo.org.

Greg and I knew that the discovery of a roughly Earth-like planet was on the horizon. NASA's Kepler mission had been running for more than a year and, in fact, in mid-June 2010 had released a whole slew of exoplanet candidates, tantalizingly withholding the most promising candidates until the following February. We knew that we were in a special window of time, both for the discovery itself and for making a prediction.

We created a simple metric of habitability. Each previously discovered planet was rated on a scale from 0 to 1, where 0 is not habitable at all and 1 is quite like Earth, at least when it comes to mass and temperature. Most planets are 0 or very close to 0, since most planets are either scorchingly hot or unbelievably cold (and sometimes both at different parts of their year). But when we focused on the most Earth-like planet discovered each year and charted its habitability value, we saw a clear upward trend: these values have been steadily climbing over time. In fact, this upward trend conformed to our old friend the logistic curve. Just as scientific output in general fits exponential and logistic curves, the march toward the discovery of a potentially Earth-like planet fits one of these omnipresent functions as well.
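What might such a metric look like in code? Here is a minimal sketch: the scoring formula, the constants, and the planet data below are all hypothetical stand-ins for illustration, not the actual metric or catalog from our paper.

```python
# A toy habitability score: 1 when a planet's mass and surface
# temperature match Earth's, falling toward 0 as either deviates.
# The functional form and constants are illustrative only.

def habitability(mass_earths, temp_kelvin):
    """Score a planet from 0 (uninhabitable) to 1 (Earth-like)."""
    # Penalize deviation from Earth's mass (1 Earth mass) and from
    # a temperate ~287 K surface, with a generous tolerance.
    mass_term = 1.0 / (1.0 + abs(mass_earths - 1.0))
    temp_term = max(0.0, 1.0 - abs(temp_kelvin - 287.0) / 150.0)
    return mass_term * temp_term

# Hypothetical discoveries: (year, mass in Earth masses, temp in K).
planets = [
    (1995, 150.0, 1300.0),   # a hot Jupiter scores 0
    (2005, 7.5, 400.0),
    (2010, 3.1, 320.0),
]

# Track the most Earth-like planet found each year.
best_by_year = {}
for year, mass, temp in planets:
    score = habitability(mass, temp)
    best_by_year[year] = max(best_by_year.get(year, 0.0), score)

for year in sorted(best_by_year):
    print(year, round(best_by_year[year], 3))
```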

Since the trend conformed to a mathematical shape, we could predict when these values would reach something truly habitable by extrapolating the curve into the future. Essentially, the properties of the discovered planets and their habitability values act as the underlying temperature of the system. As time proceeds, the highest habitability values of discovered planets steadily increase. When we reach a very high habitability value, we jump in our knowledge: we go from not knowing of any planets that are potentially like Earth to knowing of one. Extrasolar planet discovery data allow us to see the microscopic behavior, the small underlying changes in what we know. And checking the news for the discovery of an Earth-like planet provides a metric for whether a phase transition in our knowledge has actually occurred.
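The extrapolation step can be sketched in a few lines: fit a logistic curve to the yearly best habitability scores, then invert it to find the year the curve crosses a near-Earth threshold. The data points and the 0.9 threshold below are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, k, t0):
    """Logistic curve saturating at 1 (a perfectly Earth-like planet)."""
    return 1.0 / (1.0 + np.exp(-k * (t - t0)))

# Hypothetical yearly maxima of the habitability metric.
years = np.array([1995, 1998, 2001, 2004, 2007, 2010], dtype=float)
scores = np.array([0.01, 0.02, 0.05, 0.10, 0.22, 0.40])

(k, t0), _ = curve_fit(logistic, years, scores, p0=[0.3, 2012.0])

# Invert the curve: the year when the best score should reach 0.9.
threshold = 0.9
predicted_year = t0 - np.log(1.0 / threshold - 1.0) / k
print(f"Predicted discovery year: {predicted_year:.1f}")
```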

After running such an analysis ten thousand times (in order to make sure our calculations were robust), we found that there was a two-thirds chance that an Earth-like planet would be discovered by the end of 2013. But even more exciting, there was a 50 percent chance of such a discovery happening by early to mid-2011!
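The many repeated runs can be imitated with a bootstrap: resample the observations, refit the curve each time, and count how often the predicted crossing lands before a given date. This is one plausible version of such a robustness check, again on invented data; the actual resampling scheme in the analysis may have differed.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, k, t0):
    return 1.0 / (1.0 + np.exp(-k * (t - t0)))

years = np.array([1995, 1998, 2001, 2004, 2007, 2010], dtype=float)
scores = np.array([0.01, 0.02, 0.05, 0.10, 0.22, 0.40])

rng = np.random.default_rng(0)
predictions = []
for _ in range(10_000):
    # Resample the observations with replacement and refit the curve.
    idx = rng.integers(0, len(years), size=len(years))
    try:
        (k, t0), _ = curve_fit(logistic, years[idx], scores[idx],
                               p0=[0.3, 2012.0], maxfev=2000)
    except RuntimeError:
        continue  # skip resamples where the fit fails to converge
    predictions.append(t0 - np.log(1.0 / 0.9 - 1.0) / k)

predictions = np.array(predictions)
print("P(discovery by end of 2013):", np.mean(predictions <= 2014.0))
print("Median predicted year:", np.median(predictions))
```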

Soon after our paper was accepted, on September 1, 2010, we published it online. Four weeks later, on September 29, 2010, a team from UC Santa Cruz and the Carnegie Institution for Science announced the discovery of the first planet truly similar to Earth—one that could actually sustain life—called Gliese 581g.

This planet has some curious properties: For example, one side always faces its star and the other always faces the night sky. Because of this, the temperature is potentially habitable only in the region of the planet in permanent dusk. And only barely: it rates about as warm as a balmy Moscow winter. But in addition to having a dimmer, redder sun than our own (much like the sun of Superman's home planet of Krypton), it has a most intriguing property: It might not exist.

It turns out that there is debate over whether its signature is authentic or simply a whole lot of really exciting noise. When another team examined a subset of the data, they found no evidence of Gliese 581g. It might just be such a subtle signal that it requires a lot of data to detect. It might also not be there at all. Since then, however, another somewhat Earth-like planet, Kepler-22b, has been discovered orbiting a star about six hundred light-years away.

I have heard an astronomer involved in exoplanet discovery mention that nothing would make him happier than to become a biologist once the target planet has been discovered. And it seems that we now have crossed a certain threshold in our scientific knowledge. Where there were once only hints of such a planet, gleaned from bits of data and inferences, we now seem to be at the dawn of a new type of awareness. And this phase transition occurred due to the regular underlying variation in habitability, which was deduced from discoveries of the masses and temperatures of planets.

.   .   .

SOMETIMES, though, it's not that easy to determine the underlying change and see when a phase transition will happen. Even so, there are still mathematical regularities to how sudden changes in our knowledge occur, at least in the aggregate. One of these areas is the proving of mathematical conjectures.

Back at the end of 2010 I attempted to use mathematical modeling to determine when a famous unsolved problem in mathematics would be proved. This was a lot harder than predicting planetary discoveries. There wasn't some fundamental underlying property that was slowly changing, such that when all the individual increments were added up we would get something qualitatively new, like a planet that is potentially like Earth. Instead, when it comes to proofs, it often seems that it's just a lot of mathematicians being wrong, year after year, with little hope of finding the correct solution.

In fact, it's not quite like that. Even in failure there can be success when trying to solve something, although it's not exactly what we might have hoped for. For example, Fermat's Last Theorem was a famously long-unsolved problem. This idea was created by the Frenchman Pierre de Fermat, a lawyer by profession and mathematician by hobby, in 1637. Fermat wrote that there are no three positive numbers a, b, and c that can satisfy the equation a^n + b^n = c^n if n is greater than 2. If n is 2, we get the Pythagorean Theorem, which has tons of solutions. But make n larger, and Fermat stated that no numbers would work in the equation. Fermat didn't prove this. Instead, in one of the most maddening episodes in math history, he scribbled this idea in the margin of a book and wrote that he had a brilliant proof, but, alas, the margin was too small to contain it.

We now think he might have been mistaken about having a proof. But in all the years since he wrote this statement, no one had found any numbers that fit the equation for exponents greater than 2. So it was assumed to be true, but no one could prove it. This elegant problem in number theory had gone unproven since the seventeenth century, until Andrew Wiles completed a proof in 1995, using pages and pages of very complex math, which would most certainly not have fit in Fermat's margin. But, crucially, along the way, mathematicians proved other, smaller results in their quest to crack Fermat's Last Theorem. When it was finally solved, whole new pieces of math were involved in the construction of the proof.
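A tiny brute-force search makes the claim concrete: for n = 2 the equation has many solutions, while for larger exponents none turn up, just as Wiles's proof guarantees. This sketch, of course, checks only a small range of numbers and proves nothing in general.

```python
# Exhaustively check Fermat's equation a**n + b**n == c**n for small
# values. For n = 2 we find Pythagorean triples; for n > 2, nothing.
for n in range(2, 6):
    solutions = [(a, b, c)
                 for a in range(1, 30)
                 for b in range(a, 30)
                 for c in range(b, 60)
                 if a**n + b**n == c**n]
    print(n, solutions[:3] if solutions else "no solutions")
```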

I decided to tackle predicting the proof of one of the most famous unsolved problems, something known as P versus NP. It's so tough and important that the Clay Mathematics Institute has offered a prize of $1 million for its solution. It essentially asks whether two classes of mathematical problems are equivalent. P problems are easy: they can be solved in a straightforward fashion by today's computers, and sometimes even by pencil and paper. NP problems are ones whose answers can be checked very easily, regardless of whether they are easy to solve. Specifically, some NP problems appear to be incredibly difficult, but vexingly, if I were to hand you a proposed solution to any NP problem, you could quickly verify whether it was correct.

For example, finding the prime factors of a number is an NP problem. Given a large enough number (like something that’s about four hundred digits long), we currently think that we would have to use all the computational resources on the planet to try to factor it. And even then it would take far longer than the lifetime of the Earth to determine what two prime numbers were multiplied to get the four-hundred-digit number.

But if I gave you the correct factors, you could simply multiply them together and see if they yielded the really big number. The question that people have been trying to solve for the past few decades is whether NP problems, which are easy to check (factoring a number into primes, for example), are also easy to solve. Thus far, no one has found any shortcuts for solving the many fiendishly difficult NP problems, and they seem to be separate from P problems, making us think that P does not equal NP. But we could just be missing some basic insight or mathematical trick that would make solving these problems much easier, and that would imply that P equals NP. For example, trying to see if a big number is divisible by three might be hard, unless you know the trick: if the sum of a number's digits is divisible by three, then so is the number itself.
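A few lines of code illustrate both the check-versus-solve asymmetry and the divisibility trick. The specific numbers here are just for illustration; real cryptographic numbers run to hundreds of digits, far beyond any trial-division approach.

```python
def check_factors(n, p, q):
    """Verifying a claimed factorization is a single multiplication."""
    return p * q == n

def find_factors(n):
    """Finding factors by trial division: fine for small numbers,
    hopeless for the hundreds-of-digits numbers used in cryptography."""
    for p in range(2, int(n**0.5) + 1):
        if n % p == 0:
            return p, n // p
    return None  # n is prime

# A product of two copies of the Mersenne prime 2**31 - 1.
n = 2_147_483_647 ** 2
print(check_factors(n, 2_147_483_647, 2_147_483_647))  # True, instantly

# find_factors(n) would grind through ~2 billion trial divisions:
# the "solve" direction is vastly slower than the "check" direction.

# The divisibility-by-three trick from the text:
big = 123456789
print(sum(int(d) for d in str(big)) % 3 == 0)  # digit sum is 45 -> True
print(big % 3 == 0)                            # confirms the trick
```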

The assumption that NP problems are very hard actually underlies modern cryptography. Codes are only strong so long as P emphatically does not equal NP. And many other things remain hard to solve if P is not equal to NP, from handling resources and logistics in large organizations, to managing complex financial portfolios, to dealing with games and puzzles such as crosswords and Minesweeper. But we don't know whether they're equal. And we haven't known for over forty years.

When will we know? Possibly never. There are some problems that can never be solved, and in fact have been proven unsolvable. But there are many problems that have gone unsolved for hundreds of years until, one day, a mathematician puts the finishing touches on a proof. Therein lies the key to tackling this problem. I realized that I could look at the distribution of the amount of time it has taken for lots of well-known unsolved problems to eventually be solved and compare that to the amount of time that P versus NP has remained outstanding. If P versus NP is not special, but is similar to many other famous problems that have been hard to crack, then perhaps I could gain some insight by looking statistically at how long other problems take to be solved.

So while I can't know in advance exactly when any individual problem will be solved, I can look at the aggregate of hard problems and see whether there are any regularities. In doing this we can begin to put some bounds on our uncertainty about solving hard problems.

It turns out that if we do this, we get some very interesting distributions, known as heavy-tailed distributions. This means that there is no single amount of time we can expect to wait for a solution: Some famous problems go decades before being solved, and some, those that exist far out in the tail of the distribution, remain outstanding for hundreds of years. There was even a famous conjecture (or unproven statement) in the data set that took more than fifteen hundred years before it was eventually proven.

But using this distribution, we can find the number of years it takes for half of all of these problems to be solved, and take that 50 percent mark as a likely time frame for a solution. So, assuming that P versus NP is soluble at all and that it behaves like other long-unsolved problems, it turns out we will have to wait until 2024, when the problem turns fifty-three, for a solution.
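A sketch of that 50 percent calculation, using an invented log-normal distribution as a stand-in for the empirical record of historical problems (the actual analysis used the real data on famous conjectures, which is what yields the 2024 figure):

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for the empirical distribution of years-to-solution for
# famous problems: a log-normal, one common heavy-tailed choice.
# These parameters are invented, not fitted to historical data.
solve_times = rng.lognormal(mean=3.8, sigma=0.5, size=100_000)

# P versus NP was posed in 1971. Condition on it having already
# survived unsolved to the time of the analysis.
age = 39  # years unsolved as of 2010
survivors = solve_times[solve_times > age]

# The 50 percent mark among problems that lasted at least this long.
# With the real historical data, this works out to a total lifetime
# of fifty-three years, i.e., a solution around 2024.
print("Conditional median lifetime:",
      round(float(np.median(survivors)), 1), "years")
```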

There's still a lot of uncertainty around this date. But through probability, we can now understand how rapid changes in mathematical knowledge occur, at least in the aggregate. If we do eventually discover whether P equals NP, we can be certain that this change in knowledge will descend upon the greater world rapidly and without much warning.

Mathematics—from Ising models to probability—can help us to understand how rapid changes in the facts we know can occur around us. But are these phase transitions the rule or the exception? What should we expect more of: slow and steady changes in knowledge or extremely rapid shifts in the facts around us?

While there will no doubt always be slow change in knowledge, many of us have an intuitive sense that facts are changing around us faster and faster, with rapid transitions occurring more often every day. There is scientific evidence to buttress this intuition. To understand this we have to understand how cities produce innovation.
