For most of the 2.5 million years that humans and their predecessors have been around, the Earth has been a volatile place. Subtle shifts in the planet’s orbit have triggered large temperature swings; glaciers have marched across North America and Europe and then retreated. But, about 10,000 years ago, something unusual happened: The Earth’s climate settled into a relatively stable state, global temperatures started hovering within a narrow band, and sea levels stopped rising and falling so drastically. Historians argue that this fortuitous geological period, known as the Holocene, allowed civilization to develop. Modern humans no longer had to pluck whatever they could from an unreliable environment and move on. The relatively predictable climate patterns allowed them to colonize coastal regions, clear forests for agriculture, and raise livestock. “Once you get stability, you can start planning for the future,” explains Sander van der Leeuw, who directs the School of Human Evolution and Social Change at Arizona State University. “You’re no longer acting on the moment but acting with the future in mind. And that’s what our whole society is based on—being able to invest in the future.”
Because of quirks in the Earth’s orbit, the planet would likely remain in this stable state for at least the next 10,000 years if left to its own devices. But the planet isn’t being left to its own devices. Humans are now drastically altering the natural world in so many ways—mowing down forests, depleting freshwater supplies, fiddling with the planet’s thermostat—that some experts have recently begun arguing that we’ve left the Holocene and moved into an entirely new geological epoch of our own making, dubbed the Anthropocene.
Global warming is the big change that tends to get all the attention. By now, it’s well-established that humans are adding heat-trapping gases to the atmosphere at an unprecedented rate. Climatologists have assembled ample evidence that we run the risk of catastrophe if we add too much carbon dioxide to the air. Yet it may be too narrow to focus on climate change in isolation. In recent years, some scientists have begun casting a wary eye at all the other ways we’re upending the Earth’s natural state, from disrupting the planet’s nitrogen cycle to using up resources to dousing our rivers and oceans with new synthetic chemicals. Those researchers are starting to ask whether these trends, too, could have their own tipping points—whether, as with climate, there are boundaries that we would be incredibly foolish to cross. And what they have found suggests that, by pushing the Earth outside of the state that has persisted for the last 10,000 years, we risk squandering the stability that made civilization possible in the first place.
The idea that there are “tipping points” in nature has been discussed by scientists since the 1970s. An ecosystem can change slowly and gradually over many years until, suddenly, it reaches a threshold, at which point rapid and potentially irreversible shifts ensue.
One much-studied example is the Sahara Desert. About 6,000 years ago, the vast expanse of the Sahara was filled with lakes, wetlands, and a lush variety of species—ancient cave paintings in the region depict crocodiles. Then, some 5,500 years ago, the region became a massive desert over the relatively short course of about 200 years. “That sort of rapid change just doesn’t make sense,” notes Jonathan Foley, director of the University of Minnesota’s Institute on the Environment. “Unless, that is, you allow for non-linear systems in nature.” One hypothesis is that a slight change in Earth’s orbit led to a gradual decline in solar energy, which weakened the monsoon rains that reached the region. The tipping point then came when a prolonged drought killed off the local vegetation that helped supply the area with moisture. At that point, a feedback loop kicked in, and the Sahara started drying of its own accord—more plants died off, which, in turn, made the region even more parched—until the whole area desertified in remarkably short order.
Over the last decade, a number of climate scientists have looked for tipping points that could be triggered by global warming. It’s worrisome enough that a warming world would produce steadily rising sea levels, droughts, and heat waves. But what keeps many researchers up at night is the idea of rapid, non-linear changes that utterly transform the planet. To take one example, as Arctic sea ice begins melting, the dark water underneath starts absorbing a greater share of sunlight, which, in turn, causes the ice to melt even faster. Models suggest that this feedback could reach a point where the sea ice begins melting at an incredibly fast pace and essentially collapses—indeed, some Arctic researchers think we may have already crossed this threshold. Other possible scenarios include the disintegration of the ice sheet on Greenland, rapid die-off of rain forests in the Amazon, or, even more extreme, a shutdown of the ocean currents in the Atlantic that maintain Europe’s livable climate. (Fortunately, that last one, while feasible, doesn’t appear likely anytime soon.)
Scientists continue to argue—sometimes fiercely—over the details and precise timing of these dramatic changes. But the prospect that global warming could produce non-linear disruptions has, understandably, helped inform many climate-policy goals. After all, it would be one thing for human civilization to adapt to sea levels that rose smoothly by, say, one centimeter a year. It would be another to deal with fast and unpredictable rises. That’s why the United Nations has recommended limiting warming to no more than 2°C above pre-industrial levels—above that point, we risk large-scale changes beyond our control. But it’s also true that scientists can’t be certain where that threshold lies. James Hansen of NASA has famously argued that even the U.N. targets are too risky, and that we should limit atmospheric carbon dioxide to below 350 parts per million (a point we’ve already passed), based on his study of how quickly ice sheets reacted in prehistoric periods when the Earth’s temperature was just 1°C to 2°C warmer than today.
Those discussions, meanwhile, have led some scientists to wonder if there might be other tipping-point thresholds lurking in nature. In 2008, Johan Rockström, the director of the Stockholm Resilience Centre in Sweden, brought together two dozen experts across a variety of disciplines to see if they could identify what they called “the rules of the game.” What are the natural boundaries that humanity should stay within if we want to keep the Earth in the stable state of the last 10,000 years?
In a paper published last fall in the prestigious journal Nature, Rockström and his co-authors created, for the first time, a checklist of all the possible ways the planet is hurtling toward a potentially perilous new state. They identified nine boundaries concerning matters such as climate change, ocean acidification, extinction rates, land and water use, ozone depletion, and the disruption of the Earth’s nitrogen and phosphorus cycles. But what was most astounding was just how little we still know about the natural experiment our civilization is unwittingly conducting.
Take, for instance, ocean acidification. Ocean chemists have long known that, as carbon-dioxide concentrations in the atmosphere rise, more and more of the gas gets absorbed by the ocean and lowers its pH level—in essence, making the waters more acidic. But, up until ten years ago, no one had seriously modeled this effect, and it was assumed to be negligible. “It hadn’t been on anyone’s radar screens,” explains Ellen Thomas, a Yale paleoceanographer. “The thinking had been, ‘Well, the oceans are so huge, what can a little extra CO2 in the air really do?’” Then, in 1999, a climate modeler at Lawrence Livermore National Laboratory named Ken Caldeira was asked by the Energy Department to study the effects of capturing carbon dioxide from smokestacks and injecting it deep into the sea as a way of cutting greenhouse-gas emissions. Caldeira studied the change in ocean pH and then, just for reference, compared it to what would happen if man-made CO2 levels kept rising. The latter proved shocking. Caldeira discovered that the oceans were already on pace to become more acidic than at any time in the past 300 million years. He published his results in 2003, coining the term “ocean acidification.”
In the years since, scientists have realized that ocean acidification is, in fact, a huge deal—a problem as worrisome as many of the worst effects of global warming. The lower pH of the oceans will make it far harder for marine organisms to build their calcium-carbonate shells and skeletons. That, in turn, has the potential to weaken or even wipe out many coral reef systems—and the millions of species that depend on them. And that could prove devastating. In Asia alone, some one billion people rely on reef fisheries for food or income. Local economies that rely on oysters, clams, or scallops could go bust. Areas like Florida that depend on reefs as a bulwark against storms would find themselves more vulnerable to hurricane damage. And, while marine scientists still can’t pin down the exact point at which a global calamity would ensue, lessons from the past are unnerving. During the last major wave of ocean acidification, 55 million years ago, a vast array of deep-sea species was wiped out—which, in turn, upended marine ecosystems around the world. Given how heavily we rely on the ocean’s current food web, that’s a prospect we can’t dismiss lightly.
Rockström’s group also identified a number of potential tipping points related to modern-day industrial agriculture. With nearly seven billion people on the planet and a growing demand for meat, farmers are razing forests and diverting freshwater for irrigation at a stunning pace. It’s not hard to see that we could soon reach the point where irreversible degradation becomes likely in many areas. The Aral Sea in Central Asia, once the world’s fourth-largest lake, has already become so drained that it is now largely desert, causing the area’s once-robust fishing economy to implode and leaving trawlers marooned on the sands. And Foley points out that, on some islands in Indonesia, destruction of rain forests has reached the point where local weather patterns have been altered, drying out the area and, in turn, making the remaining trees more susceptible to forest fires. In Borneo, the black smoke has grown so thick that it has interfered with air and sea traffic and is causing widespread respiratory illnesses.
Another unsustainable trend the group pinpointed involved our reliance on synthetic fertilizer. Ever since the early twentieth century, when German chemist Fritz Haber devised a method for capturing nitrogen from the air to make ammonia gas, humans have depended on artificial nitrogen fertilizer to boost crop productivity. It was a world-changing invention, enabling the planet to keep feeding itself even as the population ballooned. But the practice has also inflicted heavy damage on the water and soil we depend on. Much of the nitrogen and phosphorus in artificial fertilizers is never taken up by crops and instead washes into rivers and oceans. In areas like the Gulf of Mexico and the Baltic Sea, the excess nitrogen has overstimulated the growth of algae, which chokes off the water’s oxygen and kills most other organisms in the vicinity. These vast “dead zones” are popping up around the world—areas the size of New Jersey where fish can’t survive. Beyond that, excess nitrogen has been found to deplete organic carbon in the soil and decimate plant species in certain areas. In essence, we have overwhelmed the Earth’s nutrient cycle—Rockström’s group suggested that global nitrogen use was already more than three times the proposed “safe” threshold.
Granted, some experts find the idea of a single boundary too simplistic. Stuart Pimm, a professor of conservation ecology at Duke, points to the paper’s discussion of biodiversity as an example. Throughout history, the normal “background” rate of extinction has been about one extinction per million species per year. In the modern era, thanks to activities like deforestation and overfishing, the rate has climbed to roughly 100 extinctions per million species per year—a staggering pace that has the potential to collapse ecosystems. In response, Rockström’s group proposed a threshold of no more than ten extinctions per million species per year. “But that number was completely arbitrary,” Pimm says. What’s more, the idea of a single worldwide limit on extinction can obscure some important nuances. Since the 1960s, scientists have studied “keystone” species that prop up an entire ecosystem—if, for instance, starfish are removed from a bay, the local mussel population can explode and drive out other species. “Some ecosystems can lose a number of species and be OK,” Pimm argues. “Others can lose just a few species and they’re manifestly not OK.”
Still, the specific numerical thresholds proposed by Rockström’s group were, in some sense, a side issue. Experts will continue to haggle over whether, say, 350 ppm or 450 ppm is a “safe” limit for carbon dioxide in the air. But what was more notable—indeed, eerie—about the Nature paper was its emphasis on how little we can predict about the many dramatic changes underway. Consider the mounting evidence that aerosol pollution—such as sulfates—can alter regional rainfall patterns. But, according to Kevin Noone, a colleague of Rockström’s at the Stockholm Resilience Centre, there simply aren’t enough data yet to settle on a “safe” level of aerosol pollution that would avoid large-scale disruption of, say, the Asian monsoon cycle. Or take chemical pollution. Currently, there are more than 80,000 synthetic chemicals on the global market, and we know that many of them can harm human health or disturb the reproductive cycles of certain species (in much the same way DDT caused eggshell thinning among birds of prey). But no one knows whether there’s a point at which some combination of chemical pollutants could throw large-scale ecosystems out of whack. “There’s evidence out there that these artificial chemicals can have adverse effects, sure, but no one’s really looked at them yet from the perspective of global thresholds,” says Will Steffen, a climatologist at the Australian National University who contributed to the Nature study. “That was one where we couldn’t even guess a boundary.”
Scarier still, many of these different processes can interact in unforeseen ways. The heavy use of synthetic nitrogen fertilizers can cause soil to release nitrous oxide, a greenhouse gas. Overfishing can make coral reefs less resilient in the face of stresses like ocean acidification and warmer temperatures. Global warming can speed up the rate of extinction. “Once you have all of these dynamic processes in so many dimensions interacting with each other, we no longer have the capability to anticipate what will happen,” says van der Leeuw. “That then raises the possibility that these processes will start affecting each other in different ways that can quickly have cascade effects we can’t see right now.”
And that creates a dilemma: Right now, too many environmental problems are studied in isolation. “Ever since Newton, the tendency of science has been to slice the pie up smaller and smaller,” says Noone. By way of illustration, he notes that the scientific literature now boasts a Journal of Chemical Physics, a Journal of Physical Chemistry, and even a journal called Physical Chemistry Chemical Physics.
Yet the research into planetary boundaries suggests that it may be impossible to deal with just one or two environmental issues at a time. At the moment, global warming gets the bulk of attention. But treating it as the only problem could lead to misguided solutions. For instance, in recent years, some researchers have broached the idea of geoengineering as a cure for climate change—one idea is to reflect a portion of incoming sunlight back into space in order to cool the planet down. If rising temperatures were our sole concern, that might not be a bad idea. But, of course, that solution would do nothing about the carbon-dioxide emissions that are acidifying the world’s oceans, or the frenetic pace of deforestation that could alter the Earth’s landscape irreparably.
The Nature study, with its many visions of destruction, is enough to make one want to crawl into a cave and give up. But the notion of planetary boundaries is also very different from past doomsaying. Inherent in the concept, in fact, is cause for optimism. Back in the 1970s, the Club of Rome think tank commissioned an influential book titled The Limits to Growth, which predicted that the combination of a rapidly swelling world population and finite natural resources would put severe limits on economic growth. As it turned out, many of the book’s ominous predictions were mistaken—the authors had underestimated the ability of both markets and technology to overcome resource constraints.
Similarly, an important lesson from recent history shows that it’s possible for humanity to stay safely within the Earth’s natural boundaries and still thrive. One threshold that Rockström’s group identified involved the stratospheric ozone layer, which shields the planet from harmful ultraviolet (UV) rays. As it turns out, this is the one known planetary boundary that we’re no longer in danger of crossing—but the story of how that came to pass is illustrative.
Beginning in the late 1970s, scientists were puzzling over readings showing that ozone concentrations over Antarctica were far lower than anyone had expected. (The readings were so odd that, for years, atmospheric scientists dismissed the data as instrument error.) By then, researchers had also discovered that chlorofluorocarbons (CFCs)—chemicals used in refrigerators and air conditioners—were accumulating in the atmosphere. Back in 1974, chemists Mario Molina and Sherwood Rowland had published a paper arguing that CFCs could destroy stratospheric ozone, although it was a difficult hypothesis to prove. DuPont, the world’s biggest manufacturer of CFCs, disputed the claim, and it wasn’t until the mid-’80s that hard evidence confirming the link came in—including the discovery of a gaping ozone hole over Antarctica.
Looking back, it’s remarkable to ponder the serendipity of these discoveries—and how little margin for error we had. As luck would have it, DuPont had been using chlorine instead of bromine to produce CFCs. As far as anyone could tell, the two elements were interchangeable. But, as another prescient ozone researcher, Paul Crutzen, later noted, bromine is 45 times as effective at destroying ozone as chlorine. Had DuPont chosen to use bromine, the ozone hole could well have spanned the globe by the 1970s instead of being largely confined to Antarctica—long before anyone had a glimmering of the problem. It’s not hard to see what massive worldwide ozone depletion would’ve meant. Punta Arenas, the southernmost town of Chile, sits under the Antarctic ozone hole, and skin cancer rates there have soared by 66 percent since 1994. If humans had destroyed stratospheric ozone across the globe, we would likely be unable to set foot outdoors without layers of sunscreen and dark shades to prevent eye damage. Worse, the excess UV rays could have killed off many of the single-celled organisms that form the basis for the ocean’s food chain and disrupted global agriculture (studies show that bean and pea crop yields decline about 1 percent for every percent increase in UV exposure).
Happily, though, scientists did discover the ozone hole. And, despite industry warnings that abolishing CFCs would impose unbearable costs, world leaders agreed to phase out the chemicals in 1987, and economic ruin never arrived. DuPont developed a substitute for CFCs, and ozone levels in the atmosphere have stabilized, with the hole over Antarctica expected to heal by 2050. A topic that once graced the cover of Time and generated heated congressional debates now barely gets mentioned. We learned to stay within one planetary boundary without impeding human prosperity. That should give us every reason to think we can respect the others we are now barreling past.
Bradford Plumer is an assistant editor of The New Republic.