This post is from our new In-House Critics blog.
For years, much of the political right has claimed that global warming is a scientific hoax perpetrated by statists in order to justify further government control over the economy. I have repeatedly pointed out that this is more or less nonsense, usually to audiences that are far less amenable to this message than the readership of The New Republic, with predictable results. It is certainly true, of course, that there are political actors for whom climate change is a convenient excuse for amassing power, and scientific researchers, bankers, and businesspeople who are just jumping onto a funding gravy train; but this doesn’t mean that the underlying technical risk assessment is invalid.
The political left has its own conspiracy theory on the issue. It was on almost perfect display in Al Gore’s article (“The Crisis Comes Ashore”) in the June 10 TNR. Gore argues that public confidence in the warnings of “looming catastrophe” presented in “the most elaborate and impressive scientific assessment in the history of our civilization” is being undermined by a “cynical and lavishly funded disinformation campaign” paid for by “carbon polluters.” It is certainly true, of course, that some oil companies and other interest groups have funded PR campaigns in pursuit of their narrowly-defined self-interest; but once again, this shouldn’t change our rational evaluation of the environmental impact of greenhouse gas accumulations one way or the other.
Gore agrees in his article that the proper response to this issue is not to be found in the political sound and light show, but in a rational assessment of risks, saying that “rather than relying on visceral responses, we have to draw upon our capacity for reasoning, communicating clearly with one another, forming a global consensus on the basis of science…”. Gore goes on to suggest a technical foundation for this reasoning process:
Over the last 22 years, the Intergovernmental Panel on Climate Change has produced four massive studies warning the world of the looming catastrophe that is being caused by the massive dumping of global-warming pollution into the atmosphere.
So, what does the IPCC actually have to say about what we should expect to happen as a result of our “massive dumping of global-warming pollution into the atmosphere”?
According to the IPCC’s currently-governing Fourth Assessment Report, under a reasonable set of assumptions for global economic and population growth (Scenario A1B), the world should expect to warm by about 3°C over roughly the next century (Table SPM.3). Even in the most extreme IPCC marker scenario (A1FI), the best estimate is that we should expect warming of about 4°C over roughly the next century. How bad would that be? Also according to the IPCC (page 17), a global increase in temperature of 4°C should cause the world to have about 1 to 5 percent lower economic output than it would otherwise have. So if we do not take measures to ameliorate global warming, the world should expect sometime in the 22nd century to be about 3 percent poorer than it otherwise would be (though still much richer per capita than today).
Prior to consideration of the more detailed economic issues—e.g., costs versus benefits of attempts to forestall the problem; the danger of worse-than-expected outcomes, etc.—pause to recognize that according to the IPCC the expected economic costs of global warming under the plausible scenarios for future economic growth are likely to be about 3 percent of GDP more than 100 years from now. This is pretty far from the rhetoric of global destruction and Manhattan as an underwater theme park.
But of course, several percent of global GDP is a lot of dough, and avoiding such costs would justify extensive mitigation efforts. What would the conventional proposals, such as a global carbon tax or cap-and-trade scheme, cost?
The key costs of such a program would be the loss in consumption we would experience if we used less energy, substituted higher-cost sources of energy for fossil fuels, and paid for “offset” projects to ameliorate the effect of emissions (planting lots of trees, for example). Resources for the Future, a moderately left-of-center, well-respected environmental organization, collated a set of widely-cited projections for the costs of such emission mitigation schemes for the world as a whole. Its list of “least cost” estimates—i.e., those that assume implementation of the most efficient possible policy, including, for example, global coordination on emissions reductions free of realistic geopolitical complexity—for policies designed to limit the rise in atmospheric carbon dioxide to 450 parts per million (ppm) averages a little over 6 percent of global GDP by 2100 (with a very wide range of estimates). That is, we would start paying a cost today that would rise to about 6 percent of world output by 2100 in order to only partially avoid a problem that would have expected costs of about 3 percent of world output sometime later than 2100.
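The arithmetic behind this comparison is simple enough to check directly. A back-of-envelope sketch, using only the two percentages cited above (these are the text’s round numbers, not fresh estimates):

```python
# Back-of-envelope comparison of the two figures cited above.
expected_damage_pct = 3.0   # IPCC A1B best estimate: ~3% of GDP, sometime after 2100
mitigation_cost_pct = 6.0   # RFF "least cost" average for a 450 ppm target, by 2100
ratio = mitigation_cost_pct / expected_damage_pct
print(ratio)  # the cure costs roughly twice the disease, before timing is considered
```

And the timing cuts the same way: the 6 percent cost is reached by 2100, while the 3 percent damage arrives only sometime after 2100.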
This is basically why formal analyses of such schemes normally show such poor cost/benefit ratios. William Nordhaus, who heads the widely respected environmental-economics-modeling group at Yale, estimates (page 84) the total expected net benefit of an optimally designed, implemented, and enforced global program to be equal to the present value of about 0.2 percent of future global economic consumption (and this would not come close to limiting total accumulation to 450 ppm). In the real world of domestic politics and geostrategic competition, it is not realistic to expect that we would ever have an optimally designed, implemented, and enforced global system, and the side deals made to put in place even an imperfect system would likely have costs that would dwarf 0.2 percent of global economic consumption. Look at what was required in the attempt to pass the Waxman-Markey cap-and-trade program, in a wealthy and reasonably democratic country, to get some idea of the kind of deals that would have to be cut. And then you’d have to enforce such a system throughout developing economies for literally centuries.
The expected economic benefits of emissions mitigation do not cover its realistically expected costs.
The most consequential objection to this line of reasoning is that the risk of worse-than-expected damages is so severe that it justifies almost any cost. In this light, we would be buying an insurance policy (metaphorically speaking) by implementing the kind of emissions mitigation programs that Gore and others advocate. Gore, however, doesn’t get to this argument, because he has relied on listing bad things that he says will happen as a result of climate change, and claiming that it’s therefore obvious what we should do. But for the reasons I’ve tried to explain, I think this misses the most important aspect of the issue, which is the depth of our uncertainty.
Paul Krugman, in a very useful article in the New York Times Magazine in April, does put forward the problem of uncertainty in this context, and makes the argument directly and succinctly:
You might think that this uncertainty weakens the case for action, but it actually strengthens it. As Harvard’s Martin Weitzman has argued in several influential papers, if there is a significant chance of utter catastrophe, that chance—rather than what is most likely to happen—should dominate cost-benefit calculations. And utter catastrophe does look like a realistic possibility, even if it is not the most likely outcome.
Weitzman argues—and I agree—that this risk of catastrophe, rather than the details of cost-benefit calculations, makes the most powerful case for strong climate policy. Current projections of global warming in the absence of action are just too close to the kinds of numbers associated with doomsday scenarios. It would be irresponsible—it’s tempting to say criminally irresponsible—not to step back from what could all too easily turn out to be the edge of a cliff.
Krugman is correct, in my view, that: (i) a simple comparison of expected costs to expected benefits over the next century is an inadequate consideration of the economic trade-offs involved, (ii) uncertainty is central to the real decision logic, and (iii) increasing uncertainty in our forecasts strengthens the case for action.
The starting point for such a consideration is to recognize that we are not certain how much CO2 humanity will emit, how much warming a given amount of CO2 will cause, or how much damage a given amount of warming will cause. It is rational to reflect this lack of certainty by handicapping the possible levels of change and damage. Climate and economics modelers aren’t idiots, so it’s not like this hasn’t occurred to them. Competent analysts don’t assume only the most likely case, but build probability distributions for levels of warming and associated economic impacts (e.g., there is a 5 percent chance of 4.5 °C warming, a 10 percent chance of 4.0 °C warming, and so on). The economic calculations that comprise, for example, the analysis by William Nordhaus that I referenced earlier are executed in just this manner. So, the possibility of “worse than expected” impacts really means, more precisely, “worse than our current estimated probability distribution.” That is, we are concerned here with the inherently unquantifiable possibility that our probability distribution itself is wrong.
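To make concrete what “building a probability distribution” means here, consider a toy sketch of an expected-damage calculation. The probabilities and damage figures below are hypothetical placeholders chosen for illustration, not numbers from the IPCC or from Nordhaus:

```python
# Toy expected-damage calculation over a probability distribution of outcomes.
# Each pair is (probability, damage as % of GDP) -- hypothetical placeholders.
distribution = [
    (0.05, 8.0),   # small chance of severe warming and damage
    (0.10, 5.0),
    (0.35, 3.0),
    (0.35, 2.0),
    (0.15, 1.0),   # mild outcome
]
# Sanity check: the probabilities must sum to one.
assert abs(sum(p for p, _ in distribution) - 1.0) < 1e-9
expected_damage = sum(p * d for p, d in distribution)
print(f"expected damage: {expected_damage:.2f}% of GDP")
```

The point of the sketch is that the “bad tail” outcomes are already inside this average. The uncertainty argument only bites when the distribution itself is wrong.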
The stronger form of the argument based upon uncertainty is not only that it is possible that the true probability distribution of potential levels of warming is actually much worse than believed by the IPCC, but that a reasonable observer should accept it as likely that this is the case. As Krugman indicates, the sophisticated version of this argument has been presented by Weitzman. Weitzman’s reasoning on this topic is subtle and technically ingenious. In my view, it is the strongest existing argument for a global regime of emission mitigation. (You can see a slightly earlier version of his paper, and my lengthy response here, along with links to the underlying source documents.) In very short form, Weitzman’s central claim is that the probability distribution of potential losses from global warming is “fat-tailed,” or includes high enough odds of very large amounts of warming (20°C or more) to justify taking expensive action now to avoid these low probability/high severity risks.
The big problem with his argument, of course, is that the IPCC has already developed probability distributions for potential warming that include no measurable probability for warming anywhere near this level for any considered scenario. That is, the best available estimates for these probability distributions are not fat-tailed in the sense that Weitzman means it. Therefore, Weitzman is forced to do his own armchair climate science, and argue (as he does explicitly in his paper) that he has developed a superior probability distribution for expected levels of warming than the ones the world climate-modeling community has developed and published. And his proposed alternative probability distribution is radically more aggressive than anything you will find in any IPCC Assessment Report—Weitzman argues, in effect, that there is a one percent chance of temperature increase greater than 20°C over the next century, while even the scale on the charts that display the relevant IPCC probability distributions only goes up to 8°C (Figure SPM.6). It is not credible to accept Weitzman’s armchair climate science in place of the IPCC’s.
The only real argument for rapid, aggressive emissions abatement, then, boils down to the weaker form of the uncertainty argument: that you can’t prove a negative. The problem with using this rationale to justify large economic costs can be illustrated by trying to find a non-arbitrary stopping condition for emissions limitations. Any level of emissions imposes some risk. Unless you advocate confiscating all cars and shutting down every coal-fired power plant on earth literally tomorrow morning, you are accepting some danger of catastrophic warming. You must make some decision about what level of risk is acceptable versus the costs of avoiding this risk. Once we leave the world of odds and handicapping and enter the world of the Precautionary Principle—the Pascal’s Wager-like argument that the downside risks of climate change are so severe that we should bear almost any cost to avoid this risk, no matter how small—there is really no principled stopping point derivable from our understanding of this threat.
Think about this quantitatively for a moment. Suspend disbelief about the real-world politics, and assume that we could have a perfectly implemented global carbon tax. If we introduced a tax high enough to keep atmospheric carbon concentration to no more than 420 ppm (assuming we could get the whole world to go along), we would expect, using the Nordhaus analysis as a reference point, to spend about $14 trillion more than the benefits that we would achieve in the expected case. To put that in context, that is on the order of the annual GDP of the United States of America. That’s a heck of an insurance premium for an event so low-probability that it is literally outside of a probability distribution. Gore has a more aggressive proposal that, if implemented through an optimal carbon tax (again, assuming we could get the whole world to go along), would cost more like $20 trillion in excess of benefits in the expected case. Of course, this wouldn’t eliminate all uncertainty, and I can find credentialed scientists who say we need to reduce emissions even faster. Without the recognition that the costs we would pay to avoid this risk have some value, we would be chasing an endlessly receding horizon of zero risk.
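The scale of that premium is easy to line up against U.S. output. In this sketch, the two trillion-dollar figures come from the text, while the GDP number is a rough round approximation for scale, not a precise statistic:

```python
# Figures cited in the text; the GDP number is a round approximation for scale.
excess_cost_420ppm = 14e12   # perfectly implemented global tax capping CO2 at 420 ppm
excess_cost_gore = 20e12     # Gore's more aggressive proposal, same framework
us_gdp_approx = 14e12        # roughly one year of total US output, circa 2010

print(excess_cost_420ppm / us_gdp_approx)  # about one full year of US GDP
print(excess_cost_gore / us_gdp_approx)    # nearly a year and a half
```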
So then, how should we confront this lack of certainty in our decision logic? At some intuitive level, it is clear that rational doubt about our probability distribution of forecasts for climate change over a century should be greater than our doubt about our forecast that we will get very close to 500 heads if we flip a fair quarter 1,000 times. This is true uncertainty, rather than mere risk, and ought to be incorporated into our decision somehow. But if we can’t translate this doubt into an alternative probability distribution that we should accept as our best available estimate, and if we can’t simply accept “whatever it takes” as a rational decision logic for determining emissions limits, then how can we use this intuition to weigh the uncertainty-based fears of climate change damage rationally? The only way I can think of is to attempt to find other risks that we believe present potential unquantifiable dangers that are of intuitively comparable realism and severity to that of outside-of-distribution climate change, and compare our economic expenditure against each.
Unfortunately for humanity, we face many dimly-understood dangers. Weitzman explicitly considers an asteroid impact and bioengineering technology gone haywire. It is straightforward to identify others. A regional nuclear war in central Asia kicking off massive global climate change (in addition to its horrific direct effects), a global pandemic triggered by a modified version of the HIV or Avian Flu virus, or a rogue state weaponizing genetic-engineering technology are all other obvious examples. Any of these could kill hundreds of millions to billions of people.
Consider the comparison of a few of these dangers to that of outside-of-distribution climate change dangers. The consensus scientific estimate is that there is a 1-in-10,000 chance of an asteroid large enough to kill a large fraction of the world’s population impacting the earth in the next 100 years. That is, we face a 0.01% chance of sudden death of a good chunk of people in the world, likely followed by massive climate change on the scale of that which killed off the non-avian dinosaurs. Or consider that Weitzman argues that we can distinguish between unquantifiable extreme climate change risk and unquantifiable dangers from runaway genetic crop modification because “there exists at least some inkling of a prior argument making it fundamentally implausible that Frankenfood artificially selected for traits that humans found desirable will compete with or genetically alter the wild types that nature has selected via Darwinian survival of the fittest.” That does not seem exactly definitive. What is the realism of a limited nuclear war over the next century—with plausible scenarios ranging from Pakistan losing control of its nuclear arsenal and inducing a limited nuclear exchange with India, to a war between a nuclearized Iran and Israel?
The U.S. government currently spends about four million dollars per year on asteroid detection (in spite of an estimate that one billion dollars per year spent on detection plus interdiction would be sufficient to reduce the probability of impact by 90 percent). We continue to exploit genetic engineering to improve crop yields because, much like avoiding burning fossil fuels, the human costs of stopping this would be immediate and substantial. We detonated the atomic bomb at Trinity, and fired up the Large Hadron Collider, because the anticipated benefits of both were significant. We are not willing to engage in an unlimited level of military action to prevent nuclear proliferation, despite the risks proliferation creates, since we must weigh risk against risk.
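The disproportion in the asteroid case can be made explicit with the numbers cited above (a crude sketch; both dollar figures and the impact probability are the ones quoted in the text):

```python
# Annualized view of the asteroid figures cited above.
p_impact_per_century = 1e-4      # cited consensus: 1-in-10,000 over 100 years
current_detection_spend = 4e6    # $/year the US currently spends on detection
full_program_cost = 1e9          # $/year estimated to cut impact probability ~90%

print(full_program_cost / current_detection_spend)  # the full program is 250x current spending
print(0.9 * p_impact_per_century)                   # catastrophic probability removed per century
```

Here is a quantified, catastrophic, partially avertable risk, and we fund a tiny fraction of the program that would avert most of it.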
In the face of massive uncertainty, hedging your bets and keeping your options open is almost always the right strategy. Money and technology are our raw materials for options. A healthy society is constantly scanning the horizon for threats and developing contingency plans to meet them, but the loss of economic and technological development that would be required to eliminate all theorized climate change risk (or all risk from genetic technologies or, for that matter, all risk from killer asteroids) would cripple our ability to deal with virtually every other foreseeable and unforeseeable risk, not to mention our ability to lead productive and interesting lives in the meantime.
So what should we do about the real danger of global warming? In my view, we should be funding investments in technology that would provide us with response options in the event that we are currently radically underestimating the impacts of global warming. In the event that we discover at some point decades in the future that warming is far worse than currently anticipated, which would you rather have at that point: the marginal reduction in emissions that would have resulted up to that point from any realistic global mitigation program, or having available the product of a decades-long technology project to develop tools to ameliorate the problem as we then understand it?
The best course of action with regard to this specific problem is rationally debatable, but at the level of strategy, we can be confident that humanity will face many difficulties in the upcoming century, as it has in every century. We just don’t know which ones they will be. This implies that the correct grand strategy for meeting them is to maximize total technical capabilities in the context of a market-oriented economy that can integrate highly unstructured information, and, most important, to maintain a democratic political culture that can face facts and respond to threats as they develop.
The ironies in all of this abound.
Al Gore presents himself in this article as a bringer of incontrovertible scientific certainty that we must heed in order to save ourselves. He puts forward as an obvious implication of these scientific findings that we must radically reduce the use of fossil fuels right now, or face an inevitable calamity. He claims that all that stands in the way of this happening is nefarious oil companies manipulating a gullible American electorate into opposing a course of action that is clearly in the public interest.
But Gore’s oversimplifications obscure more than they reveal. In fact, it is the uncertainties in our understanding that are the most compelling driver of rational action. And a massive carbon tax or a cap-and-trade rationing system would likely cost more than the damages it would prevent. Either would be an impractical, panicky reaction that would be both more expensive and less effective than targeted technology development in the event that we ever have to confront the actual danger: the very small but real chance of much worse than expected damages from greenhouse gases.