Christina Larson is a contributing editor of Washington Monthly.

To adapt or not to adapt? It used to be an impolite question to ask. Until recently, discussing the notion of "adaptation" to global warming was anathema to most mainstream environmentalists. Folks like Al Gore largely banished the thought, not wanting to distract from efforts to curb greenhouse-gas emissions. Only unbelievers, it was assumed, flirted with appeasing the reality of a warming earth. (In 1992’s Earth in the Balance, Gore worried that talk of adaptation might be misconstrued as a “soothing message of reassurance,” perhaps reinforcing the “terrible moral consequences ... of delay.”)

Recently, though, the notion of helping those on the front lines of climate change—often poor folks living at the margins, near receding coastlines or expanding deserts—has been gaining steam. Some amount of future warming, scientists conclude, and with it rising sea levels and longer stretches of sand dunes, now seems inevitable even if we zeroed out emissions tomorrow, thanks to lags in the climate system. In 2007, the Rockefeller Foundation established a $70 million fund for "climate change resilience," to help developing countries cope with any future warming. Last winter, the concept of a national “adaptation fund,” to be administered by the EPA, made the rounds on Capitol Hill. In November, Governor Arnold Schwarzenegger issued an executive order for California state agencies to develop a “plan for sea level rise and climate impacts.”

Today’s advocates for adaptation (as a complement to other climate priorities) include an interesting mix of international NGOs, political pragmatists, and insurance companies concerned that their investments not be flooded—or scorched. But all is not kumbaya. Last month, during climate negotiations at Poznan, the U.N.’s climate adaptation fund drew far more attention, more kudos, and also more hand-wringing over how it would actually be financed than when it was quietly created a year earlier at Bali. The conference also highlighted other concerns about the fund: How should it be managed? Should allocations be determined by rich donor nations, or poor recipient ones? (One twist is that most carbon-dioxide “mitigation” efforts steer attention and funds primarily toward heavy polluters in developed nations, while adaptation campaigns focus more on developing nations and vulnerable communities.)

One impassioned plea for a greater spotlight on adaptation ran last month as an op-ed in India’s Business Standard, authored by a member of the prime minister’s advisory council for climate change: “So it is time to move on to the next series of questions: What impact is caused, who will suffer, and who should pay for this? How do we share the burden of suffering, reduce vulnerabilities and innovate on ways to deal with adaptation … The calls for climate justice are getting louder at each meeting. But climate justice is not just about equal access to global environmental space, it is also about compensating the victims.”

“Compensating the victims” is a far cry from how adaptation used to be construed. Michael MacCracken, a prominent climate scientist now at the Climate Institute in Washington, D.C., was executive director of the U.S. Global Change Research Program from 1997 to 2001. At that time, he remembers, researchers most often discussed impacts and adaptation as a way to “scare people into mitigation.” He explains: “Early on, people were trying to focus on mitigation as if that was the whole problem. Scientists thought, all you have to do is show people that CO2 is going up, and they’ll understand that’s a problem and do something about it.” Of course, it hasn’t been that simple. The upshot is that during the 1990s and early 2000s “there wasn’t a lot of research on figuring out how to adapt.”

That has changed. Last fall, Brookings hosted a roundtable to highlight current adaptation research and funding questions, with Gore and Madeleine Albright dropping by. One grim realization: some things we can adapt to (changes in agricultural growing seasons); some things we can't (some coastal territory will inevitably be lost, and likely some species).

Many think the discussion is headed next toward something like selective adaptation management: scientists and policymakers finding ways to measure which battles can be won, and which are most worth winning. Worth, though, is tricky to assign. Not only because it’s a riddle to weigh islands against igloos in different corners of the world—think of all the moral questions involved in deciding to focus greater international efforts on some places, and not others—but because economists have to devise models to compare bills expected to come due at different points in time. (Matthew Yglesias has a nice, concise explanation of how discount rates are used to adjust the “costs” of implementing adaptation measures at various future moments.) One sticking point is that different models yield varying results depending on how relative value is assigned to today versus tomorrow.
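The discount-rate puzzle can be sketched with a toy calculation. The function and dollar figures below are hypothetical, chosen only for illustration and not drawn from any cited model; the point is that the same pair of future bills can rank in either order depending on the rate chosen.

```python
# Toy illustration (hypothetical numbers) of how the choice of discount
# rate changes which adaptation bill looks larger in today's dollars.

def present_value(cost, years_from_now, rate):
    """Discount a cost due in the future back to today's dollars."""
    return cost / (1 + rate) ** years_from_now

# Two hypothetical bills: $100 for a sea wall due in 10 years,
# versus $300 of coastal damage expected in 50 years.
# At a low (1%) discount rate, the distant bill still dominates today:
assert present_value(300, 50, 0.01) > present_value(100, 10, 0.01)
# At a higher (5%) rate, discounting shrinks it below the nearer bill:
assert present_value(300, 50, 0.05) < present_value(100, 10, 0.05)
```

Nothing about the climate itself differs between the two runs; only the weight assigned to the future does, which is why model results diverge so sharply.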

The challenge may be even more fundamental. Models generally posit that at some future moment most problems can be fixed; it’s just that the costs may be greater or lesser, depending on when the bills are paid. But what if some items turn out to be off the shelves entirely by the time you decide to allocate money—a limited-edition community, coastline, culture, or species? Most cost-benefit models, says MacCracken, “assume there will be something to discount from that can be brought back in the future. But what if there isn’t?” At times our options may be stark: build the sea wall soon, or forget it. If some things will inevitably be written off, how will we choose?

--Christina Larson