
The Usefulness of Cranks

Nature as a standpoint for social criticism.

Paradise Found: Nature in America at the Time of Discovery

By Steve Nicholls

(University of Chicago Press, 524 pp., $30)

American Earth: Environmental Writing Since Thoreau

Edited by Bill McKibben

(Library of America, 1,047 pp., $40)

Defending the Master Race: Conservation, Eugenics, and the Legacy of Madison Grant

By Jonathan Peter Spiro

(University of Vermont Press, 462 pp., $39.95)

A Passion for Nature: The Life of John Muir

By Donald Worster

(Oxford University Press, 535 pp., $34.99)

A Reenchanted World: The Quest for a New Kinship with Nature

By James William Gibson

(Metropolitan Books, 306 pp., $27)

Eco Barons: The Dreamers, Schemers, and Millionaires Who Are Saving Our Planet

By Edward Humes

(Ecco Books, 367 pp., $25.99)

I.

In contemporary public discourse, concern for "the environment" is a mile wide and an inch deep. Even free-market fundamentalists strain to display their ecological credentials, while corporations that sell fossil fuels genuflect at the altar of sustainability. Everyone has discovered how nice it is to be green. Will popular sentiment translate into public policy? There is reason to be skeptical.

After all, we have been here before. The pragmatic, ethical, and aesthetic arguments for conservation are roughly the same as they were in the 1970s--the only difference being that they have acquired even more urgency in the face of depleted oil reserves, fished-out oceans, and "climate change," the current euphemism for global warming. Yet contemporary politicians and pundits treat green concerns as if they were fresh discoveries. Their amnesia is an understandable response to recent history. For the last thirty years--despite the absorption of environmentalist slogans and sentiments into our popular culture, the frequent legal skirmishes on behalf of endangered species, and the spread of serious ecological thought into many academic disciplines--broad environmental concerns all but disappeared from mainstream political debate.

Noble green intentions left little impact on everyday life. Quite the contrary: for most Americans it was as if the 1970s--the decade of the "energy crisis," Small Is Beautiful, and presidential commitments to solar energy--never happened. Who could be bothered with worry about waste amid acres of wired McMansions and herds of lumbering SUVs?

It will take historians many years to sort through the political, economic, and cultural wreckage left by Ronald Reagan and his ideological heirs. But the disappearance of ecological issues from the national agenda was an essential part of the devastation. Environmentalism was one of Reagan’s targets from the beginning. During the campaign, he and his handlers shrewdly exploited Jimmy Carter’s "malaise" speech of July 1979. Carter never used the offending word, though he did refer to an American "crisis of confidence," arising from the discovery that "owning things and consuming things does not satisfy our longing for meaning." Carter’s big mistake was to question this accumulationist ethos in arguing for conservation as a "moral equivalent of war" and committing the government to long-term research into alternative energy sources. It was an extraordinarily prescient speech, one that acknowledged the limits to economic growth and anticipated nearly all the environmental themes that have only recently returned to fashion.

But it was a political disaster for Carter. Polls indicated that popular reaction to the speech was generally favorable, but then the chattering classes weighed in.
By 1979 Carter’s media stock had bottomed out. Pundit after pundit took Carter to task for having the temerity to blame the American people for their wasteful ways. Reagan, meanwhile, was prepared to argue that any talk about limits was un-American. This was the country where the sky was the limit. "America is back," he announced after his election. He lost no time in removing the solar panels from the White House roof. Three decades of denial were under way.

The story of Carter’s speech is a cautionary tale for environmentalists. It suggests the ease with which environmentalists could be identified as puritanical moralists, dour pessimists, enemies of fun and the future. Carter’s public persona reinforced this connection--his sober homiletical tone, his sloping shoulders, his overall limpness. Reagan tilted his head with practiced spontaneity, smiled his lemon-twist smile, and dispensed upbeat aphorisms as if they were freshly minted. His shoulders were padded and his posture was perfect. He was superbly suited to exorcise the demons of doubt, even when doubt had a strong foundation in reality.

And Reagan was not the only villain of this tale. The denial of environmental concerns was part of a broad cultural shift that also swept up the postmodern left. During the 1980s and 1990s, leftists were as likely as rightists to scold environmentalists for their allegedly puritanical preoccupation with limits--as Julian Simon did (from the right) in The Ultimate Resource in 1981 and Andrew Ross did (from the left) in Strange Weather in 1991. Ultimately this critique pushed beyond ethics to epistemology. At its most inane, the postmodern project challenged the very notion that something called "nature" existed apart from human constructions of it.
By the late 1980s, no self-respecting professor in the humanities would use the word "nature," or even the word "reality," without inverted commas. The literary theorist Fredric Jameson revealed the social origins of this style when he announced that "postmodernism is what you have when the modernization process is complete and nature is gone for good"--a view that could be held only by an upper-class professional who spent most of his waking hours insulated from the natural world.

The bizarre notion that "nature is gone for good" had unintended political consequences. It neatly complemented the Republican right’s equation of environmentalism with sentimental nostalgia. The postmodern Left and the reactionary Right deployed different idioms in a common celebration of the untrammeled individual. There was no room at this party for environmentalist killjoys.

But now, as we are constantly reminded, the party’s over. The collapse of credit markets has produced a lot of loose talk about a return to fundamental values, to scrimping and saving and living within our means. But how these ideas and emotions will affect environmental policy or everyday practice remains to be seen. Decades of doctrinaire optimism, uniting everyone from Marxian social critics to development economists to free-market fundamentalists, have undermined the notion that there are natural limits to economic growth. Technological innovation, we have constantly been told, has rendered Malthusian scenarios obsolete.

But these assertions are based on flawed assumptions. Whatever its potential, green technology will take years, perhaps decades, to implement. Meanwhile the urgent need to restore high levels of economic growth by any means necessary--including renewed production of greenhouse gases and consumption of fossil fuels--will trump environmental protection. And the preoccupation with short-term priorities may well promote long-term disaster. A consensus of statistical estimates projects that global population could rise as much as 50 percent over the next forty years. This will probably intensify international competition for scarce resources as the world gets hotter. The techno-utopianism of the last thirty years will prove to have been a flash in the pan, and Malthus may yet have the last word: we shall rediscover that there are natural limits to growth after all.

This need to recognize limits and loss is what makes serious thinking about ecology so difficult for Americans. With respect to our secular religion of progress, ecological thought is deeply sacrilegious. By insisting on the interdependence of the human and non-human worlds, it challenges the sacred tenets of commodity civilization--the drive toward mastery of nature, the obsession with economic productivity, the deep utilitarianism that dismisses spiritual and aesthetic concerns as mere decoration and defines human advance as more things for more people. Consider how often "housing starts" are listed as an index of economic health, and how directly this reveals the American version of creative destruction: not renovating housing stock but making a mess and moving on, gobbling up land as we go.

The problem, from the ecological perspective, is the commitment to the utilitarian definition of progress that lies at the core of both capitalism and socialism--at the heart of modernity itself. To avoid sinking into a porridge of bland pieties, therefore, ecological thinking needs to preserve its anti-modern edge. This will make it a harder sell. Anti-modernism, which has lots of meanings, annoys lots of people. Writing in these pages not too long ago, Ted Nordhaus and Michael Shellenberger railed against "the revulsion at modern life" that they believe is crippling the environmental movement. In their view, anti-modernism is an effete affectation, cultivated by first-world elites who can afford to reject "the modern project of expanding prosperity" because they are already its beneficiaries. From this perspective, only those who have willingly traded their office jobs for "hard agricultural labor" have the right to question the beneficence of modernity.

The argument is bizarre, but it is not unprecedented. Apologists for economic growth have often dismissed its critics as nostalgic elitists unable to stomach the people’s taste for shopping malls and suburban sprawl. By assuming, quite falsely, that these developments simply express consumer desire, real estate developers and corporate technocrats can pass themselves off as advocates for ordinary folk. Nordhaus and Shellenberger embody the quintessentially American desire to equate a particular construction of modernity with the good of democracy, and to celebrate their union in this best of all possible nations.

But this modernizing impulse has always co-existed with another American urge, an indigenous American wisdom. The nation on the cutting edge of modernity has also been "nature’s nation," as Perry Miller called it decades ago--an Eden of astonishing fecundity, a land of fresh starts and fair hopes. Edenic visions were rooted in empirical observation. Historians have compiled a rich dossier of testimony from the early European explorers of North America, who were astounded by the unbounded natural plenitude of this New World and convinced that they had discovered the earthly paradise. The problem was that this new Eden posed as many temptations as the old one had done.

II.

In Paradise Found, Steve Nicholls has returned to this trove of evidence from a different perspective. An entomologist turned documentary film-maker, he wants to tell a cautionary tale about the squandering of untold resources, as well as to suggest some possibilities for replenishing them. He has produced a lumbering bear of a book, lurching from sixteenth-century narratives of discovery to contemporary reflections on wildlife management. It is marred by many grammatical errors, some of them egregious and risible. Here, for example, is Nicholls’s description of a colonial foray into wildlife management: "the sturgeon caught had to be reported to the governor of the colony, or face the loss of both ears." Despite such jarring moments, though, the bumpy ride is worth the trouble. Nicholls powerfully evokes a pristine world that we have lost.

It was a world where yard-long "tortoyses" clambered about Roanoke Island, where great flocks of passenger pigeons blotted out the sky over the Susquehanna, and where huge reefs of oysters filtered the water of Chesapeake Bay, keeping it clear to depths of twenty feet or more. Repeatedly staging the primal scene of tired Europeans discovering "the sheer abundance of life" on a vast continent, Nicholls often asks questions such as this one: "What must it have been like to stand on the shore of the Gulf of St. Lawrence and watch such great flotillas of whales passing so close you could almost touch them?"

Of course this New World was inhabited by humans as well. Indian people move in and out of Nicholls’s narrative. Refusing to sentimentalize their noble savagery, he acknowledges that their relationship to the natural world was complex. They sometimes yielded to the temptations of plenitude, over-fishing and over-hunting certain species at certain times and places. They competed with deer and bear for food, and killed them as they had need of them. They practiced a semi-nomadic agriculture, burning underbrush to clear paths for hunters and creating glades of lush grass with "edge habitats" where beavers and rabbits and other small game flourished. Their park-like forests contained so many fruit and nut trees that the entire eastern woodland region may well have been one big Indian orchard. As Nicholls observes, citing the New England renegade Thomas Morton, the Indians "didn’t live like paupers; they only seemed to do so to English eyes." In fact (though Nicholls does not say this) they resembled what the anthropologist Marshall Sahlins called "the original affluent society," in his classic book Stone Age Economics--a society that satisfied its wants by adjusting them to available resources. This was a form of ecological thinking, sanctioned by a cosmology that emphasized the interdependence of the human and the non-human worlds.

In the natural history of North America, the politically correct view turns out to be also the empirically accurate one. The serpent in this garden was the white man. Overwhelmed by the presence of plenitude, he was seduced into the belief that it was limitless. Christianity and capitalism reinforced that assumption. Throughout the eighteenth century, English ships routinely stopped at Newfoundland to herd great auks on board, where hungry sailors clubbed the hapless birds to death by the hundreds. As one seaman said, it was "as if God had made the innocency of so poor a creature to become such an admirable instrument for the sustenation of man." Admirable, that is, until it was driven to extinction in the early 1800s.

Inhabiting a human-centered cosmos, Europeans could not grasp that the "sustenation of man" might also require the sustenation of animals. Indians had killed their fellow creatures, while they had also recognized their mutual dependence. But with the establishment of the fur trade, animals became commodities: Indians and whites alike piled up beaver pelts to satisfy the demands of distant consumers for the latest fashion in hats. Well before the American Revolution, the globalization of capital was already under way, with catastrophic consequences for the New World Eden.

Heedless waste became normal practice, in whaling as in other extractive industries. During the early nineteenth century, whalers began to deploy a technique they called "blasting," which involved harpooning humpback whales indiscriminately, letting their bloated bodies eventually float to the surface, and then recovering about a third of them. The other two-thirds were left to rot. Such methods took a toll: the apparently limitless supplies of whales and oysters, sturgeon and cod, gradually revealed their limits. By the mid-1800s, some animal species--great auks and passenger pigeons among them--had disappeared altogether. It was about this time that American nature writing began to take an anti-modern turn, led by Thoreau.

Previous naturalists, from William Bartram to Thomas Jefferson, may have distrusted cities but rarely drew a bead on modernity itself. They were men of the Enlightenment, believers in progress. Thoreau was not. He questioned the fundamental American faith, and became the first of many environmental writers to risk being labeled a tree-hugging wacko. His intellectual progeny include John Muir, Aldo Leopold, Gary Snyder, Wendell Berry, Rachel Carson, Edward Abbey, and Robinson Jeffers, among others--outsiders who have challenged utilitarian definitions of well-being and indeed the very notion of a human-centered cosmos. This is the critique, the philosophical standpoint, at the core of ecological thought; and part of its strength derives from its crankiness, its refusal to compromise with the common sense of the larger society. Indeed, "crank" is a convenient label for any dissenter from received wisdom. This may be why attempts to move environmentalism into the charmed circle of responsible opinion so often slide into technocratic moralism--an idiom that may be powerful in making policy but is anemic in inspiring commitment. Respectability is a crucial component of political impact, but it comes at a price.

The case of Bill McKibben offers an interesting example. Few writers could be more decent or sensible, most of the time. It is difficult to object to McKibben’s arguments for conservation and against consumption, for less watching of television and more walking in the woods. He rightly insists on the reality of limits, in a way that the celebrants of endless growth describe as blinkered piety. This is usually a groundless dismissal, but sometimes the blinkers are all too apparent. They distort McKibben’s view of recent history, aligning him with conventional wisdom and alienating him from potential allies.

Now McKibben has edited American Earth: Environmental Writing Since Thoreau, for the Library of America. It is a rich and rewarding collection--once one gets past McKibben’s introduction, which reveals some of the symptomatic weaknesses in his ahistorical sentiments. Though he includes Thoreau, Muir, and other classic figures, he ignores such eighteenth-century naturalists as Bartram and Jefferson, and admits that he has weighted the anthology toward writing from "the years since 1980--the years since Ronald Reagan took office and the long-standing bipartisan commitment to environmental reform collapsed, the years when the systemic nature of our trouble came more sharply into focus. And the years when a kind of hyperindividualism became not just one strand in the American psyche, but pretty much the sole ideology of a continent." So far, this is reasonably accurate. The trouble starts when McKibben traces the origin of hyper-individualism to "the great carnival of individual liberation that we know as hippiedom." Forget Reagan, James Watt, Milton Friedman, and the other capitalist fundamentalists dismantling environmental policy in the name of personal liberty. The real culprits were the hippies.

This is a grotesque distortion of our recent past, and an odd slander of the counterculture. In fact, the deepest countercultural impulse was rooted in reaction against corporate-sponsored fantasies of limitless accumulation. "Hippiedom"--in urban as well as agrarian settings--was founded on the notion that freedom from the hamster cage of earning and spending required resistance to the accumulationist ethos. To be sure, there was no shortage of hypocritical posturing--many a rock star renounced possessions while he acquired prime real estate and piled up profits. But our collective journalistic memory, by focusing on hip poseurs, has trivialized the significance of certain countercultural values for the legions of non-celebrities who took limits seriously. From the late 1960s through the 1970s, the desire to avoid bourgeois entrapment inspired countless experiments in making do with less, in fashioning ingenious ways to minimize consumption of scarce resources--the sort of ingenious experiments that filled The Whole Earth Catalog. The recognition of limits, for the best (which was not always the majority) of the counterculture, was not a regime of renunciation but a portal of possibility. All of these countercultural tendencies resonated with the anti-modern core of environmental thought.

McKibben’s ignorance of recent cultural history allies him with other caricaturists of "the Sixties," many of whom have little use for his environmental politics.
Born in 1960, he is as eager as most of his contemporaries to locate the source of current ills in the misbehavior of people a decade or so older than himself. "The general ethic of ‘do your own thing,’ of marching to a different drummer--all of which traces back in some way toward Thoreau--came at a cost," he writes. "We were liberated to be hyperindividualists, in a way few humans had ever before been. In the decades since, that liberation has been experienced mainly through consumption." Given the simplemindedness of this capsule cultural history, and its utter failure to address larger questions of political economy, it comes as no surprise when McKibben asserts that "the ultimate vehicle of choice for the ’60s generation" was the SUV. Enough already. Haven’t we had our fill of ignorant intergenerational polemics?

Despite McKibben’s feckless introduction, he has assembled a sterling group of environmental thinkers. Many of them did their best work or reached their largest audiences during the era of allegedly hyper-individualist hippiedom, though McKibben does not acknowledge this inconvenient truth. The popularizing scientists are here--Rachel Carson, E. O. Wilson, Loren Eiseley, with their compelling orchestration of particulars; and so are the lyricists of the quotidian--Annie Dillard, Barry Lopez. Scattered throughout the volume are examples of investigative journalism, recalling the variety and vitality of that disappearing genre. The most memorable is Berton Roueche’s New Yorker account of a toxic fog that killed twenty people in Donora, Pennsylvania in 1948--a reminder, as McKibben observes, of how lethal the industrial atmosphere could be before the Clean Air Act. We are also offered the genteel ruminations of the nature writer Edwin Way Teale, who for all his avoidance of controversy could nevertheless be clear: "The difference between utility and utility plus beauty is the difference between telephone wires and the spider’s web." Teale’s critique of mere instrumentalism surfaces elsewhere in the volume--in the loping absurdity of Gary Snyder’s "Smokey the Bear Sutra," the understated clarity of Wendell Berry’s "The Making of a Marginal Farm," and the deft ironies of Edward Abbey’s assault on "industrial tourism" in the wilderness of southern Utah. Robinson Jeffers, alas, gets only two short poems. The worldview of his poetry, what he called his "inhumanism," deserves more, for its refusal of modernist self-absorption, its rejection of a utilitarian creed, and its alternative to a human-centered cosmos. Jeffers’s oeuvre
is a powerful poetic expression of an ecological ethic: a recognition of the interdependence of all life leads to a deep respect for the moral claims of the non-human world.

III.

To attack humanism in the name of a non-human morality is to place oneself outside the liberal tradition, on the side of nature, in a position that seems lyrical and even mystical, and consequently to risk appearing a crank, or worse. This has long been a problem for ecological thinkers. Liberal humanists, who are rightly vigilant about other, more malign forms of "inhumanism" and inhumanity, have always felt a little skittish around ecological critiques of modernity. Sometimes they seem to see environmentalism as the first step on the slippery slope to soil worship. They can cite disturbing examples on both sides of the Atlantic. Hitler, they remember, was a fervent nature lover. National Socialism celebrated rural rootedness, and Jews (like intellectuals generally) were attacked as "rootless cosmopolitans." Similar racist themes animated conservationists
in early twentieth-century America, as Jonathan Spiro’s biography of Madison Grant reveals. The history of environmentalism is not entirely edifying.

Grant’s ruling-class credentials were in order. Born to a prominent New York family in 1865, a Yale man and non-practicing attorney who lived off his investments, Grant was a founder of the American Museum of Natural History and a member of the Boone and Crockett Club (an elite group of sport hunters who included Theodore Roosevelt, Henry Cabot Lodge, George Eastman, and other notables). A savior of the redwoods, Yosemite Valley, and other natural treasures, Grant deserves to be ranked alongside Roosevelt, Muir, and Gifford Pinchot, Roosevelt’s chief of the U.S. Forest Service, as one of the key figures in the creation of an American conservation movement.

Grant was also a fervent Anglo-Saxon supremacist who popularized the scientific racism of his time in The Passing of the Great Race (1916)--a compendium of conventional wisdom that sought to sound the alarm over swarming immigrants, rallying its WASP readers to greater fecundity. The book was widely praised (except in The New Republic, where the anthropologist Franz Boas pointed out the flimsiness of the category "race" and observed that Grant’s maps were "entirely fanciful in their details"). Roosevelt sent Grant an admiring letter about the book, and F. Scott Fitzgerald put Grant’s ideas in the mouth of Tom Buchanan in The Great Gatsby. Buchanan was just the sort of truculent, privileged airhead who would have been eager to display his intellect by citing middlebrow race science. The Passing of the Great Race captured the racial hysteria that bubbled barely beneath the surface of American popular culture in the 1910s and 1920s. And it made Madison Grant more famous for his race theory than for his conservation efforts.

Spiro never tires of exclaiming at the incongruity of the two roles, but maybe they were not as incongruous as he thinks.
He describes Grant as an aesthetic preservationist rather than a utilitarian conservationist, but his evidence suggests a complex interchange between the two approaches, and a slippage from aesthetics to utility. In Grant’s early career the aesthetic emphasis dominated, as he shifted his attention from regulating hunting to protecting endangered species to preserving scenery. The concern with scenery was pre-ecological: Grant had no sense of the need to preserve bio-diversity as a basis for a sustainable ecosystem. The language had not been invented yet. But like many of his contemporaries, he did value the sublimity of wilderness as a source of spiritual renewal, and that sentiment helped pave the way to a broader environmentalist outlook, and ultimately to an ecological ethic.

Meanwhile, as Spiro notes, Grant began to use "conservationist means to achieve preservationist ends," arguing that Glacier National Park could be a water supply as well as a spectacular landscape. Grant’s utilitarian preoccupations intensified as he began to glimpse the contradictions inherent in wildlife management. Starting out with a hunter’s desire to protect endangered game, he soon realized that when protected creatures proliferated, they threatened their own habitat. He found himself drawn to the potent phrase "survival of the fittest," which pervaded elite circles by the early twentieth century. Managing the wild, it appeared, would involve selective breeding and weeding out the weak--a kind of animal eugenics. It was only a short step, for Grant, from culling inferior elks to culling inferior humans. Breeding was all.

In spotlighting the connection between wildlife management and eugenics, Spiro has put his finger on something important. The obsession with improving breeding stock linked Grant with Hitler on the right and with other more respectable eugenicists on the left, including Margaret Sanger (who promoted birth control) and Theodore Roosevelt (who hated it). Sanger wanted "to breed a race of human thoroughbreds," while Roosevelt warned Anglo-Saxons against "race suicide." Eugenics sanctified the marriage of racism and modernity. Throughout the 1910s and 1920s, the leaders of the American Eugenics Society dressed in white and paraded their obsessions with purity, organizing Fitter Families competitions, counterposing Nordics against the menace of Jews and other immigrants, not to mention the even greater menace of Negroes. When Grant argued that the health of the body politic required restricting the flow of foreigners to our shores, Roosevelt agreed. "The national gizzard cannot masticate more," he wrote. Grant’s eugenic vision was ruling-class conventional wisdom, consistent with managing immigrants as well as managing wildlife.

Still, despite the prominence of men such as Grant in the early conservation movement, environmentalism was never organically tied to racism. In fact, Grant’s older contemporary Muir, who helped found the Sierra Club in 1892, was ultimately far more influential than Grant. Muir was neither a patrician nor a racist. He partook of the class and racial prejudices of his time--including a comparative indifference to the human inhabitants of the wilderness he wanted to save. But ultimately he was sui generis, as
Donald Worster’s biography reveals.

Few scholars could be better qualified than Worster to assess John Muir’s place in the American environmental tradition. Having grown up on the impoverished margins of agricultural life in the irrigated West, Worster began exploring the relations between humans and nature long before professional historians were paying any attention to the subject. His Dust Bowl (1979) remains the classic work on the great man-made ecological catastrophe of the 1930s. Over the last quarter of a century, he has played the leading role in creating the field of environmental history, producing a series of pathbreaking books on ecological thought and its consequences (or lack of them). Now he has turned his talents to Muir, the iconic mountain man.

Worster wants to cleanse Muir of crankiness, and to claim him for the American democratic tradition. And this argument implies a broader one--that ecological thought, far from being a reactionary assault on modernity, was actually rooted in "modern liberal democratic ideals." For Muir and other ecological thinkers, Worster argues, a concern for human rights, personal liberty, and social equality led to a respect for the otherness of nature and the interdependence of all things. Tocqueville noted an organic connection between democracy and pantheism, and Worster thinks the idea has legs. "Social deference faded in wild places," he writes. "Nature offered a home to the political maverick, the rebellious child, the outlaw or runaway slave, the soldier who refused to fight, and, by the late nineteenth century, the woman who climbed mountains to show her strength and independence." Romantics from William Wordsworth to Frederick Jackson Turner found wildness to be an alternative to familiar hierarchies.

Muir, in Worster’s account of him, was a part of this tradition: "a liberal, a democrat, and a conservationist." Rejecting his Calvinist upbringing, he embraced "a more positive, hopeful view of human nature along with a more positive view of nature." Eager to expand the liberal ideal of human brotherhood to include all sentient beings, he "arrived at moral positions more advanced than many of his contemporaries." His inclusive vision made him part of the march of Enlightenment and progress.

Yet Worster is too smart a historian to overlook the tensions between ecology and liberal democracy, or to ignore the contradictions in Muir’s own life and thought. Worster observes that alongside the popularity of pantheism Tocqueville also predicted the coming of an "industrial aristocracy" to America, "one of the hardest [aristocracies] that have appeared on the earth." Making money in America would often involve indifference to the rights of human beings, let alone bears, wolves, buffalo--and the ground squirrels that flourished in the orchards of California’s Central Valley. When Muir married the daughter of a prosperous California fruit grower and took over the management of the family holdings, he had to elide the fundamental conflict between coercing nature and co-existing with it. When he sought alliances with members of the "industrial aristocracy," he had to overlook their willingness to plunder the wild for profit while they set aside selected areas for preservation.

Muir’s effort to expand the range of liberalism and democracy was constantly confronting the limits imposed by capitalism. Worster understands those limits well. His entire career constitutes a critique of the destructive elements in the ethos of development. Yet as he tries to bring Muir--and ecological thought in general--into harmony with liberal democracy, the strain sometimes shows: Muir cannot be easily melded with the march of democratic progress. This is the only flaw in what is otherwise a superb biography.

Worster brilliantly recreates Muir and his world in all their complexity. His account of the early conservation movement reveals the myriad tensions between management and preservation--for example, the many compromises Muir had to make to enlist the likes of Edward Harriman, railroad magnate and preservationist par excellence, in the ranks of the Sierra Club. Worster understands the deep contradictions between ecological concerns and economic growth. That may be why his effort to assimilate Muir to the Enlightenment project is not entirely persuasive. From an ethical perspective, there is something profoundly appealing about making universal rights genuinely universal, about extending the reach of humanism to the non-human world. But the obstacles to that aim, as Worster’s own account makes clear, are deeply embedded in the ways Americans, including Muir, have thought about nature. The ascendance of an ecological perspective requires more than a Manhattan Project for green technology. It means an end to anthropomorphism, a seismic shift in how we imagine our relation with the non-human world. Muir’s story reveals how even the most clear-headed ecological thinker could find himself backing into compromises with capitalist development.

Muir was born in 1838 on the east coast of Scotland, in Dunbar, a grim fortress of the Calvinist Church of Scotland, where fifty-three pubs served an officially abstemious population. Adam Smith grew up across the Firth of Forth, in Kirkcaldy; and the common sense of the locals reflected Smith’s dual emphasis on inbred moral sense and individual effort. In John Muir’s recollection, his mother Ann was almost invisible and his father Daniel was omnipresent, a self-made man who became a successful grain merchant on the high street of Dunbar. Daniel was a domineering, restrictive patriarch; his son often fled his oppressive rule. But he stuck to the high street and never ventured to the port:
he was walled in even when he felt free, a dutiful son who nurtured tender feelings for wild creatures but never had any problem with slaughtering pigs.

Daniel himself was restless. By the time John was ten, his father could no longer ignore what Worster calls "the religious worm that had long been gnawing at his vitals"--the suspicion that no church could adequately represent the gospel of Christ. Deeply mistrustful of all institutions and determined to be his own clergyman, he fell in with the Secessionist followers of Alexander Campbell, who renounced clerical authority and relied on lay preachers. Pockets of Secessionists were settling in North America, where they imagined their egalitarian views would flourish. Fired with dreams of freedom, the Muirs took passage for New York in January 1849.

Daniel headed for Wisconsin, looking for an independent life on the land. Eventually he landed in the woods near Pardeeville, forty miles from Madison. John recalled the "sudden plash into pure wildness" after the gray streets of Dunbar. Appalled (he later said) by the colonists’ wasteful ways, he spent hours in the woods, acquainting himself with blue jays, hawks, woodpeckers, frogs, and snapping turtles. There was more about those creatures in his autobiography than about his two brothers and five sisters. Nature was an escape from his father’s bullying domination. "We were all made slaves," Muir wrote, "through the vice of over-industry."

As Daniel flailed about, overextending himself, buying too much land to farm well, John strayed beyond his father’s influence. No doubt thinking of Daniel’s failures, John listened receptively to a neighbor’s criticism of white settlers’ farming methods, inept by comparison with the aborigines they had dispossessed. He read Alexander von Humboldt’s journals and dreamed of travel. Yearning to be free, he also sought self-mastery, immersing himself in a basement workshop. From scraps of wood and pegs and ropes, he fashioned a variety of machines meant to discipline the body--a collection of cobbled-together alarm clocks and a bed that would dump him out at a specified hour, usually one o’clock in the morning, so he could get in a few hours of tinkering before his farm chores started.

All this makes Muir seem remarkably similar to Frederick Winslow Taylor, the obsessive-compulsive "father" of scientific management. In Worster’s view, however, Muir’s machines "often had less to do with factory production or rural mechanization than with self-improvement and self-redemption." Committed to Calvinist habits, Muir strained (if only half-consciously) toward what Worster calls "a more liberal, free-form faith and piety." His ultimate destination, if not his precise intention, was a romantic and democratic religion of nature.

For several years his path to that end shuttled between Pardeeville and the state university at Madison, where he enrolled in 1860 and was soon taken up by Ezra Carr, a professor of chemistry who gave the religion of nature a scientific gloss. "Nature is the name for an effect whose cause is God," Muir’s lecture notes recorded. Carr became Muir’s mentor and Carr’s wife Jeanne his feminine ideal. For a scared, strange country boy, they gave the religion of nature a warm domestic habitat.

Madison was a hotbed of Unionist fever, but Muir remained immune to it. The strident nationalism of his fellow students seemed forced, the preservation of the union an abstract and unconvincing excuse for killing other human beings. (To judge by Worster’s evidence, the destruction of slavery was not on the students’ agenda.) And in 1863, when the nationalist fervor faded and the volunteers proved insufficient, the federal draft law allowed inductees to purchase substitutes--making the sectional conflict officially a rich man’s war and a poor man’s fight. That slogan originated in this era and spread quickly among young men without means. Muir was one of them. Lonely and restless and reluctant to be drafted, he finally "skedaddled" to Canada in early 1864, though he was in no immediate danger of being called up.

For two years he wandered through Canada, "a man without any fixed identity or ambitions of his own," in Worster’s words, who melded aesthetics and piety in his developing attachment to the natural world. Along the Holland River north of Toronto, he came across a field of orchids blooming on a barren hillside in the early June sun. He was thrilled to tears. The scene was "so perfectly spiritual, it seemed pure enough for the throne of the Creator." Still half-attached to orthodox habits of mind, he became convinced that the biblical injunction to subdue the earth by the sweat of one’s brow was not a divine command but a consequence of human perversity. What were weeds, anyway, he asked his journal: "are not all plants beautiful? Or in some way useful? The curse must be within ourselves." In the sublimity of nature, man was a bumbling and destructive interloper.

When the war ended, Muir, like many other draft evaders, returned to the United States, with no moral disgrace or legal consequences. Feeling "utterly homeless" among a transient proletariat, he went to work as a skilled mechanic in a wagon-wheel factory in Indianapolis, where he still harbored Taylorite visions of harmonizing men and machines even while he yearned for wildness. Contradictions multiplied as he fell in with the Hoosier Yankee Samuel Merrill (who later founded the Bobbs-Merrill publishing house) and his family. At one hour in the day he sipped tea in the Merrills’ parlor, discussing Lamartine and religious liberalism; at another he wielded a wrench in the factory, dreaming of ways to organize the work more effectively.

But one day everything changed. Muir lost the vision in his right eye when he poked it with a file, and within a few hours the left eye went out too. He went home to his bare rented room, terrified, slipping into despair. For weeks he could not see. Writing from Madison, Jeanne Carr tried to comfort him, telling him that "God had given him ‘the eye within the eye, to see in natural objects the realized ideas of His mind.’ " Soon the left eye came back and the Merrills found him an oculist, who reassured him that the injured eye would heal itself in a few months. But the temporary blindness set off a psychic transformation in Muir. Jeanne’s transcendentalist urgings had helped to shape his future. When his sight returned, he abandoned the shop floor and all Taylorite dreams of streamlining it. Stopping off in Wisconsin to bid his father a bitter goodbye, he took off for South America, on foot. He carried a notebook inscribed "John Muir, Earth-planet, universe."

IV.

The long walk southward offered Muir an escape from conventional expectations, and an intimacy with the poor folk (black and white) who offered him shelter and--occasionally--a revelation. When he slept on a grave in the Bonaventure cemetery in Savannah, he came to believe that death was not a form of punishment (as in his father’s Calvinism), but a fulfillment of "harmony divine." He also contracted malaria. Nature was a killer as well as a beauty.

A long convalescence in Florida gave him time to turn decisively against anthropocentrism. He gave up hunting for sport and began to extend the language of rights to the moral claims of non-human creatures. Nature, he began to realize, was not made especially for man, and human salvation was not the central drama in the world. Rather it was "God’s slow, inscrutable unveiling of a natural world that existed before and will exist long after the human species." With his sense of his own limitations intensifying, Muir set off for Cuba, where he slipped into a listless torpor amid unrelieved humidity. Spotting an advertisement for steamers to California, he escaped to the cooler, drier place that would be the center of his being for the rest of his life.

He made his way to the foothills of the Sierra Nevada, earning his bread as a farm laborer and a shepherd, trying to leave sufficient time for long hikes and patient observation. The summer of 1869 proved to be, in Worster’s words, "a long moment of ecstasy that he would try to remember and relive to the end of his days." He was struck by a vision of cosmic harmony that recalled Wordsworth’s in the Lake District, but also prefigured the ecological perspectives of the twentieth century: "when we try to pick out anything by itself," he wrote, "we find it hitched to everything else in the universe." Here was an alternative to the rigid Christian dualism of his youth. With the summer over, he signed on as a millwright and carpenter for a hotel owner in Yosemite Valley, built a cabin nearby, offered his services as a guide to the swelling crowd of tourists, and gradually became a local character.

Meanwhile he was immersing himself in the Sierra Nevada. For Muir, as Worster writes, mountaineering became "a way of gathering knowledge about the history of the earth, a pathway to revelation and worship." He would have loved to join the California State Geological Survey, which included the geologists Josiah Whitney and Clarence King, but he lacked the academic credentials. His rambles led him to the new science of glaciology; and before long he had discovered many moving bodies of ice grinding against boulders--active glaciers in the Sierra Nevada. His fame as a source of local knowledge eventually reached Washington, and the Smithsonian Institution asked him to write up reports on his explorations.

Despite Whitney’s disdain for "campfire science," Muir pressed forward, publishing a series on mountain glaciers in the Overland Monthly that was later compressed into an article for the Annals of the American Academy. He rejected the "catastrophism" of Whitney (and the Bible), which assumed rock formations to be the outcome of discrete events, in favor of a "uniformitarian" approach which emphasized slow, incremental (or glacial) change, and required "deep time" to unfold. Muir liked the notion of human insignificance implied by deep time, even if it provoked the hostility of his father, the scriptural literalist. Embracing Darwinian evolutionary theory, he rejected the Social Darwinist obsession with ruthless struggle in nature. The cosmos, however violent, was ultimately "hitched together," harmonious.

For the next decade, Muir made a name for himself as a naturalist--perhaps the last self-taught one to acquire professional legitimacy. His authority depended on attachment to a place. For a while he could not bring himself to leave it, despite offers to study at MIT and to return to Madison. He combined mountaineering and writing for the San Francisco Evening Bulletin, where he celebrated the Sierra Nevada in conventional terms as "Nature’s rest cure." And in 1874, after visiting a salmon hatchery in northern California, he recorded what he thought was a turning point in American history: the recognition that "neither our ‘illimitable’ forests or ocean, lake, or river fisheries are now regarded as inexhaustible." Muir trudged up and down the state, debunking the popular theory that sequoia trees were dying out (and so might as well be cut down anyway), and demanding preservation of "God’s First Temples" and regulation of timber cutting. Combining aesthetic preservation and utilitarian conservation, he groped toward a pragmatic vision of trees as a renewable resource--now a cliché of lumber-industry advertising, but in the slash-and-burn 1870s a genuine breakthrough.

In June 1879, Muir became engaged to Louisa (Louie) Strentzel, the daughter of a prominent farmer. Within days after the betrothal, he was off to Alaska, where he witnessed the devastation of the indigenous Inupiat culture, explored more active glaciers, and forged a common bond of fear with a dog named Stickeen as they made their way across a treacherous crevasse. It was another key moment in Muir’s developing empathy for the non-human world.

When he returned he married Louie, and took to tending orchards with her father. Muir embraced the Darwinian perspective that the farmer and breeder should turn to nature for inspiration--that wild species were to be emulated rather than destroyed. But overall, as Worster notes, Muir was "a cautious businessman rather than an agronomic revolutionary." He poisoned squirrels, he boiled insects, he separated farming from wildness. Agriculture appealed to the manager in Muir, to the techno-visionary. He made more money than anyone else in his family ever had, shared some of it with his siblings, reconciled with his dying father, and fathered two daughters of his own. It was a pastoral interlude in a peaceable kingdom.

As he ascended in status, he began to socialize with some of the upper-class men who were starting the conservation movement--the sort of men who, back east, might join the Boone and Crockett Club. In 1889, Robert Underwood Johnson, editor of The Century, commissioned him to write a series of articles on the high country surrounding Yosemite Valley, urging the creation of a national park there. When Muir returned from a trip to Alaska, where he had been collecting botanical specimens, both Yosemite and Sequoia National Parks had been created by Congress. Preservation and conservation were working in tandem; national parks protected sublime scenery and crucial watersheds. When his father-in-law died in 1890, Muir inherited $235,000 (about $4 million in contemporary dollars) and became a certified member of the landowning elite.

The following year William Armes, a professor of English at Berkeley, proposed the idea of the Sierra Club to Muir, and Muir agreed to serve as its first
president. The bearded prophet, coming down from the mountain, found himself rubbing shoulders with the cultural elite of San Francisco, Boston, and New York, who begged for after-dinner stories of Stickeen. Muir believed that the rich could be useful allies in the fight to preserve wilderness. The railroad magnate Edward Harriman, for example, was not only a boon companion on voyages to Alaska, but also an advocate of expanding national parks.

To be sure, there were tensions among the conservation elite. Appointed by the secretary of the interior to the U.S. Forest Commission, Muir found himself at odds with Gifford Pinchot, who supported sheep grazing on public lands. But Muir considered himself a moderate on economic growth. As Worster remarks, he believed that "abundance could continue if selfish people did not destroy the land’s capacity for regeneration." Amid the rancorous class conflict of the era, he nourished the hope that society could become as harmonious as he imagined nature to be.

His friend Theodore Roosevelt shared that hope. In 1903, Roosevelt invited Muir to join him for a four-day camping trip in Yosemite National Park. In convivial campfire conversation, Muir secured an agreement to make the Yosemite Valley part of the National Park. Roosevelt opposed grazing on public lands and believed "wise use" included aesthetic and spiritual as well as economic values. This made it easy for Muir to overlook Roosevelt’s love of war and hunting. What Muir did not see was that Roosevelt’s cult of manly will could also reinforce his commitment to canonical ideas of progress, leaving him a less than reliable ally in subsequent environmental controversies. Like many men before and since, Muir was simply smitten by TR. Never before, Muir said, had he had "so interesting[,] hearty, and manly a companion." Despite their differences over what constituted manliness, Muir admitted that "I fairly fell in love with him."

The collaboration with Roosevelt marked the apogee of Muir’s public career. By then he had completed two books, The Mountains of California (1894) and Our National Parks (1901), which articulated his mature view of how humans might fruitfully interact with wild nature. As Worster summarizes Muir’s perspective: "We cannot find in nature any soothing escape from history, impermanence, strife, or death. But learning how nature manages that change and how it generates a unified complexity is good tonic for the troubled, careworn human mind." Rejecting sentimental anthropomorphism, recognizing that nature was more than a stage set for aesthetic rapture, Muir nevertheless acknowledged that the wilderness experience could have a therapeutic dimension.

In Worster’s telling, Muir was a prophet of a post-Protestant ethic of nature worship, elevating leisure over work and reverence for the non-human world over humanist hubris. And yet Muir preserved a vestigial attachment to aestheticism, which justified his distinction between ordinary and extraordinary nature. In his view, extraordinary nature--Yellowstone, Yosemite, the Grand Canyon--could not be developed. Only ordinary nature could. One example of ordinary nature was the Klamath Basin in Oregon, a cold and forbidding desert where water collected seasonally in shallow wetlands and lakes, and huge flocks of migrating waterfowl stopped for sustenance. Harriman wanted to build a railroad through it, and the federal government collaborated by authorizing a mammoth reclamation project to convert the Klamath marshes to agricultural land--though the place was unsuitable for farming. The reclamation plan was an ecological disaster. But Muir had no problem with it. Despite his belief that everything in nature was "hitched to everything else," some parts of nature were more expendable than others. His embryonic ecological perspective was compromised by his aesthetic distinction between ordinary and extraordinary nature. Only the sublime could not be spoiled.

The consequences of this inconsistency could be catastrophic. This became apparent in Muir’s losing battle to save the Hetch Hetchy Valley from the city of San Francisco, which wanted to flood it for a reservoir to supply the city’s growing population with water. As the debate wore on, all of Muir’s rich friends deserted him, including Roosevelt. As Worster observes, the arguments pitted development and progress against "an unstable, incompatible mix of recreation and religion that [the opponents of the reservoir] called ‘scenery.’ " In fact, Muir and his allies could have made a stronger case, by demanding a deeper notion of "wise use" than the one deployed by the advocates of the reservoir, by observing that the city had several good and affordable alternatives for expanding its water supply, and by noticing that the Hetch Hetchy project was primarily a means of enriching a few contractors and their cronies. But they left themselves vulnerable to dismissal as romantic aesthetes, and ultimately they were defeated by the ideologues of economic growth.

By 1914, when it became clear that he had lost the battle for Hetch Hetchy, Muir was old, sick, and alone. His wife had died in 1905, his daughters had married men he did not like and had moved away. Struggling to "get something worthwhile off my hands before dark," he tried to write but found it hard to breathe and made forays into climes warmer and drier than the Alhambra Valley in winter. He died alone in a Los Angeles hospital on Christmas Eve in 1914. Admiring obituaries proliferated, and in the decades since his death Muir’s stature has grown steadily.

Worster’s tribute is the most capacious of all. He applauds Muir for extending the idea of egalitarianism beyond the human species and in effect becoming the first "eco-democrat." It is an appealing claim, and argued with convincing detail. Muir’s vision was large and generous, and as necessary in our own time as it was in his. And yet we need to recognize that ecological thought is not simply an extension of Enlightenment values. Sometimes, as in the Hetch Hetchy fight, it involves a fundamental challenge to their conventional formulation--and particularly to the Enlightenment definition of progress.

V.

Human advance, from the Enlightenment standpoint, has always involved the drive toward systematic human control over the non-human world, through the transformation of nature into a manipulable object. Anti-modern critics from William Wordsworth to Wendell Berry have understood the deadening force of this process. Wordsworth’s "we murder to dissect" summarized--succinctly if unsubtly--the romantic case against calculating analysis. But it was left to Max Weber to characterize the consequences of modernity with a formulation both sweeping and precise: what he called "rationalization," or the disenchantment of the world.

For centuries, disenchanters have been transforming the natural world--beavers and buffalo, meadows and streams--into commodities, robbing nature of its independent existence, its capacity to challenge us with its otherness. As James William Gibson writes, "There is power in a buffalo--spiritual, magic power--but there is no power in an Angus, or a Hereford." The powerlessness of commodities, Gibson thinks, arises from their reduced status as standardized, quantitative units of monetary value. This is a central assumption in the Weberian tradition.

The problem with it is that the process of rationalization is never complete: in modern consumer advertising, commodities (not to mention money itself) can be endowed with a shimmering aura, a magical promise of purchasable pleasure and possibly even self-transformation. The magic is not in the Angus, but in its associations with sizzling steaks or pastoral plenitude. Ignoring the faux-magic of consumer culture, Gibson assumes that we live in a spiritually deadened world. He has a point, though the situation is not as simple as he says it is.

In any case, he sees a counter-revolution under way. In A Reenchanted World, he catalogues the myriad signs of a widespread revolt against the dualist tradition of human dominion over nature, which provided the philosophical rationale for commodification. In his hopeful account, longings "to make nature sacred again" are everywhere in our culture, animating everything from such movies as Whale Rider and Free Willy to James Lovelock’s Gaia Hypothesis (the earth is a living organism), the anthropologist Clifford Geertz’s respect for "local knowledge" (the alternative to abstract knowledge), and the geographer Yi-Fu Tuan’s exploration of "topophilia" (the attachment to a particular place). Unfortunately, the discourse of re-enchantment can sometimes sound like the old sentimental tropes. Among Plains Indians, Gibson writes, "hunted animals were sacred game, and meat-eating at the end of a successful hunt was a sacramental meal."

Yet finally this idea is not mere Noble Savagery. There is much historical and ethnographic evidence to corroborate it, without suggesting that Indian people were incapable of violence or waste. The uncomfortable fact remains (uncomfortable, at least, for doctrinaire modernizers) that the core of the re-enchanters’ argument is not mere cant. The poet Gary Snyder puts the central issues in powerfully understated language: "where our civilization goes wrong is the mistaken belief that nature is something less than authentic, that nature is not as alive as man is, or as intelligent, that in a sense it is dead, and that animals are of so low an order of intelligence and feeling, we need not take their feelings into account."

Gibson elaborates on Snyder’s ideas, but he is not an uncritical re-enchanter. He acknowledges the mistakes made in the name of re-connecting with nature--including those of Timothy Treadwell, who lived among the grizzly bears on Alaska’s Gulf Coast and came to believe that he was "a fully accepted wild animal ... brother to these bears," until one of them ate him. Treadwell allowed his sentimental attachment to bears to entice him across a crucial boundary between species. He lacked sufficient respect for the potential violence that is often inherent in a close relation with the wild.

Other disturbing conflicts sometimes surface in Gibson’s survey, as when he reports the consequences of identity politics among the Makah of the Pacific Northwest. Some of their leaders insisted that their "cultural subsistence" required the revival of their sacred hunt for gray whales, which had just been removed from the endangered species list. Despite the protests of environmental activists, the Makah negotiated successfully with the International Whaling Commission and received permission to resume the hunt. Harpoons could not finish the job, so they resorted in the end to a rifle firing .50-caliber
machine-gun bullets. Whether the re-enactment of the hunt re-enchanted the spiritual world of the Makah, even faintly or fitfully, is an open question for people outside the tribe. But to this outsider, the incident suggests the pathos (tinged with absurdity) of many efforts to recapture lost traditions in an uncomprehending, disenchanted world. The task is more easily celebrated than accomplished. Sometimes the loss is permanent, and the tradition is better remembered than recovered.

Gibson’s evidence for re-enchantment can be less than compelling. When a United Nations report in 2005 observes that "the true value of nature" is sometimes "difficult to put numbers on," and that "appreciation of the natural world is an important part of what makes us human," Gibson declares it "another text in the culture of enchantment" rather than what it is: a collection of bromides that threatens no one, a popgun pointed at established power. But on the whole Gibson keeps a firm grasp on his major theme--the restoration of a more egalitarian relationship between humans and the non-human world. He infers its importance from the hysteria of its detractors (Pat Robertson, for example, views the re-enchantment project as nothing less than the wedge of paganism) as well as from the catastrophic consequences of their policies. Under George W. Bush’s Bureau of Land Management, Gibson observes, "the West was to be transformed into an industrial grid of wells, roads, and pipelines," while "mountaintop removal" and "valley fill" were to become standard practice in Appalachia.

Whether this sort of waste can be stopped by a change of administration remains to be seen. Gibson thinks that the Republican right’s anti-environmental coalition began to come apart in 2006, and that the re-enchanters re-asserted themselves. They will have to confront the disabling impact of economic recession on environmental regulation--almost always seen as an enemy of "jobs"--as well as the old canard that they are hopeless romantics standing athwart the thrust of progress. That is where books such as Edward Humes’s Eco Barons can be helpful. Written in the breezy style of celebrity journalism, the book is nevertheless well grounded in the local history of environmental controversy. And it gives environmental activists a makeover without trivializing their struggles.

Echoing the title of The Robber Barons, Matthew Josephson’s classic account of Napoleonic financiers in the first Gilded Age, Eco Barons translates environmental activism into the idiom of business heroism. No longer knobby-kneed nerds in Birkenstocks, Humes’s heroes are resourceful, shrewd, hip entrepreneurs. They include such countercultural capitalists as Douglas Tompkins, the founder of Esprit fashions, who has purchased and preserved huge tracts of temperate rain forest in southern Chile; and Roxanne Quimby, the queen of the Burt’s Bees unguent empire, who has used her negotiating skills to build an odd coalition of vegetarians and hunters against development in the Maine woods.

The book’s most provocative story centers around Andy Frank, an engineering professor at UC Davis who invented and patented the plug-in hybrid car. Humes accompanies his profile of Frank with a quick but revealing history of electric cars in America. The overall pattern boils down to this: the problems with electric cars have always been more political than technological. At various key moments in the electric car’s history, the automobile and oil industries have intervened to block its development. The most recent occasion was in the late 1990s, when Toyota, Ford, and GM, faced with the California Air Resources Board’s demand for a "zero-emission vehicle," produced workable battery-operated vehicles and even set up electric car divisions to market them. "One of the most curious episodes in automotive history unfolded next," writes Humes, "as car companies sought to undermine their own products."

They refused to sell the new cars. Instead they leased the vehicles "after a lengthy, intrusive application process." They spent next to nothing on advertising them. Despite mounting consumer demand, the companies produced only a few thousand electric cars, then began publicly criticizing their performance and shifting their focus to hydrogen fuel cell research--despite the objections of Frank and other scientists, who testified that the batteries were far more reliable and promising than hydrogen would likely ever be. Frank urged that the state substitute "very low emission vehicles" for "zero emissions vehicles," and that the industry turn to research in plug-in hybrids. GM hired him briefly to turn its battery-pack car into a hybrid, but then decided to kill the whole electric program, fired Frank, and turned to building Hummers. Other carmakers got on the SUV express, and electric cars piled up in junkyards.

Like Gibson, Humes thinks the political shift of 2006 represented a return to green thinking. Frank’s designs have been rediscovered by local governments; Washington state wants a fleet of plug-in hybrids based on his model. Frank’s company, Efficient Drivetrains, Inc., is much in demand, especially in Asia--but not in Detroit. The Detroit carmakers have developed plug-in hybrids, with specifications that fall short of Frank’s. He questions the depth of their commitment, and presses on.

The history of electric cars is a green parable for our time. It raises subversive questions about roads not taken. It shows that, without adequate public backing, green entrepreneurs--no matter how shrewd--cannot successfully buck the corporate consensus. And above all it challenges the fundamental dogma of development, technological determinism. For decades if not centuries, critics of development have been told that the capitalist (and for a while, the socialist) version of progress is simply unstoppable--a neutral, inevitable, and beneficent process that is beyond politics and policy debate. For a moment, in the forgotten 1970s, this dogma came under scrutiny. But the cyber-revolution of the last thirty years revived it. Techno-determinists from Thomas Friedman to Bill Gates have repeatedly told us that we must choose to do what we have to do anyway--
re-organize our lives in accordance with the dictates of technology. The rhetoric of inevitability conceals the business interests it serves, and negates the possibility of challenging them.

But the tale of the electric car decisively undercuts this determinist mythology. Humes’s account reveals that technological progress is not the product of some irresistible demiurge called "modernity"; and that human beings have the capacity to direct technology rather than merely genuflect to its force; and that in fact the very definitions of progress can be challenged and changed by cranks who resist conventional wisdom. But only--it should be clear--if the cranks have a shot at some money and some power.

Jackson Lears is editor of Raritan and author, most recently, of Rebirth of a Nation: The Making of Modern America, 1877–1920 (HarperCollins).