Unquiet Ghosts

We’re Haunted by the Economy of the 1970s

Politicians and pundits across the political spectrum warn of a return to the decade of stagflation, urban decay, and labor mutiny. Their solution? The failed policies of the 1970s.

In 1688, the Swiss doctor Johannes Hofer identified a new disease “originating from the desire to return to one’s native land.” Sufferers sank into a chronically “sad mood,” he wrote, their faces lifeless and haggard. Hofer called the new illness “nostalgia,” fusing a pseudoscientific neologism out of the Greek words for “return” (nostos) and “pain” (algos). A disease of the imagination that eventually infected the body, nostalgia, as Hofer described it, caused its victims to lose all sense of time; the boundaries between past, present, and future, the real and the imaginary, broke down. In the centuries since Hofer’s now-discredited discovery, nostalgia has detached from its scientific roots, morphing from a curable ailment into a chronic cultural condition. Temporal disorientation and historical delusion are now the province of politics rather than medicine. But if there’s a condition that defines political debate in the United States today, it’s not longing for the past but fear of it—and fear of one decade in particular. The victims of this new illness enjoy a relationship to history every bit as confused as that of Hofer’s nostalgics.

The Republican response to Joe Biden’s State of the Union address was just five sentences old when this new phobia scored a mention. “Instead of moving America forward, it feels like President Biden and his party have sent us back in time—to the late ’70s,” Iowa Governor Kim Reynolds said in March. The 1970s: decade of stagflation, impending nuclear holocaust, labor mutiny, urban decay, on-screen chainsaw massacres and off-screen serial killers, unaffordable meat and unavailable gas. After decades in which politicians of all stripes have held the postwar years up as an example of progress toward a more perfect society, the reigning political mood across the United States and the West today is not nostalgia but nostophobia, not veneration of the “New Deal order” or a fictitiously harmonious 1950s, but fear—even, on occasion, terror—of a return to the 1970s. Trump loyalist Peter Navarro has claimed that “a feckless Team Biden has set America up for a new round of 1970s-style stagflation,” while historian Niall Ferguson recently argued that “the 2020s could actually be worse than the 1970s.” But it’s not only right-wingers raising these fears; what’s remarkable about the new nostophobia is how widely shared it is. The World Bank has warned of a regression to the 1970s, and Wall Street is now convinced we’re in for a stagflationary shock. Across politics, business, and the media, the 1970s are universally seen as the decade to which “no one wants to return,” as Treasury Secretary Janet Yellen has put it. When Yellen and Navarro can make common cause, you know that something truly unusual is afoot. 

The proximate cause of this new epidemic of panicky historical analogizing is inflation. Supply chain disruptions triggered by the pandemic and Russia’s invasion of Ukraine have forced consumer prices in many leading economies to climb to heights not seen in four decades. Trailing in the wake of this resurgent inflation have come a thousand think pieces and research papers warning of the risks of a return to the last decade scarred by serious price hikes—a decade in which inflation, we are told, paved the way for crime and riots, a societywide breakdown in law and order. Google searches for the words “stagflation” and “1970s” reached their highest level ever between April and June of this year. The 1970s have become a living ghost that haunts the post-pandemic United States.

Taxi Driver’s anti-hero, Travis Bickle, called the society of the 1970s “sick, venal.” Today’s nostophobes endorse this historical caricature. In their view, the 1970s were a “disaster” (Council on Foreign Relations fellow Sebastian Mallaby), a “horror movie” (Yale economist Stephen Roach) when postwar prosperity “collapsed into a tumult of social unrest,” as The Wall Street Journal’s former editor in chief Gerard Baker has put it. Over the course of the decade, America’s cities became “hellscapes” (Baker again), and the whole country succumbed to an “existential sense of peril and failure” (ditto). Seventies syndrome has touched all the usual suspects in the business media and conservative policy circles: By now, the opinion pages of the Journal and Bloomberg are so thoroughly cloaked in the historical murk of the 1970s that they could qualify as an early Scorsese film. Across the Atlantic, rising energy prices have made reference to Britain’s 1978–79 “winter of discontent,” marked by bitter cold and nationwide strikes, a cliché of financial coverage. (Never mind that British workers are nowhere near organized enough today to demand the kind of wage increases secured during the 1970s; all that matters in these analogies is vibes, the idea that the 1970s were bad.) Economic history has become an essential weapon in the free marketeers’ fight against the Biden administration: “The 1970s demonstrated what the socialist playbook of tax, spend, and regulate brings: joblessness, inflation, and misery,” argued a recent research paper by the conservative Heritage Foundation. But nostophobia has also gripped moderates and grandees of the Democratic establishment: Larry Summers, perhaps the most influential economist of the Clinton-Obama years, has repeatedly warned that “we’re in very serious danger of repeating almost all the mistakes of the 1960s and early 1970s.”

For much of 2021, a debate raged among economists about whether post-pandemic inflation was transitory or sticky. As it’s become clear that elevated prices will be with us for some time, the nostophobes have become at once defensive (there’s now a cottage industry advising on “investments that worked in the 1970s”) and urgently didactic. The Federal Reserve must learn the lessons of the dreaded ’70s and resist the urge to ease monetary policy too soon, former Fed Governor Frederic Mishkin recently advised, while World Bank economists have pressed governments across the globe to “get ready to weather the storm associated with the tightening cycle.” In June, Summers called for “two years of 7.5 percent unemployment or five years of 6 percent unemployment or one year of 10 percent unemployment” to curb inflation, reflecting a widely held view among those spooked by the ’70s. In the scaremongers’ version of history, the 1970s represent the worst of all postwar worlds. As they tell it, inflation and economic stagnation triggered a society-wide slide into the abyss, arrested only by the heroics of Federal Reserve Chairman Paul Volcker and the election of Ronald Reagan. Only when Volcker’s Fed began its ascent up the Everest of early-1980s interest rates, and Reagan embraced the deregulatory, tax-slashing bonanza of supply-side economics, we are told, did the fever afflicting the United States break.
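
Summers’s three alternatives read as interchangeable because each implies the same cumulative labor-market slack. A minimal sketch of the arithmetic, assuming a baseline (“natural”) unemployment rate of 5 percent — my assumption for illustration; Summers does not specify one in the quoted remark:

```python
# Illustrative arithmetic only: the 5% baseline is an assumption of this
# sketch, not a figure Summers gives.
BASELINE = 5.0  # assumed natural rate of unemployment, in percent

def excess_point_years(years: float, rate: float, baseline: float = BASELINE) -> float:
    """Cumulative unemployment above baseline, in point-years."""
    return years * (rate - baseline)

scenarios = {
    "two years at 7.5%": (2, 7.5),
    "five years at 6%": (5, 6.0),
    "one year at 10%": (1, 10.0),
}

for name, (years, rate) in scenarios.items():
    print(f"{name}: {excess_point_years(years, rate):.1f} point-years")
```

Under that assumption, each scenario works out to the same 5.0 point-years of excess unemployment, which is presumably the sense in which the three are offered as equivalents.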

Nostophobes urge lawmakers to learn from the 1970s—an admirable objective—but recommend all the wrong lessons. Fear of the 1970s grounds the demand for 1970s-style solutions to the world’s economic malaise. Remarkably, that fear is working. The Federal Reserve now appears determined to sacrifice employment on the altar of price stability, mimicking its own actions of more than 40 years ago: Fed Chair Jerome Powell’s speech at Jackson Hole in late August explicitly endorsed the 1970s as an object lesson for shaping economic policy today. Earlier, in May, Powell had said the Fed would continue raising interest rates in order to “get wages down.” This is both cruel, since it’s likely to push the economy deliberately into recession, and nonsensical, since rising wages aren’t the cause of inflation: The weakness of labor today means that, on the path to a 1970s-style “wage-price spiral,” we’re barely a single rotation up the fusilli. Coherence is usually the first victim of any historical analogy, and in this case the American worker seems likely to become victim number two. Fear of the 1970s has so thoroughly colonized the contemporary political imagination that we’re now set for a revival of the Volcker years.

Nostophobia papers over the damaging legacy of those years while distorting the truth of the 1970s, which were also the decade of worker power, a rising ecological consciousness, and grassroots political experimentation. Most importantly, it obscures the degree to which the unresolved questions of that decade still bedevil the West. A return to the 1970s is in some sense impossible, because the 1970s never really ended. We still live in the world that decade created: a world of fiat money, plentiful credit, serial asset bubbles, and suffocating debt, in which society is subordinate to the economy, markets take precedence over democracy, and capital controls labor. The central drama of the 1970s was about conflict over the distribution of resources under conditions of scarcity. The solution governments across the developed world embraced was to outsource adjudication of the issue to the market, essentially effecting a divorce between the economy and democracy—a divorce whose spawn is the inequality that represents, outside climate change, the West’s greatest problem today. However much we might want to avoid a return to the bad old days of the 1970s, that time continues to define us. The 1970s were, in the words of one historian, the “pivotal decade” that sealed a societywide transition from industry to finance, factory floor to trading floor, production to consumption, and welfare state to real estate, propelling us into a future of destabilizing inequality. The real story of the 1970s commands not fear but study: The path out of injustice and polarization today is through engagement with the missteps and unexplored openings of that decade, not flight from it.

The United States is no stranger to the politics of fear. But usually it’s immigrants, the poor, or the foreign adversary of choice (today, China) that parties exploit as a source of anxiety to boost their electoral fortunes. Fear of the past represents something genuinely novel in the postwar political landscape—and since it cuts across political lines, it can’t be explained away as a simple tool of partisan warfare. Nostophobia’s rhetorical appeal lies, in part, in the distance of the 1970s from today. “Remember the 1970s?” asked one business columnist last summer, as inflationary fears took off. The answer to that question for the majority of Americans is surely “no”: The median age of the U.S. population today is 38.8, meaning that the experience of inflation is new for most people in this country. If the younger generations have any connection to the 1970s, it’s through culture, which memorializes that decade as an era of heroin, disco, dirt, and excess. This demographic skew confers a particular spectral power on the 1970s as a source of popular fear. Beyond the lived experience of most Americans, but close enough in time that many of its survivors—baby boomers and older Gen Xers—now enjoy seniority in politics, economics, and the media, the 1970s function as a perfect all-purpose cultural bogeyman, remembered and unknown in roughly equal measure.

The rise of nostophobia may also reflect a shriveling of hopes among Americans today for the future. Post-independence America was conceived on anti-nostalgic foundations, as a nation that looked forward rather than back; Jefferson exhorted his compatriots to reject the “blind veneration of antiquity, for customs and names to overrule the suggestions of our own good sense.” Even as that discipline crumbled in the late twentieth century and American culture succumbed to sentimentality for the “wonder years” of the postwar era, political nostalgia remained fundamentally restorative and hopeful, mining the past as a project for the future. Reagan’s plan, mirroring the cultural irredentism of the constitutional literalists who gradually assumed control of the Supreme Court, was to return to the “grand prosperity” of the postwar years, as he put it in a speech on the eve of the 1980 election, when “politics was a national passion” and “the center seemed to hold.” He wanted to make America great again, not make America great for the first time. The present may have been bad, but there was always the past to look forward to.

Forty years later, that optimism has been drained from the body politic: Most Americans today expect income inequality to widen and living standards to decline or stagnate over the next 30 years. The rapid acceleration of climate change has extended the province of collective dread to the planet as a whole. Now inflation—which increases the pressure of time, intensifies anxiety about the future, and “redistributes fortune’s favors so as to frustrate design and disappoint expectation,” as Keynes once put it—has reemerged, after its decades-long hibernation, to compound the gloom. Nostophobia is the morbid symptom of a society that’s stuck—culturally stagnant and economically self-cannibalizing. Wedged by anxieties in all directions, fearful of what’s passed and terrified by what’s next, Americans no longer have anywhere to look for inspiration. Not even the past will save us now. Unless, of course, it will. America’s stagnation today is a product of everything that has happened since the 1970s, a point that the nostophobes—rushing to tar Joe Biden by association with the Bad Decade—conveniently ignore. In their Whig version of history, the 1970s nation moved from chaos to order, Carter-era decline to Reaganite rejuvenation, failure to success. Reality was much more complicated.

The inflation that gripped the United States through the 1970s was, without question, a source of deep collective suffering. “In the last few months,” reported The New York Times in February 1974, “people in the metropolitan area have become increasingly suspicious and angry, insecure, devious, often violent and seldom resigned, all because of the lack of gasoline.” Meat boycotts and “baloney rallies” swept across the country; sales of horsemeat soared, and protesters ate dog food to illustrate the deterioration in nationwide economic security. Snaking gas lines became the everyday representation of a chain of events that appeared to signal the undoing of the American century: military defeat in Vietnam; Watergate; the decay of old urban centers in the face of white flight to the suburbs; and the government’s 1971 decision to terminate convertibility of the dollar to gold and initiate a precarious global regime of free-floating fiat currencies. The crisis that the United States endured in the 1970s was not simply economic, but existential. The sudden prospect of an end to growth, following decades of abundance, intensified the debate about what to do in response, giving it the character of a choice between everything or nothing, civilization or barbarism. At stake was the very legitimacy of the institutions that had come to define and ensure postwar prosperity: the presidency, the military, and the state. In the mid-1970s, with year-over-year inflation running at 11 percent, Federal Reserve Chair Arthur Burns declared that the inflationary situation “would threaten the very foundations of our society.”

The 1970s may have been chaotic, but the chaos made sense. Conflict sprang from a genuine debate over the distribution of resources that stagflation—the poisonous combination of runaway inflation and stagnant economic growth—made necessary. Fear of a return to that time is also a fear of confrontation, which is the essence of politics. Perhaps the most valuable work of economic history published since the turn of this century is Capitalizing on Crisis, sociologist Greta R. Krippner’s account of how the U.S. political establishment turned to finance from the 1970s onward as a solution to social discord. Krippner argues that slowing growth and rising prices created an acute distributional dilemma for the state. As public revenue dwindled, competition for it intensified. Initially, inflation allowed the state to defer making a choice between competing claims on the government budget, since it implicated every group in what Krippner calls a “game of leapfrog”: As companies passed on the rising cost of production to consumers in the form of price hikes, workers could apply pressure for higher wages, which would in turn erode corporate profits and provoke another round of price increases, then wage increases, and so on. For governments in the 1970s, inflation represented an underhanded solution to the distributional conflicts created by the end of growth. It offered “a surreptitious way for the state to say ‘no’ when it could not do so openly,” Krippner writes: “Social expenditures could increase in nominal terms while rising prices eroded the real value of these claims.”
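
Krippner’s “game of leapfrog” is, mechanically, a feedback loop between prices and wages, and the dynamic can be sketched in a few lines. The shock size and pass-through coefficients below are invented purely for illustration; nothing here estimates the actual 1970s economy:

```python
# Toy model of Krippner's "game of leapfrog": firms pass a cost shock into
# prices, workers win compensating raises, and the loop repeats.
# All coefficients are invented for illustration.

def leapfrog(rounds: int, shock: float = 0.05,
             price_passthrough: float = 0.9, wage_catchup: float = 0.9):
    """Return cumulative (price, wage) levels (1.0 = start) after each round."""
    prices, wages = 1.0, 1.0
    history = []
    for _ in range(rounds):
        prices *= 1 + shock * price_passthrough    # firms raise prices
        wages *= 1 + shock * wage_catchup          # workers push wages up in response
        shock *= price_passthrough * wage_catchup  # next round's impulse, damped
        history.append((round(prices, 4), round(wages, 4)))
    return history

for i, (p, w) in enumerate(leapfrog(5), 1):
    print(f"round {i}: prices {p}, wages {w}")
```

With pass-through below 1, each round’s impulse shrinks and the spiral peters out; at full pass-through it never decays. Today’s nostophobes fear the second case, but a labor movement too weak to win the compensating raises cannot keep the loop turning.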

The wage-price spiral that ensued throughout the 1970s, sending inflation even higher, did not reflect the labor movement’s indiscipline and irresponsibility, as conservative critics at the time pretended, but its strength. There were 289 major work stoppages on average every year throughout the 1970s; the years since 2010, an era of drastically reduced union representation, have averaged 15. The inflation of the 1970s was the stage for the last great contest between capital and labor. As unions won further concessions for workers, big business began to feel besieged; many top executives believed, as the authors of a 1976 volume on “the crisis of confidence in American business” wrote, that “the have-nots are gaining steadily more political power to distribute the wealth downward. The masses have turned to a larger government.” When today’s nostophobes evoke the chaos of the 1970s, what they’re really warning against is a resurgence in the power of labor, in the rule of the limited workday, the inflation-linked wage, and the protected vacation. Their true fear is a fear of workers.

The labor movement could not sustain its gains indefinitely. The use of rising prices as a solvent for distributional struggle only worked for a time; eventually the inequalities of inflation, which always hurts the poor more than the rich, became clear. The hostility of corporate elites, already anxious about the wage concessions unions had extracted, made a bad situation worse. The postwar compact between the public and private sectors was exceptionally generous to business: The state bore the cost of supplying the physical infrastructure and training the workers that allowed enterprise to flourish, but profits were entirely privatized. At the same time, corporate success generated social and environmental costs that affected society at large, triggering countermovements and social conflict that called for further remedial spending by the state. The channel through which the state paid for all this was, of course, taxation. Social harmony remained intact as long as the economy kept growing; growth kept government revenue and corporate profits buoyant.

But as postwar affluence tanked in the early 1970s, the private sector became increasingly resistant to government spending and the taxation needed to sustain it. The “new field of class struggle,” wrote sociologist Daniel Bell in 1974, “is tax conflict.” The ire of the corporations fell on the public sector and welfare expenditure. Government agencies had mushroomed in size since World War II, while social spending became important to maintain the domestic peace as inflation, fueled in part by the calamity of the Vietnam War, rocketed beyond control. With tax increases becoming politically problematic, the U.S. government relied increasingly on deficit financing to keep the public sector and welfare state afloat, adding to inflation and further exacerbating budgetary strains. The state gained legitimacy by satisfying the fiscal claims of both the private sector and the people. When growth and public revenue dwindled throughout the 1970s, a structural contradiction at the heart of this legitimation exercise became increasingly clear: In the short term, the state could use deficits to satisfy both business and the people, but in the long term it could not keep both groups happy simultaneously. As the decade progressed, the social and economic crisis of inflation expanded into a broader crisis of democratic legitimacy. A choice—between business and the people, money and democracy, profits and peace—was necessary.

That this choice was eventually settled in favor of business, money, and profits was by no means preordained. The danger felt by corporate interests throughout the 1970s—their sense of a working population on the march, squeezing wage and spending increases from a political class unable to afford either—mirrored a kind of fatalistic melancholy among many conservatives. Right-leaning intellectuals such as Samuel Brittan and Samuel Huntington adopted the vocabulary of structural Marxism, lacing their writing with portentous references to the “internal contradictions” endemic to democracy. Globally, the intellectual contest between welfarists and free-market neoliberals, who’d spent the decades since the passage of the New Deal plotting a return to power, was undecided. Friedrich Hayek’s receipt of the 1974 Nobel Prize in economics is sometimes presented as a harbinger of the American political class’s eventual turn toward the market in the late 1970s, but what’s often forgotten is that Hayek shared the prize that year with Swedish social democrat Gunnar Myrdal. Myrdal’s ideas about the interplay between economics, politics, and culture pointed in a radically divergent direction, toward the internationalization of the postwar welfare state. In the mid-1970s, the future was very much still up for grabs. Down one road lay the Hayekian nirvana of fully privatized money and the pricing of everything; down the other lay the dream of a more equal world built on comprehensive reform of the international economic system. Indecision among the country’s leaders—in a bid to tame inflation, Nixon introduced price controls, a move that seems unthinkable for a Republican leader today—underscored the sense that inflation, despite its pain and privations, had rendered politics more plastic.

The left was particularly alive to the possibilities that the stagflationary crisis created. The pessimism of the conservatives “should be the optimism of everyone else,” one radical wrote in 1975. “If ruling groups feel they are losing power, it is only because everyone else is gaining it,” he argued. “Their fears are in reality our opportunities.” The 1970s became a laboratory for grassroots political experimentation: This was the decade of the consumer advocacy group, the workers’ cooperative, and the block association. Local self-help initiatives like homesteading—which saw low-income people take over abandoned buildings and turn them into owner-occupied cooperatives—multiplied as Americans sought creative remedies to the traumas of stagflation. In parallel, fears of impending ecological collapse—stoked by the end of cheap oil and the catastrophism of bestsellers like The Limits to Growth and The Population Bomb—took the environmental movement mainstream. The 1970s birthed Earth Day and Greenpeace, while utopian fictions like Ernest Callenbach’s Ecotopia attempted to map how a modern society built on equality, ecological balance, sustainability, dignity in labor, and direct democracy—the fever dream of the endlessly perfervid 1970s left—might be organized. Instability bred ingenuity.

That flowering of the progressive political imagination at a local level, however, failed to translate to the national stage. In the face of stagflation, Democrats—who controlled Congress for the whole of the 1970s and the White House for the decade’s final three years—muddled through with an irresolute mix of policies that sat somewhere between interventionism and retrenchment, a reassertion of the state and a hesitant placation of business. Resistance to the use of shock methods to eliminate inflation ran deep among liberal economists; Robert M. Solow, the godfather of neoclassical growth theory, felt that engineering a recession to bring prices under control would be “possible but disproportionately costly. It is burning down the house to roast the pig.” The liberal establishment, despite the specious charge, made regularly by conservatives, that the largesse of the Great Society had triggered inflation, remained attached to Keynesian demand management. There was much talk of an “incomes policy” to manage the crisis, which would have involved some grand bargain between wage restraint and workplace democratization; as historian Tim Barker has shown, many liberals felt the path out of the distributional dilemmas of the 1970s went through “more extensive government controls and planning,” deepening rather than constraining the influence of democratic politics over the private sphere.

Several of Jimmy Carter’s pet policy plans—a consumer agency, national health insurance, a federal jobs guarantee—bore the imprint of this residual Keynesianism, but none of them came to fruition. The measures that Carter did implement—the removal of interest rate ceilings, the appointment of Volcker to lead the Fed, and a regressive tax reform—revealed the growing clout of the business lobby. Faced with a choice between enhancing democratic participation in the distribution of wealth and leaving the private sector as referee, Carter eventually chose the latter. But his message remained confusing. The call for national sacrifice in his infamous 1979 malaise speech came on the heels of deregulation of the airline industry, which inaugurated a thrilling new age of cheap travel: The United States was on vacation one day and huddling by the space heater the next. The economy continued to falter through the late 1970s, amid a general deterioration in productivity, the dollar, and the balance of payments. Floundering liberals, immobilized by their fealty to the old way of doing things and “consumed with the tinkering necessary to stabilize the economy,” writes historian Judith Stein, “allowed conservatives to fill the intellectual void with solutions.”

For all its sense of cultural persecution, the 1970s business establishment understood, like its counterparts on the left, that the traumas of inflation also represented an opening. On both sides of the political divide, uncertainty mixed with opportunism: “Widespread public concern over inflation,” counseled a Ford administration official in an internal budgetary memo, “could build a consensus for action that might otherwise meet with powerful opposition.” As labor asserted its claims, voices advocating a return to the pre–New Deal liberalized market were gaining strength. In their view, the contradiction that stagflation laid bare was not economic but political. Hayek felt that the social chaos unleashed by the crisis was the “unintended consequence of our current system of unlimited democracy.” An influential 1975 report, co-written by political scientist Samuel Huntington and titled The Crisis of Democracy, found that the United States and Western Europe were afflicted with a “problem of governability.” Historically, Huntington wrote, democracies have “always had a marginal population, of greater or lesser size, which has not actively participated in politics.” These exclusions, while undemocratic, had “enabled democracy to function effectively.” In the years since World War II, however, “marginal social groups” that had previously been “passive or unorganized”—women, ethnic minorities, immigrants, etc.—became enfranchised as full political subjects. This democratic surge represented a boon for representation but a challenge for “governability,” as previously marginalized social groups began to overload the state with “demands that extend its functions and undermine its authority.” The United States and Western Europe had not become ungovernable because they were capitalist, in other words, but because they were democratic. Democracy was suffocating under its own weight.

This unapologetically reactionary framing of the stagflationary crisis—shared by many in the neoliberal vanguard and big business—created an opening for the commercialization of all aspects of private and public life. Membership in the U.S. Chamber of Commerce more than doubled over the course of the 1970s, as acolytes of free enterprise rushed to the front lines of class conflict. The whole thrust of Hayekian liberalism, as an intellectual and political project, was not simply to install market values—competition, private property, the primacy of the price mechanism—at the center of public life; it was to “undo the demos,” as political theorist Wendy Brown has written, to protect the market from popular politics. Laissez-faire meant less democracy.

Unable to decide what to do, the political elites of the 1970s decided to do nothing—or rather, they consciously defanged the state, devolving authority on questions of public resource allocation to the impersonal and supposedly neutral market. Instead of litigating the competition for dwindling state resources in public, a process that would have intensified democratic participation and helped forge a new social consensus, governments across the developed world turned away from the public sphere and toward the market, effectively outsourcing the adjudication of competing fiscal claims to the unaccountable realm of finance. Depoliticization became the new way of politics. In 1967, Eric Hobsbawm called economist Friedrich Hayek a “prophet in the wilderness.” By the late 1970s, Milton Friedman could assert that “the tide is turning” in the Hayekians’ favor. Slowly but surely, free-market liberals asserted their control over the historic course of policy.

Carter’s appointment of Volcker to helm the Fed—effectively caving to Wall Street’s demands for an attack on inflation without regard for the social costs—marked the start of a decades-long partnership between finance and the Democratic Party that has continued, with only minor equivocation, to this day. Inflation now called for technical rather than political solutions; it was “always and everywhere a monetary phenomenon,” to use Friedman’s famous description, a matter for the experts. But the neoliberal fix contained an inherent contradiction from the outset, as Daniel Bell noted at the time: Enthroning the market as ruler of competing social claims during a time of scarcity and stagnant growth would do nothing to resolve underlying distributional conflict. This conflict was inherently political; without political resolution, social grievances would only fester and multiply.

Marketizing the contest for money allowed states to evade the dilemma of legitimation. But for all the neoliberals’ confidence that the market would administer the stiff medicine that a reckless, hedonistic, entitled population needed, deregulated capital turned out to be a surprisingly lax master. Among the most significant reforms initiated in response to the crisis was the Carter Congress’s 1980 repeal of Regulation Q, the New Deal–era ceiling on the interest rates that could be paid on consumer savings deposits. Instead of disciplining spending, this caused capital to explode, as credit flowed freely across the economy. As Krippner has shown, the deregulation of finance throughout the United States in the late 1970s and early 1980s upended its own foundational political logic: Capital was no longer limited but abundant, and the discipline the market was supposed to impose never materialized.

In November 1980, Paul Volcker, chair of the Federal Reserve, spoke at a meeting of the House Domestic Policy Subcommittee.

The Volcker Shock—which saw short-term interest rates climb to almost 20 percent by 1981, plunging the United States into recession—turned this hose of capital into a torrent, intensifying the financialization of the economy that had been initiated by Nixon’s deregulation of global currency markets and extended by the repeal of Regulation Q. Inflation was drawn from the real economy to the capital markets, but not all price hikes are created equal; whereas goods inflation was seen as a mortal threat to civilization, asset-price inflation symbolized an economy at the top of its game. Superficially, this regime of loose credit and high interest rates solved the distributional riddle of the 1970s. In reality, it threw money at a problem that was inherently political. Reagan, once an advocate of fiscal restraint, understood that Republicans could become “the party of more,” as a GOP strategist put it in 1978—and before long the Democrats copied them. A highly politicized decade of economic anxiety gave way to a new era of financial exuberance, in which politics was sidelined by the magic of credit, and a surging market promised to lift all boats.

As a strategy of avoidance and delay, marketization worked—for a time. Credit and household debt surged, asset markets exploded, the Berlin Wall came down, and Mikhail Gorbachev enjoyed a meal at Pizza Hut. Global financial capital and consumer spending—the new post-democratic outlet for dissent and self-realization—entered into a symbiotic embrace. The citizen may no longer have been sovereign, but the consumer was. A whole new class of technocrats, consultants, and union busters was willed into existence as monetary dominance became the organizing principle of relations between the people and the state. For ordinary Americans, participation in the markets—through the loosening of consumer credit—replaced participation in the political process, while political decision-making, now the preserve of “apolitical” technocrats, was insulated from popular pressure. These moves initiated a collective estrangement from politics and a distancing between elites and the people that have only intensified in the years since, fueling resentment and boosting the electoral appeal of those hostile to the very idea of equality. Once the phony prosperity of the late twentieth century evaporated, the true face of neoliberalism—as a machine for the permanent upward distribution of wealth—became clear. Growth and wages across the developed world have stalled while asset values continue to soar, entrenching a highly destabilizing and seemingly intractable regime of wealth inequality—a regime that climate change, with its prohibitive adaptation costs and unequal distribution of pain, will make even worse.

Today’s fear of the 1970s is a call for more of what the 1970s gave us: comatose wages and foamy financial returns, monetary dominance and political seppuku, the engineered immiseration of the poor for the benefit of the rich. In every business column sententiously advising today’s Fed to heed the lessons of the 1970s, it’s hard not to detect a gruesome thirst to be ringside as our monetary masters “spill blood, lots of blood, other people’s blood,” as Reagan’s economic adviser Michael Mussa memorably described the impact of Volcker’s shock therapy. Nostophobia extends the 1970s strategy of deflection. As an argument for business as usual, it endorses the divorce between democracy and economics, and absolves the political establishment of responsibility for its 50-year failure to find solutions through politics to the problems of resource allocation.

But another way, surely, is open to us. Savvy contemporary observers understood that the market provided no durable answer to the crises of the 1970s. At a time when the great systemic rival to capitalism was still viable as a political life-form, some felt, in the words of economist James O’Connor, that “the only lasting solution to the fiscal crisis is socialism.” Those of a less radical bent tried to find an accommodation within the boundaries of the existing liberal state. In the pages of this magazine, political theorist Michael Walzer called for “civism,” a new spirit of solidarity and shared decision-making that would entail a mutual acceptance of limits. However, the deepest analysis of a possible political solution to the political problem of resource distribution under persistent scarcity may have emerged from the pen of Daniel Bell—who described himself, in high Whitmanesque style, as a political liberal, economic socialist, and social conservative.

In The Cultural Contradictions of Capitalism, his 1976 study of the conditions that gave rise to the stagflationary crisis, Bell argued that the traumas of scarcity called for a new public philosophy. A political economy of the “public household,” as he called it, could define economic rights, establish the social needs that the state must satisfy, and provide a normative framework for the resolution of conflicting monetary claims: “It establishes the public budget (How much do we want to spend, and for whom?) as the mechanism whereby the society seeks to implement ‘the good condition of human beings.’” This is, to be sure, a utopian proposal, the implementation of which would undoubtedly face its own share of legitimation challenges. In recent years, other ideas put forward for reform have captured the spirit of Bell’s public household: financial citizenship, public banking, novel institutional setups to convert central banks into laboratories of open democracy. Economic historian Stefan Eich has floated the idea of a distinct democratic body to direct monetary affairs.

Whatever the merit or feasibility of these individual plans, their basic orientation is correct: There will be no end to inequality without democracy being invited back into the tent of economic decision-making, without a new social contract to organize contestation over the public purse. Nothing about this reintegration of politics and economics promises to be clean; on the contrary, a democratization of policy will likely be disorderly and potentially chaotic, reopening the distributional struggles of the 1970s. In the face of a neofascist right, this whole exercise may seem fanciful, absurd. But Trump is nothing if not an electoralist; democratizing economic policy promises an institutional outlet for populist anger, an arena for the transparent litigation of the material anxieties that fuel his movement. If a more equitable and harmonious society is our aim, we must be prepared to risk and manage distributional conflict. What’s required, above all, is a collective effort of the political imagination—the type of creativity and leadership that the 1970s so badly missed.

To revisit the history of the 1970s in detail might seem tangential to our problems today. But that history—the 1970s as the first great crisis of postwar U.S. power—is the history of our current moment. Neoliberals’ constitutive disenfranchisement of voters and all those without a foothold in the asset economy is very much still with us. The realm of monetary affairs remains fully de-democratized, and, in the short term, the distance between economics and politics—fueled by voter suppression, electoral map manipulation, the casualization of labor, and intergenerational wealth immobility (in the United States today, material success mostly depends on whether you grew up rich)—seems likely to grow. The critical issue of the stagflationary era is still the critical issue of ours: How can political freedom and economic security, democracy and the economic conditions of a dignified life, coexist? If anything, these questions are even more urgent today than they were in the 1970s: Growth has now been stagnant for decades, not years; the emerging markets won’t extend their line of credit forever; and the mismatch between economic globalization and political localism—which Bell called a structural “time bomb”—is widening.

Stoking fears of a return to the 1970s helps obscure the long wreckage of the free-market orthodoxy that decade ushered to power. It ignores the era’s true lessons: its invitation to think carefully about distributional conflict, its commitment to political and cultural experimentation, the possibilities it left unexplored for managing scarcity under democracy. The irony of today’s political nostophobia is that there’s no path to a more prosperous, harmonious future across the liberal West without a return to the 1970s—without a relitigation of the contest between democracy and the market that the decade of Ford and Carter decided, to our collective detriment, in favor of money over the people. The 1970s were a missed opportunity. The 2020s need not repeat that decade’s mistakes.