How a Democracy Dies

Donald Trump’s contempt for American political institutions is only the latest chapter in a history of opportunistic attacks against them, one that stretches back decades.

We tend to think of democracies dying at the hands of men with guns. During the Cold War, coups d’état accounted for nearly three out of every four democratic breakdowns. Military coups toppled Egyptian President Mohamed Morsi in 2013 and Thai Prime Minister Yingluck Shinawatra in 2014. In cases like these, democracy dissolves in spectacular fashion. Tanks roll in the streets. The president is imprisoned or shipped off into exile. The constitution is suspended or scrapped.

By and large, however, overt dictatorships have disappeared across much of the world. Violent seizures of power are rare. But there’s another way to break a democracy: not at the hands of generals, but of elected leaders who subvert the very process that brought them to power. In Venezuela, Hugo Chávez was freely elected president, but he used his soaring popularity (and the country’s vast oil wealth) to tilt the playing field against opponents, packing the courts, blacklisting critics, bullying independent media, and eventually eliminating presidential term limits so that he could remain in power indefinitely. In Hungary, Prime Minister Viktor Orbán used his party’s parliamentary majority to pack the judiciary with loyalists and rewrite the constitutional and electoral rules to weaken opponents. Elected leaders have similarly subverted democratic institutions in Ecuador, Georgia, Peru, the Philippines, Poland, Russia, Sri Lanka, Turkey, Ukraine, and elsewhere. In these cases, there are no tanks in the streets. Constitutions and other nominally democratic institutions remain in place. People still vote. Elected autocrats maintain a veneer of democracy while eviscerating its substance. This is how most democracies die today: slowly, in barely visible steps.

How vulnerable is American democracy to such a fate? Extremist demagogues emerge from time to time in all societies, even in healthy democracies. An essential test of this kind of vulnerability isn’t whether such figures emerge but whether political leaders, and especially political parties, work to prevent them from gaining power. When established parties opportunistically invite extremists into their ranks, they imperil democracy.

Once a would-be authoritarian makes it to power, democracies face a second critical test: Will the autocratic leader subvert democratic institutions or be constrained by them? Institutions alone are not enough to rein in elected autocrats. Constitutions must be defended—by political parties and organized citizens, but also by democratic norms, or unwritten rules of toleration and restraint. Without robust norms, constitutional checks and balances do not serve as the bulwarks of democracy we imagine them to be. Instead, institutions become political weapons, wielded forcefully by those who control them against those who do not. This is how elected autocrats subvert democracy—packing and “weaponizing” the courts and other neutral agencies, buying off the media and the private sector (or bullying them into silence), and rewriting the rules of politics to permanently disadvantage their rivals. The tragic paradox of the electoral route to authoritarianism is that democracy’s enemies use the very institutions of democracy—gradually, subtly, and even legally—to kill it.

The United States failed the first test in November 2016, when it elected a president with no real allegiance to democratic norms. Donald Trump’s surprise victory was made possible not only by public disaffection but also by the Republican Party’s failure to keep an extremist demagogue from gaining the nomination.

How serious a threat does this now represent? Many observers take comfort in the U.S. Constitution, which was designed precisely to thwart and contain demagogues like Trump. The Madisonian system of checks and balances has endured for more than two centuries. It survived the Civil War, the Great Depression, the Cold War, and Watergate. Surely, then, it will be able to survive the current president?

We are less certain. Democracies work best—and survive longer—when constitutions are reinforced by norms of mutual toleration and restraint in the exercise of power. For most of the twentieth century, these norms functioned as the guardrails of American democracy, helping to avoid the kind of partisan fights-to-the-death that have destroyed democracies elsewhere in the world, including in Europe in the 1930s and South America in the 1960s and 1970s. But those norms are now weakening. By the time Barack Obama became president, many Republicans, in particular, questioned the legitimacy of their Democratic rivals and had abandoned restraint for a strategy of winning by any means necessary. Donald Trump has accelerated this process, but he didn’t cause it. The challenges we face run deeper than one president, however troubling this one might be.

The reason no extremist demagogue won the presidency before 2016 is not the absence of contenders for such a role. To the contrary, extremist figures have long dotted the landscape of American politics, from Henry Ford and Huey Long to Joseph McCarthy and George Wallace. An important protection against would-be authoritarians has been not so much the country’s firm commitment to democracy as our political parties, democracy’s gatekeepers.

Because parties nominate presidential candidates, they have the ability—and the responsibility—to keep antidemocratic figures out of the White House. They must, accordingly, strike a balance between two roles: one in which they choose the candidates who best represent the party’s voters; and another, what the political scientist James Ceaser calls a “filtration” role, in which they screen out those who pose a threat to democracy or are otherwise unfit to hold office.

These dual imperatives—choosing a popular candidate and keeping out demagogues—may, at times, conflict with each other. What if the people choose a demagogue? This is the recurring tension at the heart of the U.S. presidential nomination process, from the founders’ era through today. An overreliance on gatekeeping can create a world of party bosses who ignore the rank and file and fail to represent the people. But an overreliance on the “will of the people” can also be dangerous, for it can lead to the election of a demagogue who threatens democracy itself. There is no escape from this tension. There are always trade-offs.

For most of American history, political parties prioritized gatekeeping over openness. There was always some form of the proverbial smoke-filled room for this. In the early nineteenth century, presidential candidates were chosen by groups of congressmen in Washington, a system known as the congressional caucus. Then, beginning in the 1830s, candidates were nominated in national party conventions made up of delegates from each state. Any candidate who lacked support among each party’s network of state and local politicians had no chance of success. Primary elections were introduced during the Progressive Era in an effort to dismantle excessive party control. But these brought little change—in part because many states didn’t use them, but mostly because the elected delegates weren’t required to support the candidate who won the primary. So real power remained in the hands of party officials and officeholders.

These “organization men” were hardly representative of American society. Indeed, they were the virtual definition of an old boys’ network. Most rank-and-file party members (especially the poor and politically unconnected, women, and minorities) weren’t represented in the parties’ smoke-filled rooms and so were largely excluded from the presidential nomination process. The convention system, on the other hand, was an effective gatekeeping institution. It systematically filtered out would-be authoritarian candidates. Dangerous outsiders simply couldn’t win the party nomination. And as a result, most didn’t even try.

This changed after 1968. The riotous Democratic National Convention in Chicago triggered far-reaching reform. Following presidential candidate Hubert Humphrey’s defeat that fall, the Democratic Party created the McGovern-Fraser Commission and tasked it with rethinking the nomination system. The commission’s final report, published in 1971, cited an old adage: “The cure for the ills of democracy is more democracy.” With the legitimacy of the presidential selection process at stake, party leaders felt intense pressure to transform it. As George McGovern put it, “Unless changes are made, the next convention will make the last look like a Sunday-school picnic.” If the people were not given a real say, the McGovern-Fraser report warned, they would turn to “the anti-politics of the street.”

The commission issued a set of recommendations that the two parties adopted before the 1972 election. What emerged was a system of binding presidential primaries. This meant, in principle, that for the first time, the delegates who chose the parties’ presidential candidates would neither be controlled by party leaders nor free to make backroom deals at the convention; rather, they would faithfully reflect the will of their states’ primary voters. Democratic National Committee Chair Larry O’Brien called the reforms “the greatest goddamn changes since the party system.” McGovern, who unexpectedly won the 1972 Democratic nomination, called the new primary system “the most open political process in our national history.” For the first time, the party gatekeepers could be circumvented—and beaten.

Some political scientists worried about the new system. Nelson Polsby and Aaron Wildavsky warned, for example, that primaries could “lead to the appearance of extremist candidates and demagogues” who, unrestrained by party allegiances, “have little to lose by stirring up mass hatreds or making absurd promises.” Initially, these fears seemed overblown. Outsiders did emerge: Civil rights leader Jesse Jackson ran for the Democratic Party nomination in 1984 and 1988, while Southern Baptist leader Pat Robertson (1988), television commentator Pat Buchanan (1992, 1996, 2000), and Forbes magazine publisher Steve Forbes (1996) ran for the Republican nomination. But they all lost.

Circumventing the party establishment was, it turned out, easier in theory than in practice. Capturing a majority of delegates required winning primaries all over the country, which, in turn, required money, favorable media coverage, and, crucially, people working on the ground in all states. Any candidate seeking to complete the grueling obstacle course of U.S. primaries needed allies among donors, newspaper editors, interest groups, activist groups, and state-level politicians such as governors, mayors, senators, and congressmen. In 1976, Arthur Hadley described this arduous process as the “invisible primary.” He claimed that this phase, which occurred before the primary season even began, was “where the winning candidate is actually selected.” Without the support of the party establishment, Hadley argued, it was nearly impossible to win the nomination. For a quarter of a century, he was right.

In June 2015, when Donald Trump descended an escalator to the lobby of Trump Tower to announce that he was running for president, there was little reason to think he could succeed where previous outsiders had failed. His opponents were career politicians and lifelong Republicans. Not only did Trump lack any political experience, but he had switched his party registration several times and even contributed to Hillary Clinton’s campaign for the Senate. His weakness among party insiders, most observers believed, would spell his demise.

By the time Trump won the March 1 Super Tuesday primaries, however, he had laid waste to Hadley’s “invisible primary,” rendering it irrelevant. Undoubtedly, Trump’s celebrity status played a role. But equally important was the changed media landscape. Trump had the sympathy or support of right-wing media personalities such as Sean Hannity, Ann Coulter, Mark Levin, and Michael Savage, as well as the increasingly influential Breitbart News. He also found new ways to use old media as a substitute for party endorsements and traditional campaign spending. By one estimate, the Twitter accounts of MSNBC, CNN, CBS, and NBC—four outlets that no one could accuse of pro-Trump leanings—mentioned Trump twice as often as Hillary Clinton. According to another study, Trump enjoyed up to $2 billion in free media coverage during the primary season. Trump didn’t need traditional Republican power brokers. The gatekeepers of the invisible primary weren’t merely invisible; by 2016, they were gone entirely.

The fact that Trump was able to capture the Republican nomination for president should have set off alarm bells. No other major presidential candidate in modern U.S. history, including Richard Nixon, had demonstrated such a weak public commitment to constitutional rights and democratic norms. When gatekeeping institutions fail, mainstream politicians have to do everything possible to keep dangerous figures away from the centers of power. For Republicans that meant doing the unthinkable: backing Hillary Clinton.

There is a recent global precedent for such a move. In France’s 2017 presidential election, the conservative Republican Party candidate, François Fillon, who was defeated in the race’s first round, called on his partisans to vote for center-left candidate Emmanuel Macron in the runoff to keep far-right candidate Marine Le Pen out of power. And in 2016, many Austrian conservatives backed Green Party candidate Alexander Van der Bellen to prevent the election of far-right radical Norbert Hofer. In cases like these, politicians endorsed ideological rivals—running the risk of angering their party base but redirecting many of their voters to keep extremists out of power.

If Republican leaders had broken decisively with Trump, telling Americans loudly and clearly that he posed a threat to the country’s cherished democratic institutions, he might never have ascended to the presidency. The hotly contested, red-versus-blue dynamics of the previous four elections would have been disrupted, and the Republican electorate would have split. What happened, unfortunately, was very different. Despite their hesitation and reservations, most Republican leaders closed ranks behind Trump, creating the image of a unified party. The election was normalized. The race narrowed. Trump won.

America’s constitutional system of checks and balances was designed to prevent leaders from concentrating and abusing power, and for most of our history, it has succeeded. Abraham Lincoln’s concentration of power during the Civil War was reversed by the Supreme Court after the war ended. Richard Nixon’s illegal wiretapping, exposed after the 1972 Watergate break-in, triggered a high-profile congressional investigation and bipartisan pressure for a special prosecutor, which eventually forced his resignation in the face of certain impeachment. In these and other instances, our political institutions served as crucial bulwarks against authoritarian tendencies.

But constitutional safeguards, by themselves, aren’t enough to secure a democracy once an authoritarian is elected to power. Even well-designed constitutions can fail. Germany’s 1919 Weimar constitution was designed by some of the country’s greatest legal minds. Its long-standing and highly regarded Rechtsstaat (“rule of law”) was considered by many as sufficient to prevent government abuse. But both the constitution and the Rechtsstaat dissolved rapidly in the face of Adolf Hitler’s rise to power in 1933.

Or consider the experience of postcolonial Latin America. Many of the region’s newly independent republics modeled themselves directly on the United States, adopting U.S.-style presidentialism, bicameral legislatures, supreme courts, and in some cases, electoral colleges and federal systems. Some wrote constitutions that were near-replicas of the U.S. Constitution. Yet nearly all of the region’s embryonic republics plunged into civil war and dictatorship. Argentina’s 1853 constitution, for example, closely resembled ours: Two-thirds of its text was taken directly from the U.S. Constitution. Yet these constitutional arrangements did little to prevent fraudulent elections in the late nineteenth century, military coups in 1930 and 1943, and Juan Perón’s populist autocracy.

Likewise, the Philippines’ 1935 constitution has been described by legal scholar Raul Pangalangan as a “faithful copy of the U.S. Constitution.” Drafted under U.S. colonial tutelage and approved by the U.S. Congress, the charter “provided a textbook example of liberal democracy,” with a separation of powers, a bill of rights, and a two-term limit on the presidency. But President Ferdinand Marcos, who was loath to step down when his second term ended, dispensed with it rather easily after declaring martial law in 1972.

If constitutional rules alone do not secure democracy, then what does? Much of the answer lies in the development of strong democratic norms. Two norms stand out: mutual toleration, or accepting one’s partisan rivals as legitimate (not treating them as dangerous enemies or traitors); and forbearance, or deploying one’s institutional prerogatives with restraint rather than exploiting the letter of the Constitution to undermine its spirit (a practice legal scholar Mark Tushnet calls “constitutional hardball”).

Donald Trump is widely and correctly criticized for assaulting democratic norms. But Trump didn’t cause the problem. The erosion of democratic norms began decades ago.

In 1979, newly elected Congressman Newt Gingrich came to Washington with a blunter, more cutthroat vision of politics than Republicans were accustomed to. Backed by a small but growing group of loyalists, Gingrich launched an insurgency aimed at instilling a more “combative” approach in the party. Taking advantage of a new media technology, C-SPAN, Gingrich deliberately employed hateful, over-the-top rhetoric. He described Democrats in Congress as corrupt and sick. He questioned his Democratic rivals’ patriotism. He even compared them to Mussolini and accused them of trying to destroy the country.

Through a new political advocacy group, GOPAC, Gingrich and his allies worked to spread these tactics across the party. GOPAC produced more than 2,000 training audiotapes, distributing them each month to get the recruits of Gingrich’s “Republican Revolution” on the same rhetorical page. Gingrich’s former press secretary Tony Blankley compared this distribution of audiotapes to a tactic Ayatollah Khomeini used on his route to power in Iran.

Though few realized it at the time, Gingrich and his allies were on the cusp of a new wave of polarization rooted in growing public discontent, particularly among the Republican base. Gingrich didn’t create this polarization, but he was one of the first Republicans to sense—and exploit—the shift in popular sentiment. And his leadership helped to establish “politics as warfare” as the GOP’s dominant strategy.

After their landslide win in the 1994 midterm elections, congressional Republicans began to seek victory by “any means necessary.” House Republicans refused to compromise in budget negotiations, for example, leading to a five-day government shutdown in November 1995 and a 21-day shutdown a month later. This was a dangerous turn. As norms of forbearance weakened, checks and balances began to devolve into deadlock and dysfunction.

The apogee of ’90s constitutional hardball was the December 1998 House vote to impeach President Bill Clinton. Only the second presidential impeachment in U.S. history, the move ran afoul of long-established norms. The investigation, which began with the dead-end Whitewater inquiry and ultimately centered on Clinton’s testimony about an extramarital affair, never revealed anything approaching the conventional standard for “high crimes and misdemeanors.” House Republicans also moved ahead with impeachment without bipartisan support, which meant that Clinton would almost certainly not be convicted by the Senate (he was acquitted there in February 1999). In an act without precedent in U.S. history, House Republicans had politicized the impeachment process, downgrading it, in the words of congressional experts Thomas Mann and Norman Ornstein, to “just another weapon in the partisan wars.”

Despite George W. Bush’s promise to be a “uniter, not a divider,” partisan warfare only intensified during his eight years in office. Bush governed hard to the right, abandoning all pretense of bipartisanship on the counsel of Karl Rove, who had concluded that the electorate was so polarized by this time that Republicans could win by mobilizing their own base rather than appealing to independent voters. And with the exception of the aftermath of the September 11 attacks and subsequent military actions in Afghanistan and Iraq, congressional Democrats eschewed bipartisan cooperation in favor of obstruction. Harry Reid and other Senate leaders used Senate rules to slow down or block Republican legislation and broke with precedent by routinely filibustering Bush proposals they opposed.

Senate Democrats also began obstructing an unprecedented number of Bush’s judicial nominees, either by rejecting them outright or by allowing them to languish by not holding hearings. The norm of deference to the president on judicial appointments was dissolving. Indeed, The New York Times quoted one Democratic strategist as saying that the Senate needed to “change the ground rules ... there [is] no obligation to confirm someone just because they are scholarly or erudite.” After the Republicans won back the Senate in 2002, the Democrats turned to filibusters to block the confirmation of several appeals court nominations. Republicans reacted with outrage. Conservative columnist Charles Krauthammer wrote that “one of the great traditions, customs, and unwritten rules of the Senate is that you do not filibuster judicial nominees.” During the 110th Congress, the last of Bush’s presidency, the number of filibusters reached an all-time high of 139—nearly double that of even the Clinton years.

If Democrats abandoned procedural forbearance in order to obstruct the president, Republicans did so in order to protect him. As Mann and Ornstein put it, “Long-standing norms of conduct in the House ... were shredded for the larger goal of implementing the president’s program.” The GOP effectively abandoned oversight of a Republican president, weakening Congress’s ability to check the executive. Whereas the House had conducted 140 hours of sworn testimony investigating whether Clinton had abused the White House Christmas card list in an effort to drum up new donors, it never subpoenaed the White House during the first five years of Bush’s presidency. Congress resisted oversight of the Iraq War, launching only superficial investigations into serious abuses, including the torture at Abu Ghraib. The congressional watchdog became a lapdog, abdicating its institutional responsibilities.

The assault on the basic norms governing American democracy escalated during Barack Obama’s presidency. Challenges to Obama’s legitimacy, which had begun with fringe conservative authors, talk-radio personalities, TV talking heads, and bloggers, were soon embodied in a mass political movement: the Tea Party, which started to organize just weeks after Obama’s inauguration.

Two threads that broke with established norms consistently ran through Tea Party discourse. One was that Obama posed an existential threat to our democracy. Just days after Obama’s election, Georgia Congressman Paul Broun warned of a coming dictatorship comparable to Nazi Germany or the Soviet Union. Iowa Tea Partier Joni Ernst, who would soon be elected to the U.S. Senate, claimed that Obama “has become a dictator.”

The second thread was that Obama was not a “real American”—a claim that was undoubtedly fueled by racism. According to Tea Party activist and radio host Laurie Roth: “We are seeing a worldview clash in our White House. A man who is a closet secular-type Muslim, but he’s still a Muslim. He’s no Christian. We’re seeing a man who’s a Socialist Communist in the White House, pretending to be an American.” The “birther movement” went even further, questioning whether Obama was born in the United States—and thus challenging his constitutional right to hold the presidency.

Attacks of this kind have a long pedigree in American history. Henry Ford, Father Coughlin, and the John Birch Society all adopted similar language. But the challenges to Obama’s legitimacy were different in two important ways. First, they were not confined to the fringes but came to be widely accepted by Republican voters. In a 2011 Fox News poll, 37 percent of Republicans said Obama was not born in the United States, and 63 percent said they had some doubts about his origins. In a CNN/ORC poll, 43 percent of Republicans said they believed he was a Muslim, and a Newsweek poll found that a majority of Republicans believed Obama favored the interests of Muslims over those of other religions.

Second, unlike past episodes of extremism, this wave reached into the upper ranks of the Republican Party. For more than a century, with the exception of the McCarthy period, the two major parties had kept such intolerance at the margins. Neither Father Coughlin nor the John Birch Society had the ear of top party leaders. Now, open attacks on Obama’s legitimacy (and later, Hillary Clinton’s) came from the mouths of leading national politicians.

In recent years, the Tea Party’s extreme views have become fully integrated into the Republican mainstream. In 2010, more than 130 Tea Party–backed candidates ran for Congress, and more than 40 were elected. By 2011, the House Tea Party Caucus had 60 members, and in 2012, Tea Party–friendly candidates emerged as contenders for the Republican presidential nomination. In 2016, the Republican nomination went to a birther, at a national party convention in which Republican leaders called their Democratic rival a criminal and led chants of “Lock her up.” For the first time in many decades, top Republican figures—including one who would soon be president—had overtly abandoned norms of mutual toleration, goaded by a fringe that was no longer fringe.

HOW DEMOCRACIES DIE by Steven Levitsky and Daniel Ziblatt
Crown, 320 pp., $26.00

If, 25 years ago, someone had described to you a country where candidates threatened to lock up their rivals, political opponents accused the government of election fraud, and parties used their legislative majorities to impeach presidents and steal Supreme Court seats, you might have thought of Ecuador or Romania. It wouldn’t have been the United States of America.

But Democrats and Republicans have become much more than just two competing parties, sorted into liberal and conservative camps. Their voters are now deeply divided by race, religious belief, culture, and geography. Republican politicians from Newt Gingrich to Donald Trump learned that in a polarized society, treating rivals as enemies can be useful—and that the pursuit of politics as warfare can mobilize people who fear they have much to lose. War has its price, though. For now, the American political order and its institutions remain intact. But the mounting assault on the norms that sustain them should strike fear in anyone who hopes to see the United States secure a democratic future.

This article was adapted from How Democracies Die by Steven Levitsky and Daniel Ziblatt, to be published by Crown, a division of Penguin Random House LLC, on January 16, 2018.