More than 75 years ago, confidence in the market economy got a rude shock as the world sank into the Great Depression. Adam Smith had said that the market led the economy, as if by an invisible hand, to economic efficiency and societal well-being. It was hard to believe that Smith was right when one in four Americans was out of a job. Some economists stayed true to their faith in self-regulating markets; they said, just be patient, in the long run the market’s restorative forces will take hold, and we will recover. But Keynes’s retort ruled the day: In the long run, we are all dead. We could not wait. Today, even conservatives believe that government should intervene to maintain the economy at or near full employment.
Those who believe in free markets have now received another rude shock: We have not yet sunk into an “official” recession, but it has been more than half a year since any new jobs were created, and, meanwhile, our labor force continues to grow. If the Great Depression undermined our confidence in macroeconomics (the ability to maintain full employment, price stability, and sustained growth), it is our confidence in microeconomics (the ability of markets and firms to allocate labor and capital efficiently) that is now being destroyed. Resources were misallocated and risks were mismanaged so severely that the private sector had to go running to the government for help, lest the entire system melt down. Even with federal intervention, I have estimated the cumulative gap between what our economy could have produced—had we invested in actual businesses, rather than, say, mortgages for people who couldn’t afford their homes—and what we will produce over the period of our slowdown to be more than $1.5 trillion.
Blame has rightly fallen on the financial markets because it is their responsibility to allocate capital and manage risk, and their failure has revived several old concerns of the political (and economic) left. Some, looking at the U.S. economy’s decreasing reliance on manufacturing and increasing dependence on the service sector (including financial services), have long worried that the whole thing was a house of cards. After all, aren’t “hard objects”—the food we eat, the houses we live in, the cars and airplanes that transport us from one place to another, the gas and oil that provide heat and energy—the “core” of the economy? And if so, shouldn’t they represent a larger fraction of our national output?
The simple answer is no. We live in a knowledge economy, an information economy, an innovation economy. Because of our ideas, we can have all the food we can possibly eat—and more than we should eat—with only 2 percent of the labor force employed in agriculture. Even with only 9 percent of our labor force in manufacturing, we remain the largest producer of manufactured goods. It is better to work smart than to work hard, and our investments in education and technology have enabled us to enjoy higher standards of living—and to live longer—than ever before. America’s dominance in so many aspects of high-tech is testimony to the real returns to these soft expenditures. Indeed, I would argue that we would do even better if we devoted more resources to these sectors.
But the view that our recent success is based on a house of cards has more than a grain of truth to it. In recent years, financial markets created a giant rich man’s casino, in which well-off players could place trillion-dollar bets against each other. I am among those who believe that consenting adults should be allowed great freedom in what they do—as long as they don’t harm others. But there’s the rub. These high-rollers weren’t just gambling their own money. They were gambling other people’s money. They were putting at risk the entire financial system—indeed, our entire economic system. And now we are all paying the price.
FINANCIAL MARKETS HAVE BEEN LIKENED to the brain of the economy. They are supposed to allocate capital and manage risk. When they do their job well, economies prosper. When they do their job badly, as we are once again learning, everyone suffers. Financial markets are amply rewarded for their work—in recent years, they have received over 30 percent of corporate profits—and the standard mantra in economics was that these rewards were commensurate with their social return. That is, financial wizards might walk off with a great deal of money, but the rest of society is better off because our capital generates so much more productivity than in societies with less well-developed—and less rewarded—financial markets. Part of the reward that accrues to financial markets is thus for encouraging innovation—through venture capital firms and the like.
But not all innovations enhance welfare, even when they increase profits. For instance, cigarette profits may have increased when the tobacco industry developed products that were more addictive, but those who died as a result, and their families, were hardly better off; nor were the taxpayers who had to pick up the tab for the increased health care costs. Food companies that today take a page out of the same playbook—developing products that lead to compulsive eating and the resulting obesity epidemic—may be increasing profits, but not societal well-being. Microsoft was ingenious in its strategies to leverage the monopoly power it had from controlling the PC operating system; it increased its profits, but, in killing rivals like Netscape, it had a chilling effect on innovation.
The task of unraveling all that went wrong in our financial system is a difficult one, but in essence the financial system’s latest innovation was to devise fee structures that were often far from transparent and that allowed it to generate enormous profits—private rewards that were not commensurate with social benefits. The imperfections of information (resulting from the non-transparency) led to imperfections in competition, helping to explain why the usual maxim that competition drives profits to zero seemed not to hold. One should have suspected that something was wrong when bank after bank made so much money year after year. One should have suspected that something was wrong with the economic system when millions of Americans owed billions to credit card companies and banks in “late fees,” “penalties,” and a variety of other charges, transforming a high annual interest rate of 20 percent into a truly usurious effective interest rate of 100 percent or more for those who fell behind in their payments.
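To see how that arithmetic works, consider a stylized sketch in which fixed penalties pile on top of the headline rate; every number below (the balance, the fee, the number of missed payments) is an illustrative assumption, not a figure from any actual card agreement:

```python
# Stylized sketch: how late fees and penalties can inflate the effective
# interest rate on a small credit card balance. All figures below are
# illustrative assumptions, not actual card terms.

balance = 300.00       # assumed outstanding balance
apr = 0.20             # headline annual interest rate (20 percent)
late_fee = 39.00       # assumed fee charged each month a payment is missed
months_late = 8        # assumed number of missed payments over the year

interest = balance * apr              # a year of interest: $60
fees = late_fee * months_late         # penalties: $312
total_cost = interest + fees          # $372 to carry a $300 balance

effective_rate = total_cost / balance
print(f"Headline APR:          {apr:.0%}")             # 20%
print(f"Effective annual rate: {effective_rate:.0%}")  # 124%
```

The smaller the balance and the more often a borrower falls behind, the more the fixed fees dominate the headline rate—which is why the heaviest burdens fell on those least able to pay.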
Perhaps the worst problems—like those in the subprime mortgage market—occurred when non-transparent fee structures interacted with incentives for excessive risk-taking, in which financial managers got to keep the high returns made one year, even if those returns were more than offset by losses the next. Behind the subprime crisis were mortgages designed to encourage repeated refinancing of homes—a pyramid scheme that generated billions of dollars in fees for the mortgage companies as long as home prices continued to soar. It was inevitable that the bubble would burst. But, by then, the profits that had been pocketed would make these financial wizards secure for life—or, at least, that was their hope.
To put it another way, had those in the financial sector allocated capital and risk in a way that fueled the economy, they would have had handsome profits. But they wanted more, and so they established incentive structures that encouraged gambling. If they gambled and won, they could walk away with a share of the profits. If they gambled and lost, the investors would bear the consequences. It was almost as if the entire financial system had been converted into a giant casino in which the system was rigged to guarantee those running the games huge returns, at the expense of the players. But in Las Vegas and Atlantic City, the games are near zero-sum: The gains of the casino owners approximately equal the losses of the players. The financial-system-as-casino, on the other hand, is a negative-sum game. Those on Wall Street may have walked off with billions, but those billions are dwarfed by the costs to be paid by the rest of us. Some have lost their homes and life savings—to say nothing of their dreams for their own futures and those of their children. Others are innocent bystanders who resisted the false promises of the mortgage brokers and the credit card companies, but now find themselves out of jobs as the economy weakens. And the poor are hurt as state revenues plummet, forcing cutbacks in public services.
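A back-of-the-envelope calculation makes the asymmetry concrete; the probabilities, bet size, and fee share below are assumptions chosen purely for illustration:

```python
# Stylized sketch of the asymmetric payoff described above: the manager
# keeps a share of any gain, while investors absorb the full loss.
# All probabilities and dollar amounts are illustrative assumptions.

p_win = 0.5            # assumed chance the gamble pays off
gain = 100.0           # investors' gain if the bet wins ($ millions)
loss = 120.0           # investors' loss if the bet loses -- a negative-sum bet
manager_share = 0.20   # assumed share of gains paid to the manager

# Expected value of the bet itself: negative, so value is destroyed overall.
bet_ev = p_win * gain - (1 - p_win) * loss                             # -10.0

# The manager's expected payoff: a share of the gains, none of the losses.
manager_ev = p_win * manager_share * gain                              # +10.0

# The investors' expected payoff: gains net of fees, plus all the losses.
investor_ev = p_win * (1 - manager_share) * gain - (1 - p_win) * loss  # -20.0

print(f"Bet expected value:       {bet_ev:+.1f}")
print(f"Manager expected payoff:  {manager_ev:+.1f}")
print(f"Investor expected payoff: {investor_ev:+.1f}")
```

Under these assumed numbers, the manager comes out ahead even though the bet destroys value overall—precisely the sense in which the game is rigged in favor of the house and negative-sum for everyone else.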
The current woes in America’s financial system are not an isolated accident—a rare, once-in-a-century event. Indeed, there have been more than one hundred financial crises worldwide in the last 30 years or so. Here in the United States alone, we have had the S&L crisis in 1989, the dot-com/WorldCom/Enron problems of the early years of this decade, and now the subprime-morphing-into-the-beyond-subprime collapse. In addition to these national problems, there were regional troubles—real-estate crises fed by excessive lending in Texas and the Southwest in the mid-’80s, and in California and New England in the early ’90s. In each of these instances, financial markets failed to do what they were supposed to do in allocating capital and managing risk. In the late ’90s, for instance, so much capital was allocated to fiber optics that, by the time of the crash, it was estimated that 97 percent of fiber optic cable had seen no light.
In short, the problem with the U.S. economy is not that we have allocated too many resources to the “soft” areas and too few to the “hard.” It is not necessarily that we have allocated too many resources to the financial sector and rewarded it too generously—though a strong argument could be put forward to that effect. It is that too little effort was devoted to managing real risks that are important—enabling ordinary Americans to stay in their homes in the face of economic vicissitudes—and that too much effort went into creating financial products that enhanced risk. Too much energy has been spent trying to make an easy buck; too much effort has been devoted to increasing profits and not enough to increasing real wealth, whether that wealth comes from manufacturing or new ideas. We have learned a painful lesson, both in the 1930s and today: The invisible hand often seems invisible because it’s not there. At best, it’s more than a little palsied. At worst, the pursuit of self-interest—corporate greed—can lead to the kind of predicament confronting the country today.
Joseph Stiglitz is University Professor at Columbia University, winner of the 2001 Nobel Memorial Prize in Economics, and co-author of The Three Trillion Dollar War.