"Something more will have to be cut": graffiti in Seville, Spain, November 2012.
Reuters / Marcelo Del Pozo

Unable to take constructive action toward any common end, the U.S. Congress has recently been reduced to playing an ongoing game of chicken with the American economy. The debt-ceiling debacle gave way to the “fiscal cliff,” which morphed into the across-the-board cuts in military and discretionary spending known as “sequestration.” Whatever happens next on the tax front, further cuts in spending seem likely. And so a modified form of the austerity that has characterized policymaking in Europe since 2010 is coming to the United States as well; the only questions are how big the hit will end up being and who will bear the brunt. What makes all this so absurd is that the European experience has shown yet again why joining the austerity club is exactly the wrong thing for a struggling economy to do.

The eurozone countries, the United Kingdom, and the Baltic states have volunteered as subjects in a grand experiment that aims to find out if it is possible for an economically stagnant country to cut its way to prosperity. Austerity -- the deliberate deflation of domestic wages and prices through cuts to public spending -- is designed to reduce a state’s debts and deficits, increase its economic competitiveness, and restore what is vaguely referred to as “business confidence.” The last point is key: advocates of austerity believe that slashing spending spurs private investment, since it signals that the government will neither be crowding out the market for investment with its own stimulus efforts nor be adding to its debt burden. Consumers and producers, the argument goes, will feel confident about the future and will spend more, allowing the economy to grow again.

In line with such thinking, and following the shock of the recent financial crisis, which caused public debt to balloon, much of Europe has been pursuing austerity consistently for the past four years. The results of the experiment are now in, and they are equally consistent: austerity doesn’t work. Most of the economies on the periphery of the eurozone have been in free fall since 2009, and in the fourth quarter of 2012, the eurozone as a whole contracted for the first time ever. Portugal’s economy shrank by 1.8 percent, Italy’s fell by 0.9 percent, and even the supposed powerhouse of the region, Germany, saw its economy contract by 0.6 percent. The United Kingdom, despite not being in the eurozone, only barely escaped having the developed world’s first-ever triple-dip recession. 

The only surprise is that any of this should come as a surprise. After all, the International Monetary Fund warned in July 2012 that simultaneous cuts to state spending across interlinked economies during a recession when interest rates were already low would inevitably damage the prospects for growth. And that warning came on top of the already ample evidence that every country that had embraced austerity had significantly more debt than when it started. Portugal’s debt-to-GDP ratio increased from 62 percent in 2006 to 108 percent in 2012. Ireland’s more than quadrupled, from 24.8 percent in 2007 to 106.4 percent in 2012. Greece’s debt-to-GDP ratio climbed from 106 percent in 2007 to 170 percent in 2012. And Latvia’s debt rose from 10.7 percent of GDP in 2007 to 42 percent in 2012. None of these statistics even begin to factor in the social costs of austerity, which include unemployment levels not seen since the 1930s in the countries that now make up the eurozone. So why do governments keep on treading this path?

Austerity became and remains the default policy response to the financial crisis in the eurozone for both material and ideological reasons. Materially, this is because there have been few other easily available policy options. Unlike the United States, which was able to bail out its banks in 2008 because it had a central Treasury and a central bank able to accept any sort of collateral it wanted, the EU had to prop up its own failing banking system (which was three times as large and twice as leveraged as the U.S. banking system) with little more than some additional liquidity, spending cuts, and incantations of its “unshakable commitment to the euro.” The U.S. banking system has shed its debt and recapitalized, and it is now ready to grow. The EU, given its institutional makeup, has not even been able to start that process. As a result, the eurozone economies continue to contract, in spite of the increasingly dubious promise that confidence is returning. 

Ideologically, it is the intuitive appeal of the idea of austerity -- of not spending more than you have -- that really casts its spell. Understanding how austerity came to be the standard policy in liberal economic thought when states get into trouble can reveal why it is so seductive and so dangerous.

HOLEY WRIT

Austerity is a seductive idea because of the simplicity of its core claim -- that you can’t cure debt with more debt. This is true as far as it goes, but it does not go far enough. Three less obvious factors undermine the simple argument that countries in the red need to stop spending. The first factor is distributional, since the effects of austerity are felt differently across different levels of society. Those at the bottom of the income distribution lose proportionately more than those at the top, because they rely far more on government services and have little wealth with which to cushion the blows. The 400 richest Americans own more assets than the poorest 150 million; the bottom 15 percent, some 46 million people, live in households earning less than $22,050 per year. Trying to get the lower end of the income distribution to pay the price of austerity through cuts in public spending is both cruel and mathematically difficult. Those who can pay won’t, while those who can’t pay are being asked to do so.

The second factor is compositional; everybody cannot cut their way to growth at the same time. To put this in the European context, although it makes sense for any one state to reduce its debt, if all states in the currency union, which are one another’s major trading partners, cut their spending simultaneously, the result can only be a contraction of the regional economy as a whole. Proponents of austerity are blind to this danger because they get the relationship between saving and spending backward. They think that public frugality will eventually promote private spending. But someone has to spend for someone else to save, or else the saver will have no income to hold on to. Similarly, for a country to benefit from a reduction in its domestic wages, thus becoming more competitive on costs, there must be another country willing to spend its money on what the first country produces. If all states try to cut or save at once, as is the case in the eurozone today, then no one is left to do the necessary spending to drive growth.
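To make the compositional point concrete, consider the standard sectoral-balances identity from national accounting -- a textbook relation offered here purely as illustration, with the symbols (S, I, T, G, X, M) chosen for this sketch rather than drawn from any of the sources discussed:

\[ (S - I) + (T - G) = X - M \]

Here S is private saving, I is private investment, T is tax revenue, G is government spending, and X - M is net exports. If every member of a currency union tries to push both its private balance (S - I) and its fiscal balance (T - G) into surplus at once, each needs a trade surplus with the others; but the members' net exports to one another sum to zero, so the collective attempt can resolve itself only through falling incomes.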

The third factor is logical; the notion that slashing government spending boosts investor confidence does not stand up to scrutiny. As the economist Paul Krugman and others have argued, this claim assumes that consumers anticipate and incorporate all government policy changes into their lifetime budget calculations. When the government signals that it plans to cut its expenditures dramatically, the argument goes, consumers realize that their future tax burdens will decrease. This leads them to spend more today than they would have done without the cuts, thereby ending the recession despite the collapse of the economy going on all around them. The assumption that this behavior will actually be exhibited by financially illiterate, real-world consumers who are terrified of losing their jobs in the midst of a policy-induced recession is heroic at best and foolish at worst.
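The formal scaffolding behind this confidence claim is the doctrine of Ricardian equivalence, which assumes each household plans consumption against a lifetime budget constraint roughly like the following -- a stylized statement for illustration, not a formulation taken from any particular austerity advocate:

\[ \sum_{t=0}^{\infty} \frac{C_t}{(1+r)^t} = W_0 + \sum_{t=0}^{\infty} \frac{Y_t - T_t}{(1+r)^t} \]

where C_t is consumption, Y_t is income, T_t is taxes, W_0 is initial wealth, and r is the interest rate. A household that truly solved this problem would treat announced spending cuts as lower future taxes, raising the present value of its lifetime after-tax income and hence its consumption today. The objection above is precisely that credit-constrained, imperfectly informed households do not optimize over an infinite horizon.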

Austerity, then, is a dangerous idea, because it ignores the externalities it generates, the impact of one person’s choices on another’s, and the low probability that people will actually behave in the way that the theory requires. To understand why such a threadbare set of ideas became the Western world’s default stance on how to get out of a recession, we need to consult a few Englishmen, two Scots, and three Austrians.

A LIBERAL TENSION

Austerity’s origins lie in a tension within liberal economic thinking about the state. In the second of his Two Treatises of Government, the seventeenth-century English political theorist John Locke accepted the inevitability of inequality stemming from the invention of money and private property. But having done so, he also had to acknowledge the need for a state to police the inequities that the market produces. Any state that could do this effectively, however, would also be strong enough to threaten the property holders it was meant to protect. And so a tension was born in the heart of liberalism: you can’t live with the state, since it might rob you, but you also can’t live without it, since the mob might kill you. Later, when the eighteenth-century Scottish thinkers David Hume and Adam Smith turned to this tension, it took on another dimension: how to pay for the state that you fear but nonetheless need. The solution seemed to be government debt, but neither Hume nor Smith liked that answer.

As Hume and Smith each noted, the government could borrow money by offering merchants the chance to make less risky investments with the same, or better, level of reward through the instrument of government debt. And for these investors, buying such debt would have the handy upside of funding the state they needed without their having to pay taxes. Indeed, the state pays them to fund it. But the problem with this free option is that it is not really free. In order to find buyers for its debt, the state must offer better returns than those offered on other investments, and those terms divert money away from market-driven investments toward inherently wasteful government spending. This process ends up reducing growth, increasing interest rates, and leaving the state indebted, first to local merchants, and then to foreigners. Rather than solve the problem of how to pay for the state, this process leads to perpetually increasing taxes and, as Smith warned, the inevitable ruin of the lender, as “the idle and profuse debtor [earns] at the expense of the industrious and frugal creditor, . . . transporting a great part of the national capital . . . to those which are likely to dissipate and destroy it.” Given this, Hume and Smith concluded that the poison of government debt had to be resisted at all costs, even if it seemed appealing as a short-term solution for funding the state.

Nineteenth-century British liberal thinkers tried to resolve this tension in two different ways. Some, such as David Ricardo, sought to banish the state from the economy altogether, seeing its actions as counterproductive interventions in what was an otherwise self-equilibrating system. Yet others, such as John Stuart Mill, began to see a role for the state beyond policing inequalities. Mill went so far as to argue that government debt need not bankrupt a country and could even be used to fund useful social investments. For Mill and his ideological brethren, capitalism could not function properly in the modern world without increased state intervention. They considered the self-equilibration that Ricardo predicted unlikely because of labor agitation, business-cycle volatility, demands for suffrage, and a world of unemployment and poverty amid plenty. 

So in the twentieth century, liberalism began to split along two tracks. On one, following Ricardo, some Austrian economists, notably Joseph Schumpeter, Ludwig von Mises, and Friedrich Hayek, rejected ever more firmly the state, its interventions, and its debt. On the other, following Mill, a group of British economists, including John Hobson, William Beveridge, and, ultimately, John Maynard Keynes, made their peace with a more active and, when needed, indebted state.

THE LIQUIDATIONIST TRAP

Although a fear of the state and its debt has been hard-wired into liberalism since its inception, it was not until states emerged that were big enough to be cut that opposition to government debt became a policy fad. In the 1920s and 1930s, particularly in Austria and the United States, a growing number of economists sought to explain why real economies, in spite of their supposed tendencies toward self-equilibration, seemed to boom and bust and slump quite spectacularly. The answer given by this school of thought was that banks lent too much money, which led to the misallocation of capital to dubious investments. Eventually, and inevitably, the cheap money fueling these investments would dry up, interest rates would rise, and bankruptcies would follow. The upshot, as Andrew Mellon, U.S. treasury secretary under President Herbert Hoover, put it, is that this will “purge the rottenness out of the system. . . . People will . . . live a more moral life. . . . And enterprising people will pick up the wrecks from less competent people.” 

In short, the Austrians argued, the binge of debt financing could be cured only by the purge of austerity. The role of the state was to get out of the way and let the process unfold. “Liquidationism” -- letting failing enterprises be liquidated as a solution to economic problems -- was the name of the game, and so Washington tried it during the Great Depression. And just like today in the eurozone, it simply didn’t work. Over in the United Kingdom, a similar approach, arguing that government spending to halt a slump would merely increase debt and crowd out private investment, became known as “the Treasury view.” It, too, was put into practice, and it, too, failed, making the British economy slump even further.

It took Keynes’ General Theory, combined with the repeated failures of austerity to salvage slumping economies during the 1930s, to kill austerity as a respectable idea. The same three arguments raised above -- about distribution, composition, and logic -- were critical. Together with the practical results of the policy experiments of the 1930s and 1940s -- including the experience of World War II, which seemed to vindicate the need for and efficacy of massive government intervention into the economy -- these arguments reframed the case for austerity, and it collapsed. Why, then, did it come back with such force more than 60 years later? To answer that question, we need to move once again to the United States, where the Austrian model of booms and busts found unexpected resonance in the financial crisis of 2008, and journey from there to postwar Germany, where austerity thinking managed to survive the long Keynesian winter and give birth to the crisis response one can see in the eurozone today.

HOW THE GERMANS DID IT

One of the odd things about graduate programs in economics after the 1970s, when stagflation finally took the gloss off Keynesianism, was that one could work toward a Ph.D. in the finest schools in the United States and never take a class in money, banking, or credit. This was because in the neoclassical framework that emerged after the Keynesian heyday, money was seen to be neutral in its long-term effects (it changed neither preferences nor possibilities), while agents’ expectations were seen to be farsighted and rational. In such a happy world of self-equilibration, credit is simply one person’s deferred income transferred to another person, and banks are simply conduits for investment. The 2008 financial crisis, which revealed an actual world of hyperleveraged excessive lending, overborrowing, and willful risk blindness on the part of supposedly rational actors, came as quite a shock to this mindset. But it didn’t come as a shock to anyone still reading those austere Austrians.

The crisis seemed to play out exactly according to Mises’ and Hayek’s model of slumps: banks lent too much money, states backstopped the banks, consumers borrowed too much, and capital was misallocated, feeding an epic housing bubble from 2000 to 2007. The model carried a clear policy prescription: don’t bail out the banks. But after that had already been done and the private debt of the banking system had been shifted over to the public balance sheet, the only thing left to do -- just as the Austrians had argued in the 1920s and 1930s -- was to cut the budget, reduce the debt, accelerate the bankruptcies of underwater businesses and individuals, and let the “enterprising people . . . pick up the wrecks from less competent people.”

Liquidationism was back, but only because economists and policymakers had forgotten the earlier arguments against it during the three-decade-long neoliberal interregnum. In a world of efficient markets and rational consumers, the type of crisis now facing the state had been deemed theoretically impossible. So when it hit, the only approach standing that took banks and booms and busts seriously was the Austrian one -- for which we can partly thank the Germans.

Given Germany’s history with inflation and deflation in the 1920s and 1930s, financial stability has always been the watchword of postwar German economics. But what has really distinguished German economic thinking is its dismissal of Keynesianism -- because the theory never made much sense to German policymakers considering the way the German economy actually functions. 

German economic growth has always been export-led. Berlin’s priorities after World War II were thus to invest in rebuilding the country’s capital stock (which meant keeping a lid on domestic consumption) and to recover export markets (which meant keeping costs, and thus wages, low). With external demand more important than internal demand, growth was determined by competitiveness and monetary stability, not domestic consumption. All government stimulus programs would do in this system is increase the costs of production and lower export demand.

This is a great economic model for a supply-side, export-led economy with a strong monetary authority and supercompetitive products. The problem is that, like the Highlander, there can be only one. Not every European country can be a Germany and run a surplus; others need to run deficits, just as for someone to save, someone else needs to spend. Unfortunately, Germany was able to design the key institutions of the EU and the eurozone in its own image, creating a strong competition authority and an extremely independent and inflation-obsessed central bank. So when the Greek crisis hit, Germany’s particular objection to Keynesianism was translated into the prevailing policy stance for an entire regional economy, with disastrous results.

Germany could afford to cut its way to growth, since the sources of its growth lay outside its borders: it is the export champion of the world. But the whole of Europe cannot play that trick, especially as the Asian countries are also running surpluses. As the Financial Times columnist Martin Wolf asked, “Is everybody supposed to run current account surpluses? If so, with whom -- Martians?” The ideas that informed the institutional design of the postwar German economy and the EU may work well for Germany, but they work terribly for the continent as a whole, which cannot run a surplus no matter how hard it tries. Once again, composition matters.

To see what will happen next, we can look back to the last time this was tried on a grand scale, the 1930s, and the havoc that followed. But such history is irrelevant, critics will object, since more recent cases, in places such as Canada and Ireland in the 1980s and eastern Europe more recently, show the opposite, that austerity leads to growth. But they don’t, actually, so it is worthwhile looking at them, too.

AUSTERITY NOW, INSANITY LATER

During the 1920s and 1930s, the United States, the United Kingdom, Germany, and Japan all tried to simultaneously cut their way to growth. This project didn’t just fail; it also helped cause World War II. The U.S. economy of the 1920s was a strange beast. Agricultural prices fell, unemployment crept up, and yet the stock market boomed. Then, in 1929, it went spectacularly bust, sending tax receipts collapsing and the deficit ballooning. At this juncture, fearing that the Americans would follow the British and also abandon the gold standard, investors sent their money rushing out of the country, causing interest rates to rise and worsening the contraction. In a classic example of austerity-speak, Hoover argued that the country could not “squander itself to prosperity on the ruin of its taxpayers,” and in 1931, he proceeded to simultaneously raise taxes and cut spending. Over the next two years, unemployment shot up, from eight percent to 23 percent, and the economy collapsed -- as did the United States’ ability to act as a destination for other states’ exports. The U.S. economy did not fully recover until massive wartime spending reduced unemployment to 1.2 percent in 1944.

The situation was hardly any prettier in the United Kingdom, which had exited World War I in much worse shape than the United States. In order to grow after the war, London should have devalued the pound, which would have made its products more competitive. But since the United Kingdom was the world’s largest financial power and the linchpin of the gold standard, even a hint of devaluation would have produced panic on the exchanges, prompted a run on the pound, and caused British assets overseas to lose significant worth. Caught in this position, the United Kingdom set a high exchange rate, hoping to inspire investors’ confidence, but this had the effect of destroying British exports and thwarting the postwar recovery. So the United Kingdom stagnated, with chronically high unemployment, throughout the 1920s.

Things only got worse for the British when the United States raised interest rates in 1929 to cool the Wall Street boom and when the Young Plan for the repayment of German reparations went into effect, in 1930. The Young Plan gave seniority to official debts over private ones, meaning that Germany’s reparations obligations would get paid first in the event of a bust. This mattered because a great deal of private American money had flowed into Germany under the earlier arrangement, which had guaranteed private claims ahead of official debts. When the Young Plan reversed that seniority, the resulting capital flight from Europe to the United States ensured that British interest rates would remain high and stagnation would continue.

Since the United Kingdom could neither inflate nor devalue its currency, deflation -- meaning austerity -- remained the economic policy of choice, even though it was self-defeating. Despite repeated rounds of spending cuts, and despite even going off the gold standard, British debt rose from 170 percent of GDP in 1930 to 190 percent in 1933. By 1938, in real terms, British economic output was only slightly higher than where it had been in 1918. In short, the world’s two largest economies tried to cut their way to prosperity at the same time, and it only compounded their difficulties. In Germany and Japan, it led to the rise of fascism.

Germany’s woes in this period are often laid at the feet of the hyperinflation of 1923, which has become the chief domestic economic bogeyman of the era, a nightmare never to be allowed to happen again. But what this view forgets is that the hyperinflation was less a Keynesian stimulus effort gone awry than a deliberate government policy to avoid making reparations payments to France. After France’s occupation of the Ruhr in 1923, the German government began paying local workers’ wages as an act of resistance, causing the deficit to spike. The German central bank, the Reichsbank, printed money to cover the deficit, which caused the mark’s value to collapse. This made reparations payments impossible, forcing a renegotiation of German debt. Soon afterward, however, the inflation was stabilized, and the country started to get back on its feet.

When the new debt repayment plan caused private U.S. money to flood out of Germany, the Reichsbank decided to raise interest rates to counter the flow, pushing the economy into a slump. At this juncture, the Center Party’s Heinrich Brüning took the chancellorship and tried to right the fiscal ship with draconian spending cuts. But the more the government slashed, the more the Nazis gained support. In the 1930 elections, the Nazis won 18.3 percent of the vote and became the second-largest party in the Reichstag. They were, after all, the only party arguing against austerity. By 1933, as the cuts continued, they took 43.9 percent of the vote. Austerity, not inflation, gave the world National Socialism.

The Japanese government applied austerity more consistently and with more vigor than it was applied anywhere else. Following a stock market bust in 1920, several rounds of spending cuts exacerbated an ongoing deflation. The largest item in Tokyo’s budget was military spending, which was almost halved over the next decade. Japan continued to cut spending in order to get back on the gold standard, which it did in 1930 -- just as the U.S. and European economies went into free fall, killing Japan’s exports. Japan’s output fell by 9.7 percent in 1930 and by a further 9.5 percent in 1931, while its interest rates shot up. Despite the collapse, Tokyo accelerated its spending cuts, with the military bearing the brunt. By late 1930, the military had had enough.

Following the October 1930 ratification of the London Naval Treaty, which placed limits on naval buildups, an ultranationalist group in Japan attempted to kill Prime Minister Osachi Hamaguchi (he ultimately died of his wounds). Later, in 1932, former Japanese Finance Minister Junnosuke Inoue, who had been the architect of the austerity policy throughout the 1920s, was assassinated. The finance minister of the new government, Takahashi Korekiyo, abandoned austerity, and the economy quickly began to turn around, growing at an average rate of four percent a year from 1932 to 1936. Proving that no good deed goes unpunished, however, Takahashi was himself assassinated in 1936, along with several other civilian political figures. That year, the civilian government collapsed, bringing Japan’s experiments with both democracy and austerity down with it. Japan’s imperial expansion was the result.

THAT '80s SHOW

When the world’s four largest economies all tried to cut their way to prosperity at the same time in the interwar years, the result was contraction, protectionism, violence, and fascism. Fine, some may say -- but different cases suggest different lessons. The experiences of Australia, Canada, Denmark, and Ireland in the 1980s are often held up as examples to argue for the existence of what economists call “expansionary fiscal consolidation,” when cuts supposedly lead to growth. Sadly, the facts disagree.

During the 1990s, various studies appeared to show that the fiscal consolidations that had taken place in the previous decade in Australia, Canada, Denmark, and Ireland had given the local economies a boost. All these countries cut their budgets, devalued their currencies, and controlled wage inflation, and their later growth rates were impressive. The purported mechanism behind the growth was consumers’ farsighted expectations, that is, the confidence effect: anticipating that public spending cuts now would mean lower taxes later, individuals spent their money, making these economies boom.

More recent scholarship, however, has called into question the methods of the earlier studies, the received wisdom about what actually happened in the countries in question, and the key lessons of the era. First, in each of these cases, a small state was cutting its public spending at the peak of a period of growth and when much larger trading partners were expanding. They were also discrete events, happening one country at a time, rather than simultaneous contractions.

Second, in all these cases, the main instruments of austerity were large currency devaluations and agreements with labor unions to control prices so that the devaluations’ effects were not eaten away by import inflation. That is, if a country is trying to get a boost from exporting with a cheaper currency, it does not want the cost advantage it has gained from the devaluation to be negated by wage increases, so it strikes a deal with the unions to stop that from happening. This, of course, is possible only in countries that have unionized large parts of their work forces. Given this, these cases hardly provide evidence that austerity-inspired confidence leads to growth.

Moreover, in the cases of Australia and Denmark, these so-called expansionary consolidations produced only a dead-cat bounce -- a brief but illusory resurgence. Both economies fell into severe recessions within two years of the cuts (so much for confidence). Meanwhile, in the Irish case, as the economist Stephen Kinsella has shown, real wages increased during the late 1980s, suggesting that a routine stimulus effect, not changes in consumer expectations, caused the boom. Canada, for its part, was able to cut and grow in the 1980s for the simple reason that its major trading partner, the United States, was undergoing a massive economic boom while the Canadian dollar was becoming almost 40 percent cheaper.

None of this has anything to do with expectations or confidence. Indeed, properly told, it is a Keynesian story of how devaluation and wage moderation can boost an economy when its partners are growing, and that boost, in turn, allows room for fiscal consolidation. As Keynes put it, “The boom, not the slump, is the right time for austerity.” Cuts in and of themselves do not lead to growth; they work only in small states that can export to big states that are growing. Just as countries that trade with one another cannot all simultaneously run surpluses, so interlinked economies cannot all devalue at once and expect to increase their exports.

WHY THE REBLL ALLIANCE CAN'T BLOW UP THE DEBT STAR

But wait, there’s more. Most recently, a batch of eastern European countries have been held up as role models by advocates of austerity: Romania, Estonia, Bulgaria, Latvia, and Lithuania -- call them the REBLL Alliance. They cut spending more than any other countries in Europe in 2009 and 2010 and grew faster than the rest in 2011 and 2012. Could this finally be the proof that spending cuts lead to growth? Not so fast.

The first question to ask is why there was so much cutting in the first place, and the answer is interesting. Back in the early years of this century, when these states were on the verge of becoming EU members, their bank assets looked extremely undervalued. The governments of these states, recovering from their communist pasts and eagerly embracing their capitalist futures, decided to build economic institutions that were extremely open to capital flows and friendly to foreign investment. The coming together of these two forces led to between 80 and 100 percent of the local banks being bought by foreigners. During the liquidity crunch of 2008–9, the new Austrian, German, and Swedish parent banks decided to find the extra cash they needed by taking money from these local eastern European branches. But this meant that the eastern European countries had to watch helplessly as their money supplies flowed away.

To stanch the bleeding, an agreement was signed in Vienna in 2009 between the banks; the EU, the International Monetary Fund, and the European Commission; and Hungary, Latvia, and Romania. It committed the banks in Europe’s core to keeping their funds in their eastern European banks if the eastern European countries’ governments committed to austerity policies designed to stabilize the local banks’ balance sheets. The Vienna agreement prevented the liquidity crunch from spreading to the rest of the REBLLs, so long as austerity was applied elsewhere in the region as well. The upshot of this agreement was that Latvian teachers and Romanian pensioners took massive income hits to guarantee the senior bondholders of European core-country banks.

But putting all this aside, has the cutting -- and there was lots of it -- been successful? In Latvia in 2009, consumption dropped by almost 23 percent and GDP fell by seven percent. In Estonia, both fell by almost 15 percent. Double-digit public-sector wage cuts became the norm across the REBLLs, causing havoc for public health, education, and social welfare programs. Yet the bounce back has been impressive, with these countries’ recovering between 60 and 80 percent of their losses from the contraction. Still, the game has not been worth the candle.

First, if the objective of austerity is to reduce debt, then all these countries, except Estonia, have failed: they have more debt today than when they started cutting. Indeed, Latvia, Lithuania, and Romania all ran much higher budget deficits at the peak of their austerity programs, in 2009–10, than did either Greece or Spain at the peak of theirs. Second, it will take until at least 2015, under the most optimistic of projections, for any of them to regain the ground lost since 2009, with the result that unemployment in these states will remain in double digits for the foreseeable future. And third, these countries did not experience any of the positive expectations or confidence that austerity is supposed to generate. According to a Eurobarometer poll, 79 percent of Latvians surveyed in 2009 thought that the economic situation in their country was bad. By 2011, when the Latvian growth rate was the highest in the EU, a full 91 percent of Latvians surveyed perceived the economic situation to be bad, and 58 percent said that the worst was yet to come. In short, as is the case with the expansionary consolidations of the 1980s, eastern Europe’s recent experience cannot be described as a victory for austerity. The REBLL Alliance has failed to blow up the Debt Star. In fact, it just made it bigger, and at enormous cost.

TOMORROW IS ANOTHER DAY

If austerity doesn’t work, what is the alternative? Spending sprees by already debt-ridden governments or a series of defaults in the developed world are hardly attractive options. But one does not have to be so ambitious. A simple rule would be Hippocratic: first, do no harm. The eurozone has consistently applied austerity, and it is now shrinking as a whole. The United States, by contrast, has not pursued austerity, and as a result it has cleaned up its balance sheets and is now able to grow. Yes, U.S. debt has increased, too, but growth eventually cures debt. Spending cuts, if they are simultaneous and large scale, will only add to the problem.

The relationship between spending cuts and debt is best captured by the economist Richard Koo’s idea of balance-sheet recessions. Countries cannot simultaneously shed their public and their private debts, which is what Europe has been attempting. Rather, governments should get the private sector to pay down its debts while maintaining public spending; after all, private-sector savings need to come from somewhere. Once that is done, as the private sector recovers, tax revenues will increase, and the accumulated debts and deficits can be paid down. As noted before, getting this right is a matter of composition and timing.
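The accounting behind Koo’s sequencing argument can be seen by rearranging the same sectoral-balances identity given earlier -- again an illustrative textbook relation, not Koo’s own notation:

\[ S - I = (G - T) + (X - M) \]

If the private sector is deleveraging, S - I is positive, and absent a large trade surplus, the government balance must be in deficit to absorb that saving. Cut G at the same time, and the identity balances instead through falling income and output. Hence the prescription: let public borrowing carry the economy while private balance sheets are repaired, and consolidate afterward.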

The United States, meanwhile, should take advantage of the fact that it is not saddled by the kind of institutional flaws that exist in the eurozone and that it can borrow for virtually nothing. Now is a good time for Washington to make useful investments. To take just one example, around a third of the bridges in the United States are badly in need of repair. Fixing them would enhance U.S. productivity and carry no downside. In this sense, austerity is not just wrong because of the problems of distribution, composition, and logic described above; it also carries a dangerous opportunity cost. If the United States were to proceed down the path to austerity, roads would go unrepaired, students would miss out on gaining knowledge, and the skills of the unemployed would atrophy. The country’s position relative to any country that did not do this would worsen. The United States would end up poorer and more debt-ridden than before, and, what is most problematic, it would lack the capacities needed to generate future growth.

Those of a Schumpeterian persuasion might counter that even if government spending were cost-free, the true source of growth would still be private innovation. But as the venture capitalist William Janeway explains in his recent book, Doing Capitalism in the Innovation Economy, what makes what Schumpeter termed “creative destruction” possible is what Janeway calls “Keynesian waste.” The U.S. aerospace industry could never have been born without massive government defense spending; recent innovations in biotechnology owe their existence to the National Institutes of Health; even the Internet was a byproduct of government research. The raw material for innovation and growth often comes from government, not private, spending.

If the United States adopted austerity, the inability of the government to generate Keynesian waste would undermine the country’s ability to grow. Cut away the state, especially at a moment when other countries around the world are busy slashing their way to prosperity, and Americans will end up much worse off than they could ever have imagined. But don’t take it on faith. Just ask the Europeans how it has been working out for them.
