ALONE AT THE TOP
It is a truth universally acknowledged that the central feature of the world at the outset of the twenty-first century is the enormous power of the United States. This country possesses the most formidable military forces and the largest and most vibrant national economy on the planet. From within its borders emanate the social and cultural trends that exercise the greatest influence on other societies. In the league standings of global power, the United States occupies first place -- and by a margin so large that it recalls the preponderance of the Roman Empire of antiquity. So vast is American superiority that the distinction bestowed upon it and its great rival, the Soviet Union, during the Cold War no longer applies. The United States is no longer a mere superpower; it has ascended to the status of "hyperpower."
The fact of American supremacy tends to polarize opinion. For those who deem such supremacy desirable, the great question of twenty-first century international politics is how to perpetuate it. On the other hand, those who regard U.S. power as unwelcome seek to discover how it can be curtailed. The undoubted fact of American supremacy, however, raises a prior question: For what purpose is all this power to be used? The proper answer to that question puts American power in a different light, and that answer derives from the singular and unprecedented character of the world in which we now live.
The contemporary world is dominated by three major ideas: peace as the preferred basis for relations among countries, democracy as the optimal way to organize political life within them, and the free market as the indispensable vehicle for producing wealth. Peace, democracy, and free markets are the ideas that conquered the world. They are not, of course, universally practiced, and not all sovereign states accept each of them. But for the first time since they were introduced -- at the outset of the period that began with the French and Industrial Revolutions and is known as the modern era -- they have no serious, fully articulated rivals as principles for organizing the world's military relations, politics, and economics. They have become the world's orthodoxy. The traditional ideas with which they contended in the nineteenth century and the illiberal ideas, embodied by the fascist and communist powers, with which they did battle in the twentieth have all been vanquished.
From these new circumstances follow the central purpose of the United States in the twenty-first century and the principal use for American power: to defend, maintain, and expand peace, democracy, and free markets. Achieving this goal, however, involves two separate tasks, and for these American power, great though it is, is not necessarily sufficient.
The first task is to sustain the international institutions and practices, concerning both security and economics, within which these three ideas can flourish. The second is to strengthen peaceful foreign policies, democratic politics, and free markets where they are not securely rooted -- above all, in Russia and China -- and install them where they do not exist at all, notably in the Arab world. For the first of these two tasks American power may prove in practice to be inadequate, and for the second that enormous power is hardly relevant.
MILITARY MISSIONS
The United States today maintains a military presence that helps keep the peace in the two regions in which a major war could be fought: Europe and East Asia. These deployments carry over from the Cold War, when their mission was to deter the Soviet Union. But even then they had a second purpose.
In Europe, the North Atlantic Treaty Organization was an instrument of "dual containment." While keeping the Soviet Union at arm's length, NATO also locked Germany into a restraining embrace, and for the same purpose: to prevent either from overturning the existing political order on the European continent. The Germans came to accept and appreciate the arrangement. It relieved them of the burden of defending themselves while at the same time dissipating the cloud of suspicion that would otherwise have enveloped them. Within the Atlantic alliance, therefore, the United States functioned as a buffer among parties with no cause for conflict but with historical reasons for mistrust. The American presence reassured each NATO member that the others harbored no aggressive intentions.
The same was true in East Asia, where the presence of American forces, and the Japanese-American security treaty, insured against a resumption of Japan's disastrous policies of the 1930s. The noncommunist Asian countries were happy to see Japan, their erstwhile conqueror, safely tucked under the wing of the United States.
With the end of the Cold War, the American presence in Europe has come to reassure all Europeans. It reassures Germany that it will not be left alone to face a potential threat, while at the same time reassuring other countries about Germany. It also reassures Western Europeans that if Russia reverts to a menacing foreign policy the United States will once again help to keep it in check, as it did during the Cold War. The American presence is especially important in reassuring Russians that their experience with German armies in the first half of the twentieth century will not be repeated, which is one reason that Soviet authorities permitted a reunified Germany to remain within the Western alliance. Similarly, in East Asia the perpetuation of U.S.-Japan defense cooperation has continued to reassure other countries, most notably China, that Japan will not conduct an independent security policy, while simultaneously reassuring the Japanese that they will not be required to do so.
The United States is not necessarily destined to sustain these roles indefinitely, however. With the Cold War over, the deployments that underwrite them have become vulnerable to the argument that the time has come for America's allies to assume responsibility for their own security, including its costs. To the extent that these allies are threatened, after all, they can defend themselves without the American military. From this point of view, the post-Cold War American presence in Europe and East Asia is an exercise in unremunerative babysitting of people who are fully grown.
One seldom-mentioned reason for maintaining an American military presence in Europe and East Asia, both during and after the Cold War, has been to keep America's allies -- above all Germany and Japan -- from feeling the need to acquire nuclear weapons. NATO and the Japanese-American security treaty have functioned as instruments of nonproliferation. And it remains an important American international aim to prevent the spread of nuclear weapons in regions beyond Europe and East Asia as well, and especially to keep them out of the hands of regimes hostile to the United States, such as Iraq and Iran.
During the Cold War, the United States negotiated with the Soviet Union a charter to prevent the spread of nuclear weapons: the Nuclear Nonproliferation Treaty. Then and now, Americans have paid to maintain this nonproliferation system through their country's vast intelligence-gathering apparatus, which increasingly trains its eyes and ears on countries suspected of harboring dangerous nuclear ambitions. It was the United States that sounded the alarm about the North Korean nuclear weapons program in the 1990s and kept track of the nuclear progress of Saddam Hussein's Iraq. Insofar as the nonproliferation system has teeth, moreover, these too have been supplied mainly by the United States. In the war against Iraq in 1991, the American-led coalition forces took the opportunity to inflict severe damage on Baghdad's nuclear weapons program. And the United States was prepared to go to war with North Korea in 1994 over its nuclear ambitions.
As with its security presence in Europe and East Asia, through its campaign against nuclear proliferation the United States helps to sustain the conditions in which peace, democracy, and free markets can flourish. But this policy, too, is subject to domestic criticism. Americans might come to believe that the costs of preventing nuclear proliferation should be borne by those who would be more directly affected by a failure than they. After all, Iraqi or Iranian nuclear weapons would pose a far greater direct threat to the countries of the Middle East -- of which only Israel has the means to deter a nuclear attack -- and to the Europeans, who are much closer to the Middle East, than they would to the United States.
The United States plays yet a third post-Cold War military role that helps to sustain conditions favorable to the three dominant ideas of the twenty-first century: it guarantees the flow of oil from the Persian Gulf, where it is most plentiful and cheapest to extract, to oil-consuming countries around the world. It was for this purpose that the United States went to war in 1991. Had Saddam Hussein's invasion and occupation of Kuwait been allowed to stand, he would have controlled directly the considerable oil reserves of both Iraq and Kuwait and indirectly, through intimidation, the even larger petroleum deposits of neighboring Saudi Arabia.
Here, too, established American policy is not immune to domestic skepticism. Americans may wonder why their allies in the Persian Gulf do not do more to defend themselves. They might equally wonder why the countries of Western Europe and Japan, which depend more heavily on oil from the Gulf, do not bear a share of the burden of safeguarding that oil proportional to their interest in doing so.
The potential political fragility of these three American policies, each of which brings wide benefits to the international system as a whole, stems from the fact that each is an instance of an important feature of international politics, the common term for which comes from economics. Each is an example of a "public good."
INTERNATIONAL PUBLIC GOODS
A public good is something the benefits of which no potential consumer can be prevented from enjoying. Such goods are difficult to obtain because, for that very reason, no consumer has an incentive to pay for them. National defense and clean air and water are three examples. The mechanism that makes it possible to obtain them is government -- the state -- which, with its monopoly on the legitimate use of force, can compel people to pay for them. If there is no conductor to collect fares on a bus, everyone will ride free. But in that case there will be no funds to pay for bus service and ultimately no one will be able to ride at all. Economists have termed this inherent difficulty in providing public goods the "free rider" problem, a problem to which government is the solution.
Peace in Europe and East Asia, nuclear nonproliferation, and access to Persian Gulf oil are all public goods, but of a different sort: they are international public goods. Their "consumers" are not individuals but sovereign states, over which there is no supreme authority. Because no world government exists, international public goods are difficult -- but not impossible -- to provide.
Public goods tend to be provided even without government when power and wealth are unevenly distributed within the relevant group -- optimally, when one member far surpasses all the others in both. In such cases, the powerful and the wealthy are more likely to pay the costs themselves. This is particularly likely when providing the public good in question is believed to be a matter of urgency.
Both conditions were fulfilled during the Cold War. The United States towered over all others in wealth and military might and had an urgent reason to pay the costs of international public goods: the Cold War itself. Americans considered sustaining military alliances in Europe and East Asia to be necessary for their own security, even if that meant assuming a disproportionate share of the costs of doing so.
Now, however, the Cold War is over. In its wake American political leaders have frequently invoked the need for the United States to continue to exercise global leadership. The essence of such leadership is paying a disproportionate share of the costs of international public goods. But not only is leading the world less urgent than it once was; it has also ceased to seem a heroic enterprise. Leadership involves not so much marching gloriously at the head of the parade as paying quietly for the parade permit and the cleanup afterward. Leading the world means acting not as its commander in chief but as its concierge. Thus, where maintaining the framework of security within which peace -- and democracy and free markets as well -- can flourish is concerned, the question that American power raises is whether the United States will mobilize enough of it to continue, in different circumstances, the role it played during the Cold War.
Favoring continuity is the fact that the price of that role has decreased. It is no longer necessary to gird for a global conflict against a powerful adversary. The terrorist attacks on New York City and Washington of September 11, 2001, do not alter this calculus. Moreover, those attacks have partially re-created the Cold War basis for American military operations abroad. Also favoring continuity are habits ingrained over the second half of the twentieth century. Americans have become accustomed to the duties and burdens as well as the prerogatives of international leadership. Or rather, some Americans are accustomed to them, for the habits are concentrated in one sector of American society.
In the United States, as in other countries, foreign policy is the preoccupation of only a small part of the population. But carrying out any American foreign policy requires the support of the wider public. Whereas for the foreign policy elite the need for American leadership in the world is a matter of settled conviction, in the general public the commitment to global leadership is weaker. This is not surprising. That commitment depends on views about the effects of American leadership on the rest of the world and about the likely consequences of its absence. These are views for which most Americans, like most people in most countries, lack the relevant information because they are not ordinarily interested enough to gather it.
The politics of American foreign policy thus resembles a firm in which the management -- the foreign policy elite -- has to persuade the shareholders -- the public -- to authorize expenditures. This was true as well during the Cold War. But in its wake, the willingness of the public to authorize such spending requests has diminished, even though the expenditures required are smaller than in the past. And if there are reasons to expect the public to be forthcoming with the support needed for American global leadership, there are also post-Cold War currents pushing in the other direction.
The chief obstacle to an expansive American international role stems from what the country has in common with others. The United States has the same incentive to be a free rider as all other countries. The circumstances that suppressed this incentive during the second half of the twentieth century were exceptional. To the extent that Americans are reluctant to pay for international public goods, they are no different from any other people. A disinclination to pay also arises from Americans' fundamental political principles, which place a higher value on individual wishes than on collective aspirations, and greater emphasis on domestic goals than on international objectives. The Declaration of Independence, the founding document of the American republic, asserts the right to "life, liberty, and the pursuit of happiness" -- presumably individual happiness -- not the right to earn military glory or provide foreign tutelage or international stability.
International leadership imposes a kind of tax on the people of the United States, and the normal attitude toward taxation is expressed in a ditty attributed to the former chairman of the Senate Finance Committee, Russell Long (D-La.):
Don't tax you
Don't tax me
Tax that man behind the tree.
Although the terrorist attacks on New York and Washington gave the American people new and potent reasons for involving themselves heavily with the rest of the world, it is nonetheless conceivable that they will tire of being the man behind the tree, especially if the threat of terrorism fades. And the war against terrorism, whatever its scope and duration, provides little incentive to sustain the other frameworks crucial for peace, democracy, and free markets around the world: those involving the international economy.
TRADE AND MONEY
Like the American military presence in Europe and East Asia, the international economic institutions and practices for which the United States provides crucial support originated in the middle of the twentieth century. In the wake of World War II, Washington took the lead in constructing two global arteries, one for trade and the other for the international circulation of money. Each was successfully sustained, while being modified and expanded, during the Cold War. And the trade and monetary systems became, if anything, more important with the end of that conflict.
The outcome of the Cold War represented, in part, a triumph for the free market, for the conflict was, among other things, a contest of economic systems. And in that contest the Western free-market system proved decisively superior to its competitor, communist central planning. In the wake of the Cold War, although not every country conducted a peaceful foreign policy and many continued to reject democratic politics, virtually all of the world's almost 200 sovereign states sought to construct or to maintain a market economy. All saw the free market as the path to what had become, in the twenty-first century, a supreme and undisputed national goal: the creation of wealth.
The construction of a working market economy depends chiefly on the efforts of the individual societies and governments involved, but also on the vitality of the international economic order. Even as a market economy has come to be seen as the key to prosperity, participation in international markets for goods and capital has come to be regarded as part and parcel of that system. The world trading system and international capital markets have come to be seen by most countries of the world as a kind of global utility that, if they can tap into it, can power their own economies.
Maintaining this global utility is in no small part an American responsibility, and here the record of the United States in the first post-Cold War decade, and thus the portents for the future, were mixed. The most important American contribution to the trading system after World War II was to provide the largest and most open market for other countries' exports, and this continued to be the case after the fall of the Soviet Union. Moreover, free trade's champions won important political battles in the U.S. Congress, which approved not only the treaty that emerged from the Uruguay Round of global trade negotiations but also the North American Free Trade Agreement (NAFTA) and, in 2000, permanent "normal" trading status for China.
The opposition to each of these measures, however, was formidable. Such opposition, furthermore, was a telling indicator of popular disenchantment with free trade, since NAFTA and normal trade status for China were clearly advantageous for the United States in that the American market was far more open than Mexico's or China's. Political resistance to free trade is in fact inevitable, especially in a democracy. The total gains from trade are invariably greater than the total losses, as economists since David Ricardo have demonstrated, and the winners ordinarily outnumber the losers. But for each of the many winners the benefits are only modest: a slightly lower price for an imported good -- a shirt, for example -- than what one made domestically would cost. The gains are thus diffuse. The losses, by contrast, are concentrated. Only a few domestic shirt makers lose their jobs when imported shirts are cheaper, but each of them thereby loses a great deal more than any one of the benefiting consumers gains. The losers, furthermore, are acutely aware of what they have lost, whereas the winners are generally oblivious to what they have gained. As a political issue, therefore, free trade characteristically pits one side for which the stakes are very high against another that is scarcely aware that a contest is even under way.
In overcoming this resistance, the Cold War provided Washington with powerful leverage. Americans saw free trade as a bulwark against communism because it strengthened America's allies, which were also its chief trading partners, and drew them closer to the United States. The requirements of the Cold War therefore lubricated the international economic machinery established after World War II. In the 1990s, however, that lubricant ceased to be available. Its absence was evident in American trade policies. Whereas after 1945 the United States practiced free trade even when others did not, in the wake of the Cold War Washington sometimes flouted the very rules it had been instrumental in establishing. The Clinton administration tried to compel Japan to agree to numerical targets for Japanese purchases of products made in the United States, and the current Bush administration raised tariff barriers to foreign-made steel.
The prospects for global free trade in the twenty-first century thus rest on the same uncertainty as the prospects for peace in Europe and East Asia, nuclear nonproliferation, and the free flow of Persian Gulf oil: whether the United States will continue to make large contributions to the provision of international public goods.
With the international monetary system, the American record in the first post-Cold War decade was similarly mixed, and the long-term prospects for American leadership are also uncertain. At Bretton Woods, New Hampshire, in 1944, the United States, in partnership with the United Kingdom, devised a set of global monetary arrangements that featured fixed exchange rates and assigned the U.S. dollar to perform some of the roles previously played by gold. The Bretton Woods system ended in 1971, but the United States continued to play the leading role in international monetary affairs. The dollar has continued to serve as a reserve currency, the equivalent of gold, as well as the vehicle for international transactions. And in the post-Cold War period American leadership in international monetary affairs took the form of spearheading the organization, through the International Monetary Fund, of emergency loans to rescue distressed economies in East Asia. The American government feared that without a rescue effort local currency collapses would infect the entire global monetary order.
These rescue efforts proved controversial in the United States, however. Some economists found fault with the economic policies demanded of the Asian countries in return for the loans, especially the raising of interest rates to retain or re-attract foreign capital, which, these economists charged, had unnecessarily deepened the recessions into which the afflicted countries had fallen. Others called into question the very idea of offering such loans, no matter what the terms, on the grounds that they aggravated "moral hazard," the danger that rescuing investors from the consequences of bad investments will simply encourage imprudent investing. The absence of a consensus on the desirability of currency rescues casts doubt on the prospects for continuing the American role in organizing them.
The columnist Walter Lippmann once identified the central problem of foreign policy as "solvency." Where American support for the international frameworks favorable to peace, democracy, and free markets is concerned, however, the problem is rather one of "liquidity." The United States has the resources to do virtually anything -- although not to do everything. What is potentially scarce is the political will, in the form of domestic political support, to use those resources. And for the actual adoption of the institutions and practices that embody these three ideas by countries where they are not already firmly rooted, the direct use of American power is even more problematic.
THE USELESSNESS OF POWER
There is reason for optimism that, in the course of the twenty-first century, peace, democracy, and free markets will become more widely distributed and firmly rooted around the world. Virtually every country pays at least rhetorical tribute to all three, and many are actually trying to adopt them, particularly free markets. In fact, these efforts dominate the global agenda today. But the direct use of American power contributes little, if anything, to the process.
The struggle to achieve peace, democracy, and free markets involves the creation of a particular kind of state, which is the key to putting those ideas into practice. It is a state strong enough to protect property and liberty but not powerful, ambitious, or intrusive enough to suppress them. It is a state able and willing to enforce the law but not disposed to violate it. Such a state is likely to conduct peaceful foreign policies.
The construction of such a state, however, is a task for the countries themselves: others can do little to help. This was not always so. Historically, state institutions were implanted by an occupier. Thus did Roman practices spread throughout Europe in ancient times, and British institutions around the world in the modern age. But the pattern is highly unlikely to be repeated in the post-Cold War era, and will certainly not be repeated where the creation of this kind of state is most important: in Russia and China.
To be sure, the example set by the United States and the countries of Western Europe and Japan carries unprecedented weight. Like individuals, collectives change by observing and learning, and as political and economic models for the rest of the world the liberal democracies have no competition. But to this process of state-building, American foreign policy has no direct contribution to make, for its tools -- guns, money, and words promising or suggesting the use of either or both -- are not effective. A version of a venerable American joke illustrates the point: How many psychiatrists does it take to change a light bulb? None -- the light bulb must want to change itself. In inducing the wish to establish peace, democracy, and free markets and the kind of state that makes them possible, the American and Western example exercises a powerful effect. But in bringing these ends about, American foreign policy is of little use.
The vocabulary of post-Cold War American foreign policy reflects the gap between what the United States seeks in and from other countries and what it has the power to accomplish. Such language is sprinkled with a particular kind of verb, which denotes the earnest intention to act without, however, conveying any particular action. Americans proclaim themselves committed to "promoting," "fostering," "encouraging," and "facilitating" democracy and free markets. These fine-sounding words tend to dissolve on contact with reality; they mean nothing in particular.
Even when aspirant societies wish to equip themselves with the appropriate kind of state, success is not assured. The capacity to develop and operate one depends heavily on whether a society's dominant values equip it for the task. In many cases they do not. So an important step in building a market economy and in making a commitment to peace and democracy is the acquisition of the appropriate cultural underpinnings.
Acquiring such a culture cannot happen overnight. The pertinent unit of time is the generation, for a shift in a society's prevailing values requires that people whose formative experiences have equipped them with one set of norms be replaced by others who have embraced, at later times and in other circumstances, different values. So although cultures do change and have always changed, and although they have changed more rapidly and broadly in the modern period than ever before, they cannot readily be changed by acts of official policy, not even by the foreign policy of the most powerful country on earth.
Thus for the diffusion of peace, democracy, and free markets -- the supreme American international goal in the twenty-first century -- the United States finds itself today in a position similar to that of Nathan Rothschild more than 150 years ago. The richest man in the world in the early decades of the nineteenth century, Rothschild died in 1837 of an infection that, a century later, the poorest Englishman could easily have had cured with readily available antibiotics. All of Rothschild's wealth could not give him what had not yet been invented, and all of the vast military and economic might of the United States cannot secure what lies beyond the power of guns to compel and money to buy.