
Almost seven years after the Great Recession officially ended, the U.S. economy continues to grow at a sluggish rate. Real wages are stagnant. The real median wage earned by men in the United States is lower today than it was in 1969. Median household income, adjusted for inflation, is lower now than it was in 1999 and has barely risen in the past several years. Meanwhile, the U.S. Federal Reserve Board and the Congressional Budget Office have taken more seriously the idea that U.S. productivity, one of the most important sources of economic growth, may stay low. And such problems are hardly unique to the United States. Indeed, productivity growth has been slow in most of the developed world for some time.

In the medium to long term, even small changes in growth rates have significant consequences for living standards. An economy that grows at one percent doubles its average income approximately every 70 years, whereas an economy that grows at three percent doubles its average income about every 23 years—which, over time, makes a big difference in people’s lives.
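To make the arithmetic behind those doubling times explicit (a standard rule-of-thumb calculation, my gloss rather than a figure from Gordon's book): an economy growing at a constant annual rate g doubles its income after T = ln(2) / ln(1 + g) years, which works out to about 70 years when g is one percent and about 23 years when g is three percent.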

Some experts, such as the MIT economists Erik Brynjolfsson and Andrew McAfee, think that the current slowdown is a temporary blip and that exponential improvements in digital technologies are transforming the world’s economies for the better; others are more pessimistic. Chief among the doomsayers is Robert Gordon, a professor of economics at Northwestern University. His latest entry into this debate, The Rise and Fall of American Growth, is likely to be the most interesting and important economics book of the year. It provides a splendid analytic take on the potency of past economic growth, which transformed the world from the end of the nineteenth century onward. Gordon thinks Americans are unlikely to witness comparable advances again and forecasts stagnant productivity for the United States for the foreseeable future.

Yet predicting future productivity rates is always difficult; at any moment, new technologies could transform the U.S. economy, upending old forecasts. Even scholars as accomplished as Gordon have limited foresight.

THE GOLDEN AGE

In the first part of his new book, Gordon argues that the period from 1870 to 1970 was a “special century,” when the foundations of the modern world were laid. Electricity, flush toilets, central heating, cars, planes, radio, vaccines, clean water, antibiotics, and much, much more transformed living and working conditions in the United States and much of the West. No other 100-year period in world history has brought comparable progress. A person’s chance of finishing high school soared from six percent in 1900 to almost 70 percent by the end of that period, and many Americans left their farms and moved to increasingly comfortable cities and suburbs. Electric light illuminated dark homes. Running water eliminated waterborne diseases. Modern conveniences allowed most people in the United States to abandon hard physical labor for good.

In highlighting the specialness of these years, Gordon challenges the standard view, held by many economists, that the U.S. economy should grow by around 2.2 percent every year, at least once the ups and downs of the business cycle are taken into account. And Gordon’s history also shows that not all GDP gains are created equal. Some sources of growth, such as antibiotics, vaccines, and clean water, transform society far beyond their share of GDP; others, such as many of the luxury goods developed since the 1980s, do not. GDP calculations do not always reflect such differences. Gordon’s analysis here is mostly correct, extremely important, and at times brilliant—the book is worth buying and reading for this part alone.

A Ford Model T, the first affordable automobile, in Salt Lake City, 1910. (Wikimedia Commons)

Gordon goes on to argue that today’s technological advances, impressive as they may be, don’t really compare to the ones that transformed the U.S. economy in his “special century.” Although computers and the Internet have led to some significant breakthroughs, such as allowing almost instantaneous communication over great distances, most new technologies today generate only marginal improvements in well-being. The car, for instance, represented a big advance over the horse, but recent automotive improvements have provided diminishing returns. Today’s cars are safer, suffer fewer flat tires, and have better sound systems, but those are marginal, rather than fundamental, changes. That shift—from significant transformations to minor advances—is reflected in today’s lower rates of productivity.

Consider the history of aviation. Gordon notes that a Boeing 707 flight from Los Angeles to New York took 4.8 hours in 1958, somewhat less time than the same flight takes today. In fact, since the widespread adoption of the Boeing 707, door-to-door air travel times have increased, thanks to the contemporary hassles of navigating airports and their security lines. Airplanes have become much safer, but the aviation sector has been surprisingly slow to make other major technological changes. Indeed, the DC-3, a highly practical, all-purpose small plane dating from the 1930s, remains in use today, even in the United States.

Gordon also explores how pensions and other workplace benefits have eroded since the 1970s. For instance, the percentage of workers on a defined-benefit pension plan fell from 30 percent in 1983 to 15 percent in 2013. Gordon’s treatment of this topic is a useful rebuttal to the common claim that wage stagnation is an illusion because unmeasured benefits on the job have improved so much. The truth is that fewer workers as a percentage of the labor force now receive significant benefits from their employers.

FALSE PROPHET?

Gordon’s analysis is fascinating, but he isn’t quite able to make his startling revisionist thesis work at book length—especially a book that runs to more than 750 pages. He covers a wide range of potentially interesting topics, but few of them receive much depth or cohere into a useful narrative. He discusses the Great Chicago Fire of 1871 and the San Francisco earthquake of 1906; compares automobile and maritime insurance; explains why the Homestead Act of 1862 and the similar legislation that followed it in the nineteenth and early twentieth centuries, which opened up millions of acres to settlers at little or no cost, proved politically controversial; and details the role of Philo Farnsworth of Rigby, Idaho, in developing the television set. These are all perfectly interesting set pieces, but they add little to his argument. The book could have been at least a hundred pages shorter, with no loss and some gain.

But the biggest problem with Gordon’s book is his belief that he can forecast future economic and productivity growth rates: specifically, he predicts that both will remain low in the United States. He cites the mediocre American educational system, rising income inequality, government debt, and low levels of population growth, among other factors, as headwinds buffeting the U.S. economy. But although these are very real problems, there are other, more positive factors at play in the U.S. economy that Gordon is too quick to dismiss and that make predicting the future of economic and productivity growth a very difficult business.

From 1870 to 1970, the foundations of the modern world were laid.

Gordon brushes off such complexities and offers a sustained defense of growth forecasts: he assures readers that the French author Jules Verne made some pretty good predictions back in 1863 and that a December 1900 article in Ladies’ Home Journal foresaw some important aspects of the modern world, such as air conditioning and cheap automobiles. But Gordon doesn’t mention his own record as a forecaster, which is decidedly mixed. In 2000, he argued that the productivity innovations of the time didn’t measure up to the gains of the past, and the same year, he published another paper arguing that the productivity benefits of computers were not as high as many people were asserting. So far, so good.

What Gordon neglects to mention, however, is that he is also the author of a 2003 Brookings essay titled “Exploding Productivity Growth,” in which he optimistically predicted that productivity in the United States would grow by 2.2 to 2.8 percent for the next two decades, most likely averaging 2.5 percent a year; he even suggested that a three percent rate was possible. Yet 2004, just after the essay was published, was toward the tail end of the period of high productivity growth that had started in the 1990s, and since then, this number has tended to be closer to one percent. These days, Gordon is offering forecasts of not much more than one percent for labor productivity growth and below one percent for median income growth; in essence, he is chasing the trends he has observed most recently.

In the preface to his book, Gordon offers a brief history of the evolution of his views on productivity. Yet he does not mention the 2003 essay, nor does he explain why he has changed his mind so dramatically. He also fails to cite other proponents of the stagnation thesis, even though most of their work predates his book. These precursors include the economist Michael Mandel, the Silicon Valley entrepreneur Peter Thiel, and me. Mandel and I are relatively optimistic about the technological future of the United States, but we, along with most informed participants in these debates, are skeptical about our ability to forecast rates of economic and productivity growth many years into the future or, for that matter, even a few years ahead.

A GLASS HALF FULL

Ultimately, Gordon’s argument for why productivity won’t grow quickly in the future is simply that he can’t think of what might create those gains. Yet it seems obvious that no single individual, not even the most talented entrepreneur, can predict much of the future in this way.

A prototype of Google's self-driving vehicle on display in Mountain View, California, September 2015. (Elijah Nouvelage / Reuters)

Consider just a few technological breakthroughs we could witness in the coming years, only a small number of which Gordon even mentions: significant new ways to treat mental health, such as better antidepressants; strong and effective but nonaddictive painkillers; artificial intelligence and smart software that could eliminate many of the most boring, repetitive jobs; genetic engineering; and the use of modified smartphones for medical monitoring and diagnosis. I can’t predict when such breakthroughs will actually happen. But it seems there is a good chance we’ll live to see some or maybe all of them materialize, and they could prove to be major advances. And although Gordon focuses on the demographic challenges the United States faces, he never considers that today, thanks to greater political and economic freedom all over the world, more individual geniuses have the potential to contribute to global innovation than ever before.

No single individual, not even the most talented entrepreneur, can predict much of the future.

It’s also worth remembering that many past advances came as complete surprises. Although the advents of automobiles, spaceships, and robots were widely anticipated, few foretold the arrival of x-rays, radio, lasers, superconductors, nuclear energy, quantum mechanics, or transistors. No one knows what the transistor of the future will be, but we should be careful not to infer too much from our own limited imaginations.

Even during Gordon’s special century of 1870–1970, progress was not evenly distributed. There were pauses, such as much of the 1920s and 1930s, between some especially fruitful periods. Some pauses in advancement today should therefore not be alarming. Gordon himself admits that information technology was producing some truly significant advances as recently as the late 1990s and the very early years of this century.

Given that economic growth and technological progress are uneven, there may well be bumps on the road when it comes to using computers to significantly improve human well-being. Surveying the array of human talent in Silicon Valley, the advances that have taken place to date, and the possible potential uses for new items such as smartphones, it is difficult to accept Gordon’s assertion that information technology has run its course. It seems much more likely that significant growth still lies ahead.

Gordon’s book serves as a powerful reminder that the U.S. economy really has gone through a protracted slowdown and that this decline has been caused by the stagnation in technological progress. But perhaps the book’s greatest contribution to the debate over the world’s economic future is that it unintentionally demonstrates the weakness of the case for pessimism.
