The truth is out there: a simulation of black holes, released at a conference in February 2016.

In February 2016, scientists from the Massachusetts Institute of Technology (MIT) and the California Institute of Technology (Caltech) joined with the National Science Foundation (NSF) to share some remarkable news: two black holes 1.3 billion light-years away had collided, and the resulting gravitational waves had been “heard” by the twin detectors of the Laser Interferometer Gravitational-Wave Observatory (LIGO). This was the first time such waves—ripples in the space-time continuum caused by the violent acceleration of massive objects—had ever been directly observed. Albert Einstein had predicted such waves a century earlier, but it was long doubted that instruments sensitive enough to confirm their existence could ever be built. It took more than four decades of work by a vast team of scientists to make the impossible possible.

LIGO has revealed thrilling new insights into the cosmos—but it has given the world some gifts of immediate practical value as well, which help illustrate the benefits of such investments in basic science. Over the years, the LIGO project has provided a crucial training ground for thousands of top young scientists and engineers, developing talent that has energized not only American universities but also American businesses. Because LIGO researchers had to measure displacements of mirrors one-10,000th the size of a proton, they were required to invent an array of breathtakingly precise new tools, including ultrastable high-powered lasers, ultrasmooth mirrors mounted on ultraquiet vibration-isolation platforms, the world’s largest ultrahigh-vacuum system, and software algorithms for extracting tiny signals from noisy data. Some of these technologies are already beginning to be used in commercial manufacturing. And if history is any guide, LIGO will lead to important innovations far down the road—just as 1940s experiments with nuclear magnetic resonance led to the MRI scanner, a 1950s effort to create clocks to measure how gravity warps time made possible GPS, and research in the 1960s and 1970s gave the world the Internet. 
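To make that last point concrete: one workhorse technique behind LIGO’s signal extraction is matched filtering, in which noisy detector output is correlated against theoretical waveform templates. The sketch below is a minimal, illustrative version of that idea only; the chirp parameters, noise level, and sampling rate are all invented for the demonstration, and the real LIGO analysis pipeline is vastly more sophisticated.

```python
import numpy as np

# A toy matched filter, illustrating how a weak signal can be pulled
# out of much louder noise. This is NOT LIGO's actual pipeline; the
# chirp parameters, noise level, and sampling rate are invented.

fs = 4096                       # sample rate in Hz (assumed)
dur = 2.0                       # template length in seconds
t = np.arange(0, dur, 1 / fs)

# Template: a sinusoid sweeping from 35 Hz to 250 Hz, loosely mimicking
# the rising "chirp" of a binary black-hole merger.
f0, f1 = 35.0, 250.0
template = np.sin(2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * dur)))

# Eight seconds of Gaussian noise, ten times the signal's amplitude,
# with the template buried in it starting at t = 3 s.
rng = np.random.default_rng(0)
data = 10.0 * rng.standard_normal(int(8 * fs))
start = int(3.0 * fs)
data[start : start + template.size] += template

# Slide the template along the data; the normalized correlation peaks
# at the offset where the hidden signal begins.
corr = np.correlate(data, template, mode="valid")
corr /= np.sqrt(np.sum(template**2))
peak = int(np.argmax(np.abs(corr)))
print(f"strongest match at t = {peak / fs:.3f} s (injected at 3.000 s)")
```

Run on this synthetic data, the correlation peak lands at the injection time even though the noise is ten times louder than the signal. The same principle, scaled up enormously, is how a chirp buried deep in detector noise can be found at all.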

LIGO, in short, is extraordinary. But it is also typical, because it highlights the system the United States relies on to achieve great scientific discoveries: public support for university-based research, with large investments of time, cash, and patience. This support flows through federal agencies such as the NSF, the National Institutes of Health, and the Defense and Energy Departments. In the case of LIGO, its observatories were funded by the NSF and designed, constructed, and run by its university partners, with more than $1.1 billion spent over 40 years. 

Since World War II, the U.S. government has been the world’s biggest supporter of potentially transformative science—which is a key reason why the country continues to have the highest share of knowledge- and technology-intensive industries in the world, amounting to nearly 40 percent of the economy. It often takes decades for fundamental research to yield practical applications, and those applications can be unpredictable (such as the cyclotrons devised for experiments in particle physics in the 1930s being put to use in cancer treatments now). Yet it is out of such attempts to expand human knowledge that powerful new businesses grow, with technology titans such as Apple and Google building world-class companies on the backs of technologies emerging from federal investments in research. 

By now, one successful way to cultivate economic growth in the United States is clear: Government provides the resources for basic science, and universities supply the talent, the training, and the commitment. The results inspire innovation, private investment, and further research and development, generating new products, new industries, new jobs, and better lives on a large scale.

Indeed, a short walk from my office, I can see the physical embodiment of this process in Cambridge’s Kendall Square, which has been transformed in recent decades from an aging industrial landscape. First, it became an informal gathering place for young scientists from MIT, Harvard, and Boston’s great medical centers excited by molecular medicine and gene engineering, then the site of academic research centers focused on cancer, genomics, neuroscience, and biomedicine and a hotbed for start-ups in the biosciences. Now it is a home for large companies as well, in biotechnology, pharmaceuticals, information technology, and energy. Once dominated by shuttered candy factories and empty pavement, Kendall Square has been reborn as the biotech capital of the world, one of the most innovative square miles on the planet. Much of the work on the government-funded Human Genome Project took place in the area, and according to the Battelle Memorial Institute, a nonprofit research-and-development organization, the $14.5 billion spent on that effort between 1988 and 2012 has helped generate an estimated $1 trillion in economic impact and more than four million job-years of employment. 

Yet despite the remarkable success of the U.S. innovation economy, many players in both government and industry have been pulling back from the types of bold long-term investments in fundamental science that could seed the great companies of the future. The entire innovation ecosystem is becoming more shortsighted and cautious. And by failing to invest sufficiently in basic research today, Washington risks creating an innovation deficit that may hobble the U.S. economy for decades to come. This concern has become acute since the White House released its budget blueprint, which proposes crippling cuts to science funding. Now more than ever, the fate of this crucial national investment depends on Congress.

THAT USED TO BE US

While other nations are investing vigorously in scientific discovery, total research-and-development spending in the United States, both private and public, has stagnated in recent years. Between 2008 and 2014, the entire U.S. research-and-development enterprise grew by just over one percent annually in inflation-adjusted dollars.

Most concerning, however, is the decline in federally supported research. Between 2009 and 2015, federal spending on research and development of all kinds decreased by nearly 20 percent in constant dollars. Universities suffered the longest downturn in federal support since the NSF began keeping track in 1972, and that has caused a great deal of promising work to stall—just when groundbreaking new tools, such as the LIGO detectors and CRISPR-Cas9 genome editing, have opened up enormous opportunities for new discoveries. 

Such underinvestment in research and development is not merely a temporary effect of the Great Recession. The federal government now spends a significantly lower percentage of GDP on research than it did in the 1960s and 1970s and has particularly shortchanged essential fields such as the physical sciences, mathematics and computer science, and the environmental sciences. The result has been a shift over time in the source of the majority of research-and-development investment from the federal government to industry.

Industrial research and development is necessary and valuable, of course. But with some exceptions, it tends to focus on relatively narrow questions directed at specific commercial outcomes. Only about six percent of industry funding goes to basic research—to projects designed to expand humanity’s store of knowledge rather than pass tests of immediate usefulness. This is understandable. Basic research is curiosity-driven, and the short-term returns from it are often not obvious. Yet we cannot do without it, because it is from such fundamental explorations that the world gets the startling breakthroughs that create entirely new industries.

Unfortunately, the United States’ great corporate laboratories, such as Bell Labs and DuPont Central Research and Development, once hubs of both fundamental and applied science, are largely a thing of the past. As global competition intensified and firms lost their market dominance, funding such labs came to be seen as an extravagance. Since 1971, moreover, U.S. corporations have been required to report their earnings quarterly, a change that has made it more difficult for managers to focus on long-term results.

There is, however, a true bright spot in the innovation economy. A new generation of digital industry leaders is now funding applied research into various blue-sky technologies, such as low-cost space rockets, autonomous vehicles, holographic computing, Internet-beaming drones, and flying cars. Some are even taking on long-term biomedical challenges, such as devising interventions for aging. But however impressive such efforts are, one must not mistake the fruit for the tree it grew from. Even Astro Teller, the head of so adventurous a corporate laboratory as Alphabet’s X, home of the fabled “moonshots,” notes that basic research is outside his purview. “The word ‘basic’ implies ‘unguided,’” Teller told The New York Times in 2014, “and ‘unguided’ is probably best put in government-funded universities rather than industry.” Yet many of X’s futuristic projects, Teller explained, “rely on the academic work of the last 30 or 40 years.” 

Universities have struggled to do their part. Over the past 40 years, they have doubled the share of academic research-and-development spending they provide themselves, to its highest level ever. They have found the money to invest steadily in new facilities, they continue to train the nation’s young technical talent, and they continue to drive economic development, gaining ever more patents, licensing new technologies, and incubating start-ups. But budgets are tight, and university resources are too limited to sponsor basic research at anything approaching the scale of LIGO.

Dr. Kip Thorne of Caltech discusses the detection of gravitational waves in Washington, February 2016.

LESS MONEY, MORE PROBLEMS

Why is U.S. government funding for fundamental scientific research drying up? In part because sluggish growth since the end of the last recession has made it difficult to justify funding projects that will offer no returns for decades to come. There is also a sense that other countries will reap the profits of U.S. investment in basic research without helping cover the costs. And there is a concern that innovation, in combination with globalization, is contributing to the erosion of jobs.

But the process of scientific progress and technological change will not stop because Washington refuses to participate. Moreover, the growth of innovation clusters such as those around Silicon Valley and Kendall Square suggests that there is indeed a home-court advantage to those places where discoveries are made and that businesses like to stay physically close to the source of important ideas. In such places, start-ups linked to university-based research stay in the neighborhood to absorb talent and knowledge and are often joined by larger, more established firms. 

And although an increasing percentage of Americans worry that science is forcing too much change on them too quickly, the route to rising incomes ultimately runs through new technologies. In 1987, the MIT professor Robert Solow was awarded the Nobel Prize in Economics for a growth model showing that rising real incomes depend largely on technological progress. Throttling back on investment in basic research is a way to increase economic insecurity, not reduce it, and threatens to shrink the country’s horizons in several ways.
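For readers who want the formal statement, the heart of Solow’s argument fits in two lines. What follows is the standard textbook presentation of the model in its labor-augmenting form, not notation taken from this article:

```latex
% Solow model: output Y produced from capital K and labor L,
% with A the level of technology ("labor-augmenting" progress).
\[
  Y(t) \;=\; K(t)^{\alpha}\,\bigl(A(t)\,L(t)\bigr)^{1-\alpha},
  \qquad 0 < \alpha < 1.
\]
% On the model's balanced growth path, accumulating capital alone
% cannot raise living standards indefinitely; income per worker
% y = Y/L ultimately grows only at the rate of technical progress:
\[
  \frac{\dot{y}}{y} \;=\; \frac{\dot{A}}{A}.
\]
```

In other words, within the model, sustained growth in income per person comes entirely from the technology term—which is precisely why starving the research that feeds that term is self-defeating.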

To start with, the United States’ lead in technological innovation could fall to global competition, just as the country’s domestic manufacturing base did, with major geopolitical and economic consequences. Cutting-edge science is equally vital to national security and the economy. Tellingly, other nations are already starting to catch up. As the United States’ research-and-development spending stagnated between 2008 and 2013, China’s grew by 17 percent annually, and South Korea’s, by nine percent. Chinese nationals now publish almost as many peer-reviewed scientific journal articles as Americans do, and the quality of Chinese research is rising rapidly. (For as long as the U.S. Patent and Trademark Office has tracked patents granted to universities, MIT has ranked first, followed by other distinguished U.S. universities, such as Stanford and Caltech. In 2013, Beijing’s Tsinghua University suddenly leapt ahead of Stanford.)

Further cuts in research budgets will discourage the cultivation of desperately needed young scientific and engineering talent. This is not merely an academic issue, because a high proportion of U.S. science and engineering Ph.D.’s go into industry. As a result, universities have a significant role in training the most sophisticated talent for U.S. businesses, and a crucial feature of U.S. graduate education in science and engineering is the involvement of students in cutting-edge academic research. Projects such as LIGO show graduate students that they can pursue the boldest of ideas, leading to further innovation down the road.

Continuing to starve basic research will also hamper the country’s ability to attract top global talent, adding to the discouraging effect of recent restrictions on immigration. U.S. universities have long been a magnet for the world’s most brilliant people, as both students and faculty. All six of the 2016 American Nobel laureates in science and economics were immigrants, for example, as have been 40 percent of the American Nobel laureates in chemistry, medicine, and physics in this century. At MIT, more than 40 percent of both the graduate students and the faculty were born outside the United States—including the Venezuelan-born author of this article. As research funding dries up, so, too, will the influx of foreign talent.

Fewer federal dollars will also reduce the diversity of the entire U.S. research enterprise. While philanthropic support is important and can focus resources and attention on particular areas of research at particular institutions in ways that may yield rapid results, it cannot substitute for the broad base of federal investment. The National Institutes of Health alone spends over $30 billion on medical research every year; imagine how many relentlessly generous billionaires it would take to match that. Furthermore, although some philanthropic funding goes to university research, the majority of it is directed to nonprofit research institutes, which, unlike universities, are not refreshed by a steady stream of new students and junior faculty. Because universities are forever young, they are uniquely creative.

Declining public investment in science is linked to another emerging threat: a less patient system of private investment to carry discoveries through to commercialization. From the 1960s through the early 1990s, federal investments in education and research produced well-trained young scientists and engineers who generated brilliant ideas. Big companies with big internal research-and-development operations would then hire many of those people, develop their ideas, and deliver them to the marketplace. When I joined MIT’s electrical engineering faculty in 1980, that model was working well, translating discoveries from university labs across the country into market-ready innovations. 

By the 1990s, however, as American corporations curtailed their own internal research operations, scientists and engineers were left with only one avenue to bring their innovations to market: seek risk capital and launch a start-up. Venture capital investment is typically not patient, however, and it has gravitated disproportionately to digital and biotechnology start-ups that offer a quick path to profitability or to the potentially outsize rewards of blockbuster therapeutics. Venture capital investment has not worked as well for many tangible products based on new science and technology, including sorely needed new energy technologies, which may require capital-intensive infrastructure and involve novel manufacturing processes that will take time to develop.

A Tesla factory in Fremont, California, October 2011.

DANGER, WILL ROBINSON!

The future of U.S. scientific, technological, and economic innovation depends on increased federal funding for basic research and increased effort by the private sector to move new technologies into the marketplace. In 1964, at the height of the Cold War and the space race, federal spending on research and development came to 1.9 percent of GDP. Today it is less than half that—even in the face of threats such as terrorism, cyberattacks, climate change, and potential pandemics. Given these challenges and the ratcheting up of international competition, a recommitment to U.S. leadership in science and innovation is critical. 

More must also be done to ensure a steady progression from ideas to investment to impact. Many universities have created incubators and accelerators to support start-ups emerging from their laboratories. At MIT, we are particularly concerned about the fate of “tough technologies” in fields such as clean energy, manufacturing, robotics, biotechnology, and medical devices—promising ideas that could yield game-changing answers to enormous challenges but whose commercialization is too time- and capital-intensive to attract risk capital or strategic investment from a large corporation. To help such technologies reach the marketplace, we recently launched an enterprise we call The Engine. It will support up to 60 start-ups at a time by offering them affordable space near the MIT campus, access to specialized equipment and technical expertise, and patient capital through a venture capital investment arm relying on private funds. If this and similar projects elsewhere succeed, they could unleash waves of innovation that benefit everyone.

The benefits of public investment in science and technology, finally, must be broadly shared by the citizens who shoulder the cost, and the economic and social disruptions triggered by the resulting advances must be addressed with systems that offer continuous training and retraining to American workers throughout their professional lives. Increasingly smart and nimble machines will eventually radically alter the workplace. Stopping such technological progress is impossible—so rather than wish the problem away, the public and private sectors should focus on helping people adapt successfully.

As soon as the world heard the first chirp signaling a gravitational wave emanating from black holes 1.3 billion light-years away, it was clear that the LIGO project was a triumph and would usher in a new kind of astronomy that would reveal new truths about the universe. LIGO shows that the United States still knows how to do truly bold science and do it well. But the breakthroughs today were built on the hard work and generous funding of past generations. If today’s Americans want to leave similar legacies to their descendants, they need to refill the research pipelines and invest more in the nation’s scientific infrastructure. If they don’t, Americans should not be surprised when other countries take the lead.
