In June 2024, at a national science and technology conference, Chinese President Xi Jinping said that the high-tech sector had become “the frontline and main battlefield of international competition, profoundly reshaping the global order and the pattern of development.” He is, of course, absolutely right. The United States and China compete for economic, military, and diplomatic dominance through the development of new technologies, including those with both military and civilian applications.
China is an increasingly formidable rival on this front. Since announcing the “Made in China 2025” plan in 2015, Beijing has pursued a whole-of-government focus on advancing critical emerging technologies. Now, China is giving the United States a run for its money. In the fourth quarter of 2024, the Chinese automaker BYD surpassed Tesla in sales of battery electric vehicles. In addition to being bigger than Tesla, BYD is arguably more inventive, with vehicles that can slide sideways into parking spots and float during emergencies, and chargers that can replenish up to 250 miles of range in a mere five minutes—several times faster than Tesla Superchargers. The state-owned Commercial Aircraft Corporation of China also intends to rival U.S. leaders in aerospace manufacturing; this March, the company released plans for a long-range supersonic jet that produces sonic booms no louder than a hairdryer. Also in March, Beijing sent quantum-encrypted images to South Africa using a small, cheap satellite—an enormous advance in quantum communications. Chinese biotech companies are competing with their U.S. counterparts in creating new drugs. And as the energy demands of artificial intelligence make fusion power—a potentially massive source of carbon-free electricity—even more desirable, China has more new public fusion projects, fusion patents, and fusion Ph.D.s than any other country.
Much of the U.S. government response to this increasing competition in recent years has been protectionist, including tariffs on electric vehicles, curbs on Chinese investments in strategic sectors, and export controls on the GPU chips and chipmaking equipment used for advanced artificial intelligence. But the success of the Chinese AI company DeepSeek, spun out of a Chinese hedge fund, has made clear that this approach is ultimately futile. In January, DeepSeek launched a high-quality AI tool that it developed without access to the enormous number of high-end GPUs thought to be required for such a model. Sooner or later, China is going to invent its way around whatever roadblocks Washington imposes.
That’s why it is so important that the United States not let up on its own innovation. When the government in Beijing decides that China must lead in a certain technology, resources are not an issue, and neither is short-term profitability. Washington, on the other hand, traditionally respects market forces and opposes government-led industrial policy. On the battlefield of technology, Americans must both continue to do what they do best and find new ways to improve competitiveness.
Since World War II, the United States has regularly created and commercialized groundbreaking technologies. But that success should not be taken for granted. Through its recent initiatives to cut federal funds for university research, the Trump administration risks draining a crucial source of new ideas for industry and the military, even as the geopolitical threats the country faces continue to grow. To avert scientific and technological stagnation, the United States must significantly increase public investments in university-based research, ensure that it capitalizes on discoveries that emerge from academia, and devise sensible immigration policies that allow the world’s best students to study and then work in the United States. Right now, however, the administration seems hell-bent on damaging, rather than fostering, this crucial source of American strength.
THE MOTHER OF INVENTION
One thing the United States has done best over the past eight decades is invent foundational technologies. The wellspring of that innovation is very often U.S. research universities. Many of the most significant technologies of our day—including the Internet, the artificial neural networks that enable generative AI, quantum computing, nucleic acid sequencing, DNA amplification, CRISPR genome editing, mRNA vaccines and therapeutics, 3D printing, and checkpoint inhibitors for cancer treatment—arose from pioneering explorations in U.S. university laboratories. These university-based discoveries and inventions then led to the creation of startups or were taken up by existing tech companies, which invested in them and developed them further to bring them to market.
The best innovation tends to occur where the best science occurs. In other words, science advances knowledge, and that new knowledge creates the ideas, tools, and processes that enable and accelerate innovation—which in turn advances knowledge further. As of 2021, the United States still invested far more than any other nation in the conduct of basic scientific research. Universities were by far the largest performers of such research, and the federal government was its largest supporter. The spillover effects for the U.S. economy have been enormous.
A May 2023 analysis by the Federal Reserve Bank of Dallas found that U.S. government support for nondefense research and development has accounted for at least one-fifth of total factor productivity growth in the U.S. business sector since World War II—a far greater return than federal investments in infrastructure have yielded or than private R & D has produced. (Most industry research is inevitably more focused on narrower questions with nearer-term commercial benefits.) But despite the centrality of university-based research to the United States’ high-tech economy and the federal government’s role in such research, in recent decades government support has become increasingly lackluster. Although the dollars spent have increased in real terms, as a percentage of the federal budget, R & D has fallen from over ten percent in the mid-1960s, when the United States was competing with the Soviet Union, to a meager three percent today, when the United States is facing a much more adept competitor in China. And under the current administration, the funding devoted to research is likely to be cut dramatically.
As the federal government’s share of academic research funding has declined—from 61 percent in 2012 to 55 percent in 2021—U.S. universities have increased the share of their own funds spent on research, including endowment income, from 21 percent in 2012 to 25 percent in 2021. But income from even the largest endowment cannot replace the loss of federal funds to academic R & D, which amounted to nearly $60 billion in fiscal year 2023. In 2021, the United States ranked 23rd among 32 nations reporting to the Organization for Economic Cooperation and Development in terms of academic spending on R & D as a percentage of GDP.
The 2022 CHIPS and Science Act was designed to correct some of this underinvestment, with $200 billion authorized for R & D and workforce and economic development. The budget of the National Science Foundation, which supports nonmedical academic research in the United States, was supposed to double by 2027. Instead, Congress never fully appropriated the funds, and the agency’s budget was cut in 2024 and kept flat this year.
China, in contrast, announced earlier this year a ten percent increase in its central government science and technology spending and an increased focus on basic research. Many of Beijing’s political leaders earned degrees from Tsinghua University, often referred to as “China’s MIT.” These officials understand science and technology and their impact on everything else. As a result, Chinese leaders view universities as key to the country’s “national rejuvenation” and technological self-reliance, and they have tripled the country’s number of higher education institutions since 1998. Over the past two decades, China has produced more Ph.D.s in STEM fields than the United States, and in 2016, China exceeded the United States in research publications for the first time. China is not merely increasing the scale of its inputs to innovation but also their quality. In the 2016 Nature Index, which tracks scientific output, five of the world’s top ten academic institutions generating high-quality research were American and one was Chinese. In the most recent index, from 2024, the balance had flipped: eight of the world’s top ten were Chinese and two were American.
LOSING CRITICAL MASS
Today, the Trump administration is allowing scientific discovery and technological innovation to become collateral damage amid a culture war on universities. Vice President JD Vance explained the political impetus for upending U.S. universities very clearly in a February 2024 interview with The European Conservative: “We should be really aggressively reforming them in a way to where they’re much more open to conservative ideas.” But is the perceived liberal bent of universities a reason to sow chaos in a research system that is key to U.S. national competitiveness? If a researcher can find a way to prevent cancer or Alzheimer’s, it should not matter whether they are conservative or liberal.
In just a few months in office, the Trump administration has already managed to inflict a remarkable amount of damage on the country’s research enterprise—damage that will have lasting effects. This includes hollowing out research agency staffs and freezing the process by which grants are awarded. The administration has also canceled already-awarded grants deemed to be in violation of executive orders, such as those related to gender identity or diversity, equity, and inclusion, or held by disfavored institutions such as Columbia University. Most sweeping are the structural changes to the funding system for university research. At the National Institutes of Health, the Department of Energy, and the National Science Foundation, the Trump administration has tried to cap reimbursement for the indirect costs of research—the costs of maintaining and operating buildings and providing information infrastructure for laboratories—at 15 percent, a rate that does not reflect the real costs shouldered by leading universities. Although the courts have halted this policy change thus far, if the administration is able to proceed, the country’s greatest research universities will be severely harmed. Trump’s proposed 2026 budget would starve U.S. science, cutting the budget of the National Institutes of Health by about 40 percent and that of the National Science Foundation by roughly 57 percent. Proposals to tax university endowment income at 14 or 21 percent—or to take away universities’ tax-exempt status—would hobble those universities hoping to make up some of the difference with their own funds.
The Trump administration is imposing costs not only on universities’ budgets but also on their recruitment. The United States has long benefited from an enormous brain gain, with the most talented scientists and engineers around the world coming to U.S. research universities to teach and to learn. But with its funding cuts, academic censorship, and hostile immigration policies, the Trump administration is provoking a brain drain. Three-quarters of the respondents to a recent poll of U.S. researchers by the journal Nature said that they were considering leaving the United States because of the Trump administration’s disruptions to science. European universities are now gladly recruiting that U.S. scientific talent; research centers in cities including Barcelona and Madrid are reporting dozens of applications from U.S. scientists. Promising and distinguished researchers of Chinese origin in fields essential to U.S. competitiveness—artificial intelligence, robotics, mathematics, and nuclear fusion—are leaving leading U.S. research universities to return to China. This outflow accelerates an exodus of Chinese-born scientists that began during the first Trump administration, when U.S. academics of Chinese descent were unfairly targeted for prosecution under the Department of Justice’s China Initiative.
Freezes and cuts in research funding have also had an immediate impact on the next generation of talent. Research universities are limiting the number of graduate students they admit and postdoctoral researchers they hire and are even rescinding offers they already made. The National Science Foundation has cut the number of graduate fellowships it offers in half. In a survey of postdoctoral researchers conducted by the National Postdoctoral Association at the end of the first six weeks of Trump’s second term, 43 percent of respondents said that their position was “threatened,” and 35 percent said that their research was “delayed or otherwise in jeopardy.” Some of these young people may be driven out of science entirely.
The detention and potential deportation of international graduate students and the revocation of student visas, sometimes without explanation, are likely to make the United States a much less desirable destination for the world’s best students and therefore to weaken American leadership in emerging technologies. Nationwide, international students earn 64 percent of doctorates in computer and information sciences, 57 percent of those in engineering, and 54 percent of those in mathematics and statistics. The United States clearly could do a better job of developing homegrown talent for these fields, but it is important to recognize how much the country gains by attracting brilliant people from around the world. The overwhelming majority of international doctoral students educated in the United States intend to stay after earning their degrees, including more than three out of four doctoral recipients from China. And these students contribute to the U.S. economy; the National Foundation for American Policy’s most recent analysis found that 25 percent of U.S. billion-dollar startup companies have a founder who came to the country as an international student. But increasingly, the best international students have other choices. Tsinghua University and Peking University now rank 12th and 13th, respectively, in the Times Higher Education World University Rankings. Peking is rated first in the world for its AI research output, and Tsinghua is second.
Fraught confrontations at U.S. borders are now reportedly making foreign scientists hesitate to travel to scientific conferences in the United States. In March, for example, a French scientist who works in space research was detained and denied entry while traveling to a conference near Houston. U.S. officials claimed that he was turned away because he had confidential information from Los Alamos National Laboratory, but French officials said that he was denied entry because his phone contained messages critical of the Trump administration’s science policies. If the United States cannot even convene the world’s best scientists, it will struggle to preserve the open exchange and free inquiry that it has championed for so long—and that science thrives on.
The Trump administration seems to be taking U.S. leadership in science and technology for granted. Doing so would be a dangerous mistake. Americans are accustomed to U.S. companies delivering astonishing innovations with regularity, including the iPhone, cloud computing, the Tesla Model S, and ChatGPT. And there are certain aspects of U.S. history and culture that have encouraged inventiveness and risk-taking. But the United States did not become the world’s leading scientific and technological superpower because its people are somehow innately smarter and more creative than those in the rest of the world. It became a leader because it has had the world’s best system for science and innovation—a system that is now under attack from the Trump administration.
ENGINES OF GROWTH
The modern research university is a German invention, dating back to the early nineteenth century, when the intellectual founders of the Humboldt University of Berlin argued for linking teaching and research, for expanding academic freedom, and for the idea that research should be pursued without regard to immediate utility. They believed that the state should support such explorations—but not direct them. This model was so appealing that, in the nineteenth century, it was Germany that welcomed the world’s best and brightest: about 10,000 U.S. scholars earned their doctorates or other advanced degrees at German universities. Some of those German-educated Americans founded the first U.S. research university, Johns Hopkins, in 1876.
The Massachusetts Institute of Technology, the university where I work and served as president, began operating in 1865 on a different, polytechnic model. It focused on applied science and engineering rather than on theory, aiming to produce technically trained graduates for a young, industrializing country. MIT did not initially have the funds or the interest to do much research, but in the early years of the twentieth century, after taking an academic tour through Germany, MIT President Henry Pritchett returned with the conviction that MIT should do more than teach. He established the university’s first major research laboratories.
MIT quickly became a powerhouse in applied research conducted to benefit U.S. industry and society, often in partnership with the leading U.S. companies of the day, including AT&T and General Electric. The federal government was not yet in the business of funding university research, so industry support was a matter of financial survival; the experience of working with industry also made MIT particularly adept at moving its inventions into the marketplace. By the 1920s, MIT’s leaders and alumni began to worry that a commercial focus was limiting the university’s reach. In 1930, the institute recruited as its president the nuclear physicist Karl Compton. Compton had argued in 1927 that university research should not be focused merely on industry; he believed that universities were, in fact, the only places where pure science could be investigated without the pressures of commercialization. He also advocated that such research be funded by taxes on any enterprise that profited from science.
It was not until World War II that the federal government devoted a large quantity of tax dollars to university-based research, thanks to the pioneering engineer Vannevar Bush. Bush, while on the faculty at MIT, designed and built a groundbreaking analog computer. He also had a fantastic mind for science policy. After Germany invaded Poland in 1939, Bush persuaded U.S. President Franklin Roosevelt to create an organization to oversee research of interest to the military. In 1940, Roosevelt established the National Defense Research Committee with Bush as its leader. (Its power later expanded as it became the Office of Scientific Research and Development, with Bush still in charge.) MIT President Compton, a member of the committee, was put in charge of identifying technologies to detect German aircraft and ships—a task that led to the founding of the Radiation Lab at MIT. The laboratory’s name was chosen to deceive the Nazis: it focused not on radioactive materials but on microwave radar systems, a technology that was arguably more important to the outcome of the war than even the atom bomb. Leading scientists, a number of whom would go on to win Nobel Prizes, were recruited to the Rad Lab from around the country. Over the next five years, the lab developed more than 100 radar systems to counter the threat of German U-boats and V-1 flying bombs.
With federal funding, universities across the country were able to devote time and effort to wartime projects. The University of Chicago, Columbia, and the University of California, Berkeley, conducted initial research for the Manhattan Project, and many leading universities lent their talent to it. The California Institute of Technology worked on rocketry. Harvard researched how to use sonar against Nazi submarines and how to muffle noise in long-range bombers, contributing to the development of fiberglass. In 1942, Johns Hopkins launched its Applied Physics Laboratory, which developed the proximity fuse—another crucial technological innovation for the Allied victory—and which later, during the Sputnik era, developed the concept for GPS.
After Germany surrendered, Bush presented a landmark report to U.S. President Harry Truman titled Science: The Endless Frontier, in which he argued for the continuation of federal support for university-based research. Bush cited how decisive government-funded science had been to the Allied victory, including the development of scalable penicillin production, which saved many lives. And, as he pointed out, the United States could no longer rely on a “ravaged Europe” for fundamental discovery science as it had in the past. Bush’s argument for the peacetime federal funding of research and scientific education was not only about national security and public health but also about economic growth. As he saw it, by continuing to “study nature’s laws,” the United States could create new manufacturing industries and expand old ones—an assessment that turned out to be prescient. Just over a decade later, the MIT economist Robert Solow published groundbreaking research establishing that modern economic growth depends on technological advancement, not exclusively on capital and labor as the classical paradigm had held—work for which he would win a Nobel Prize.
Bush had a linear view of innovation: the federal government would give universities funds for basic research projects inspired by curiosity rather than profit. These projects would train brilliant students and produce discoveries; industry would then develop those discoveries and find practical applications for them.
Although Bush’s ideas were not implemented precisely as he envisioned, the federal government continued to support university research after the war, helping turn the United States into the world’s dominant scientific and technological power, producing a unique concentration of world-class research universities, and making the country a magnet for the best STEM talent from around the world. Under this model, leading U.S. research universities both supported existing industries and became hotbeds of entrepreneurship themselves. In 2011, a study at Stanford University calculated that since the 1930s, its alumni and faculty had started nearly 40,000 companies that employed 5.4 million people and generated $2.7 trillion in annual revenues—a sum that, if it were a national economy, would rank among the world’s ten largest. A similar MIT analysis found that as of 2014, MIT’s living alumni had founded over 30,000 companies that employed 4.6 million people and generated nearly $2 trillion in annual revenues. The Bush model could certainly use some updating—and expansion—for a world in which China is pulling ahead of the United States in science and technology. But it is hard to see anything in this model that demands to be torn down.
KEEPING THE LEAD
Most Americans today were born long after Bush and The Endless Frontier. Throughout their lives, the United States has dominated in science, technology, industry, innovation, and culture, and they may assume that this is the natural order of things. But if the United States can no longer afford to conduct the productive explorations enabled by government investment, it will lose the technological race with China. Its military will suffer, because it depends on technologically advanced commercial products that the defense market alone cannot support. The country will also see less high-tech entrepreneurship. A 2021 study published by the National Bureau of Economic Research confirmed the strong connection between increases in federal research support for a university and the formation nearby of startups with significant potential for growth. Another paper, published in 2020, found that university researchers who experience large cuts in federal research funding are 80 percent less likely to launch a high-tech startup.
Shoring up U.S. leadership in scientific innovation will require three things. First, the country must make public investments in university-based research that are commensurate with the geopolitical threats it faces. This should include “use-inspired” basic research, which takes place at the frontiers of science but is directed toward overcoming particular obstacles to U.S. economic or national security, as was the case at MIT’s Rad Lab. Appropriating the funds already approved by Congress for the “science” part of the CHIPS and Science Act would be a terrific start. Second, the United States will need to design immigration policies that allow its research universities to continue attracting the world’s best science and engineering students and that allow those students to contribute to U.S. competitiveness by remaining in the country after they finish their education.
Finally, the country needs to do a much better job of capturing the value of the discoveries and inventions made at U.S. universities, rather than allowing a lack of patient capital to drive critical technology development and manufacturing elsewhere. For example, a company named A123—spun out of MIT Professor Yet-Ming Chiang’s laboratory in 2001—was the first to commercialize a superior lithium-ion battery chemistry for electric vehicles. But because the U.S. market for EVs was not sufficiently developed for the company to become profitable, A123 declared bankruptcy in 2012 and was bought by a Chinese auto parts company. Today, China dominates lithium-ion battery manufacturing. This is a foundational enabling technology of the kind the United States should have supported until it could stand on its own in the marketplace.

At MIT, about a decade ago, I became deeply concerned about this very issue—that some of the most groundbreaking science-based inventions emerging from our laboratories, despite their huge potential to benefit society, were not advancing to commercialization. The timeline to market for risky new technologies in fields such as regenerative medicine and clean energy is simply too long for most private investors—there is a reason that 39 percent of U.S. venture dollars go to software startups and just two percent to energy startups. So MIT created a combination accelerator and patient venture capital fund, called “The Engine,” for such “tough tech.” The Engine offers initial support to startups that have included Commonwealth Fusion Systems, a company that uses high-temperature superconducting magnets to develop small, low-cost fusion power systems. In December, the company announced that it will build the world’s first grid-scale commercial fusion power plant in Virginia and expects to have it running by the early 2030s. But The Engine is just one investment organization—and the United States needs many more. Although the CHIPS and Science Act authorized the National Science Foundation to help launch similar organizations, Congress has not yet funded this endeavor.
American universities are not perfect. But many of them are extremely successful institutions by global standards, and the country depends on them. To defund universities because of faults that have nothing to do with research is to recklessly shut off the spigot of innovation.
Just as the center of gravity in science and technology moved away from Europe to the United States during the twentieth century, it can also move to Asia in the twenty-first. The economies of Japan, Taiwan, and South Korea approach or surpass the United States in the proportion of GDP they devote to R & D, and China is working hard to catch up. India, which ranks third in the number of research publications produced globally, is poised to advance in science by the end of the decade (as it also becomes the world’s third-largest economy). U.S. government policies that fail to comprehend the importance of advancing science and technology are hastening this transition.
The United States, once solidly on the frontlines of technology, is now on its way to becoming a much weaker player. And so far, it is responding to this decline by taking steps that will only weaken it further. There has never been anything inevitable about U.S. leadership in science and technology. What is inevitable is that if Washington does not work to maintain its lead on this battlefield, others will take its place.