Why Corporate America Gave Up on R&D
American companies used to be at the cutting edge of science and technology. Not anymore.
American businesses have had a long history of being at the cutting edge of technology. Corporate labs at GE, DuPont, and AT&T’s Bell Labs were responsible for significant advances in science and technology in the 20th century, leading the development of innovations like integrated circuits, plastics, and synthetic fibers, which in turn became strong drivers of economic growth.
But are American businesses still as innovative as they used to be? A recent paper by professors at Duke University’s Fuqua School of Business argues that a transition away from formal corporate research toward a more diffuse innovation ecosystem driven by startups and universities has led to a decline in American innovation and economic growth.
Marker spoke with Fuqua professors Ashish Arora and Sharon Belenzon about the death of corporate research labs, how innovation works in the new ecosystem, and why it’s become harder to solve complex problems, including those posed by Covid-19.
This interview has been edited and condensed for clarity.
Marker: What has changed about innovation in American business in recent decades?
Ashish Arora: The research investments of early 20th-century American industry were small, uneven, and concentrated in large firms. To some extent this reflected the weakness of the American university system, which at that time was fairly utilitarian in its approach, providing services to farmers, small manufacturers, and miners in their states.
After World War II, we entered the golden age of American corporate research, with very significant investments from companies like AT&T, DuPont, GE, Kodak, and many others.
Starting in the mid-1980s, the pendulum swings back, and many of these companies start to withdraw from research, with a few exceptions, most notably Microsoft, which set up a very significant research operation. Today, there is very little corporate research, except in the fields of artificial intelligence and machine learning, where the likes of Google and Facebook have significant investments, and biopharmaceuticals to some extent. There are publicly listed biotech companies that are still investing in research, but on the whole, the American corporate sector has continued its long process of withdrawal from research.
In the broader ecosystem, however, we see American universities producing large amounts of research and large numbers of trained scientists, and we see a vibrant ecosystem with startups. This has led to a division of labor between universities and startups on the one hand, and incumbent companies using these inventions on the other. But this new ecosystem has some gaps. While there are places where this works well, there are sectors where it may not work too well.
What are those gaps?
Ashish Arora: Where I think we have trouble is with more complex innovations, like finding new semiconducting materials, for example. There might be a few startups working on that problem, but if the big companies like IBM, Intel, and Google don’t make significant investments, it’s hard to imagine that too many breakthroughs will happen. The reason is partly scale and partly the scope of the problems.
“Companies are set up mostly to produce and deliver goods and services to the marketplace, not to have activities running that have no defined deliverables, with horizons of four to six years, not six to 18 months.”
For instance, developing technological solutions to our complex climate problems will require hardware, chemistry, electronics, and more importantly, changes in regulations for implementation. Big companies deal with regulation on an ongoing basis, but it’s much harder for a VC-backed startup to base its business plan on persuading national regulators to change how they regulate devices.
Sharon Belenzon: This is the question that Ashish and I somewhat disagree on, which is: Does it make sense for companies to engage in science? I would say the answer is yes. Ashish tends to favor an ecosystem where companies and sectors are more specialized, and more aligned with different activities of the innovation process. But my interpretation of history is that the winners of the 20th century were the companies that realized very early that to be successful, they needed to be one step ahead of their competitors by developing a deep understanding of relevant natural phenomena. My concern is that American firms might lose their scientific edge, and as a consequence their dominant competitive position, if they rely only on other parts of the innovation system to develop the scientific underpinnings of their businesses.
Companies are withdrawing from research, and I don’t think you can compensate for that with science created in universities and small firms. More importantly, leading economists have argued that what makes America unique is the strong link between technology and science generated by a diverse capitalist system. The diversity of institutions that engage in R&D is key. I am worried that we are losing that diversity in return for greater efficiency through specialization.
What are the reasons behind that loss of diversity? Why do corporations no longer find it as desirable to engage in research?
Ashish Arora: There are two big factors. One is there are alternative sources of knowledge now, like the university system and startups, but the other big issue is that research is an unnatural activity inside a company. Companies are set up mostly to produce and deliver goods and services to the marketplace, not to have activities running that have no defined deliverables, with horizons of four to six years, not six to 18 months.
In the golden period of corporate research, the early successes like DuPont’s development of nylon generated a lot of goodwill. Because of those early successes, corporations agreed to keep funding their research labs. But at some point, that goodwill runs out.
“AT&T could invest in research for a while because they were effectively a regulated monopoly, where research was part of the social bargain, but when the monopoly stopped AT&T basically had to give up on their labs.”
As a case in point, look at Microsoft. Microsoft Research was at some point a large enterprise invested in basic research, and this was because Bill Gates had a personal passion for it. After Gates, Steve Ballmer, who was still a product of that initial system, kept it going. But when Satya Nadella came in as CEO, that was the beginning of the end of Microsoft Research as a big university-like system inside Microsoft, as he wanted it to be much more focused on getting new technology into the hands of consumers.
These things have natural life cycles. Given the pressures businesses face, research activity is difficult to sustain inside a typical company. AT&T could invest in research for a while because they were effectively a regulated monopoly, where research was part of the social bargain, but when the monopoly stopped AT&T basically had to give up on their labs.
So is there a sense within these larger businesses that they don’t need their own labs since they can outsource R&D work to the startup ecosystem and the universities?
Ashish Arora: We should not call it outsourced, because outsourced implies that I give you a contract and you do it for me, but there’s certainly a sense in which that is happening.
“R and D are two very different activities. D is a standard activity with manageable endpoints. R is the problem.”
We also shouldn’t call it R&D. R and D are two very different activities. D is a standard activity with manageable endpoints. R is the problem. Research typically is somewhere between 5% and 20% of R&D, so most of R&D is D, and D is cool. In a pharmaceutical company, development means running clinical trials. R is studying protein forms, or investigating what the active site of a disease is. How does that translate into a medicine for Parkinson’s? That’s the development problem.
In your paper, you talk about the difference between “technical uncertainty” and “commercial uncertainty” and how it affects innovation. Can you explain that?
Ashish Arora: In the pharmaceutical industry, the principal uncertainty is, “will it work?” Will a certain molecule or vaccine work as intended? That’s mostly about technical uncertainty.
If you think about digital apps, programming the app is not the problem. The problem is, “will people like it?” That’s mostly commercial uncertainty.
Our hypothesis is that the current system works well when you have one or the other kind of uncertainty. But if you have both uncertainties, then it gets much harder. The current division of labor doesn’t work quite as well to solve problems that have uncertainty along both dimensions.
How has this played out in the response to the coronavirus pandemic? Would you say that solutions involve a high degree of technical and commercial uncertainty?
Sharon Belenzon: Right now, in my opinion, we have insufficient scientific understanding of how different viruses behave, how they spread, how they stop, and how they interact inside our body. Clearly, from an economic standpoint, it would have made sense to invest whatever amount needed to develop such scientific understanding, as it is very unlikely this investment would be larger than the tremendous economic and human losses we have incurred.
So the real question is why haven’t we made this investment? In a nutshell, my view is that our innovation ecosystem has changed over the past three decades in a way that has made it much more dominated by profitability concerns. This is why we need governments. I am a big supporter of markets, but I also recognize their shortcomings. What is good for an individual firm is not necessarily good for society — and markets typically favor ventures for which there are clear applications.
Ashish Arora: I agree with Sharon that the science is hard, but the big problem is that the commercial side is very difficult. So think about testing or vaccines. Who wants to pay for a test? A lot of people who are going to fall sick don’t want to pay for the test, but we want them to get tested, so now we need to solve that commercial problem through some means. I don’t think it’s the kind of problem that can be solved with corporate research as we’ve been discussing, but I think our system of organizing innovation through the market will lead to problems like this.
Not surprisingly, in America, it was the Army that was funding malaria research for a very long time because our soldiers would go into tropical countries and fall sick to malaria.
Sharon Belenzon: This is a very good point: The government was the number one consumer of new technologies, especially in electronics.
“We have not thought enough about the difference between the government subsidizing research and actively participating in the market for downstream inventions that come from that research.”
Things changed after the Soviet Union collapsed. I think the race for a vaccine between us and China has the potential to become the next space race, but I don’t think that’s happened. Think about the prestige as well as potential life-saving consequences of bringing a vaccine to the world.
There’s lots of government support in the form of subsidies, but we have not thought enough about the difference between the government subsidizing research and actively participating in the market for downstream inventions that come from that research. Just giving you money to do whatever you want is different than asking you to set up a program that you know you have a very big customer for.
I think there’s strong evidence in history that corporations really can take on the big challenges of science when they can rely on the government to buy whatever they find. But markets by themselves usually are not sufficient to incentivize solving very uncertain problems.