
5 questions for Tyler Cowen, Michael Strain, Catherine Tucker, and Dietrich Vollrath on escaping the great stagnation

By James Pethokoukis, Michael Strain, Tyler Cowen, Catherine Tucker, and Dietrich Vollrath

When will the great stagnation end? Based on the historical record, can we be confident that the next round of innovations will boost productivity growth? And how will these innovations impact society beyond economic statistics? Recently, I explored these questions and more in a panel discussion with Tyler Cowen, Michael Strain, Catherine Tucker, and Dietrich Vollrath.

Tyler Cowen is the Holbert L. Harris Chair of Economics at George Mason University, and he serves as chairman and faculty director of the Mercatus Center. He is the author of several books, including 2011’s The Great Stagnation: How America Ate All The Low-Hanging Fruit of Modern History, Got Sick, and Will (Eventually) Feel Better. Michael Strain is the director of economic policy studies here at AEI, as well as the Arthur F. Burns scholar in political economy. And he’s the author of The American Dream is Not Dead: (But Populism Could Kill It), released last year. Catherine Tucker is the Sloan Distinguished Professor of Management Science and Professor of Marketing at MIT’s Sloan School of Management. She is also a cofounder of the MIT Cryptoeconomics Lab and a co-organizer of the Economics of Artificial Intelligence initiative. And Dietrich Vollrath is a professor of economics and the chair of the Department of Economics at the University of Houston. He is also the author of Fully Grown: Why a Stagnant Economy is a Sign of Success, released last April.

Below is an abbreviated transcript of our conversation. You can read our full discussion here. You can also subscribe to my podcast on Apple Podcasts or Stitcher, or download the podcast on Ricochet.

Pethokoukis: Tyler, you wrote a book called “The Great Stagnation” 10 years ago. What was your argument at the time, and how does it apply to today?

Cowen: When I wrote my book in 2011, I suggested that our previous technologies had, in some regards, run their course. If you take powerful machines and fossil fuels and put them together as a kind of general-purpose technology, we did everything we could with that. But I also predicted that this stagnation would end within the next 20 years. And I think the odds are that we’re now on the cusp of another revolution based on new general-purpose technologies, which I would define as some mix of the internet, computers, and computational power.

Strain: I am confident that the great stagnation is not permanent because of the innovation happening and new technologies being created. And while we haven’t found the best uses for all of those new technologies, I’m confident that at some point we will. But I’m not exactly sure when that will be — I think Tyler’s 20-year timeframe was a good estimate. And in fact, we just saw a great use of some new technology with the coronavirus vaccines, and that’ll have a huge impact on productivity.

We’ve come up with great new ideas in the past, but it’s always taken us a little while to figure out how to use them throughout the economy, and eventually they made big differences. So is the reason for this optimism mostly because that’s how it’s always worked before?

Tucker: Yes, and digital economists have spent the last 25 years trying to explain why all the technologies we study have been everywhere except in the productivity figures. We’ve got two major explanations. One is that we’re just not measuring it right. Google Maps, for instance, appears nowhere in the productivity figures. And two, we point to past examples like electricity and steam: That’s just the natural trajectory of general-purpose technologies. You expect 20–30 years of constant experimentation before a technology appears in the productivity figures.


Vollrath: I agree. If you look historically at most of these central general-purpose technologies, we think of them as arising immediately — “the moment electricity came around, everything changed.” But that was a decades-long change, to the point that we were electrifying rural America into the ’40s, ’50s, and even the ’60s. So the fact that we haven’t seen massive productivity growth out of computing, say, or the internet doesn’t necessarily mean it won’t come.

Tucker: The way I’ve always liked to think about it is we initially thought that electricity was about illumination, and the productivity impact of that wasn’t amazing. On the other hand, putting electricity into factories — that was a real productivity revolution.

Strain: I agree with all that, but my optimism is not driven by the vague sense that, “This is how it’s always happened in the past.” If you look at specific technologies — say, batteries, artificial intelligence, or vaccines — you can see very specific, concrete progress that you can point to. The real question is, when are businesses going to figure out how to make money using them?

Cowen: I actually think the new innovations will be unique in that a lot of them will not contribute that much to per capita GDP. For example, a vaccine against HIV/AIDS or malaria would be an incredible advance for humanity, but I don’t know how much it would show up in US per capita GDP or productivity. And the other new wave of innovation — green energy — is mainly helping us avoid a catastrophe. It’s boosting GDP relative to an awful counterfactual, but I’m not sure we’ll feel we have higher standards of living relative to before.

Vollrath: Yeah, we need to be conscious of the difference between a) technological optimism and technological advances in an exciting decade and b) how we measure GDP and productivity. The two separate more and more over time. GDP and innovation used to be much more tightly linked because innovations were producing things that generated demand for themselves. By contrast, a lot of innovations today are about replacing or removing an old product, so we maybe shouldn’t have our optimism keyed off of measured productivity or measured GDP statistics.

Strain: I want to agree and disagree. I agree that if we can create a malaria vaccine, that would be a game-changer for human welfare that wouldn’t show up in US productivity statistics immediately. But if we aren’t losing so much talent and skill to malaria, we might produce new generations of innovators and thinkers who could create things that increase productivity. My view is that, over a suitably long time horizon, the malaria vaccine would show up in the US productivity statistics.

If these innovations take a significant period of time to show up in data or people’s standards of living, do we risk having less societal tolerance for the disruption they cause? Do we need tangible signs of rising living standards to justify policies that promote growth and progress?

Vollrath: I think those innovations will eventually become tangible, whether it’s a malaria vaccine, AI, or even just the ability to do panels like this. But we’ve all also felt changes even over this stagnant last couple of decades, right? We live fundamentally different lives than we did when I was growing up, even without massive productivity growth.

Strain: I don’t think anybody’s looking at those statistics. Rather, I think they’re assessing how they feel about broader economic policy issues. They’re thinking about their own lives, and I really wonder if the pandemic is going to affect that. I mean, we just defeated the plague in a year using technology. So while it’s still too early to say, I wonder if there won’t be more warmth toward abstract notions of creative destruction and the importance of innovation as a consequence of what seems like a pretty stunning technological success.

Cowen: I’m worried that biomedical innovation will progress very quickly while the rest of the economy stays relatively static, meaning we become older as a society more quickly than expected. That could result in a lot more status quo bias and entrenchment — we could innovate ourselves into a tighter complacency and stagnation.

Strain: I worry about that with inequality — say a new, revolutionary cancer therapy is invented but is extremely expensive. We could have a situation where, as a result of innovation, the rich don’t die from cancer whereas the lower middle class does — something of that nature.

Tucker: So, in the digital economics space, we have a lot of studies exploring the impacts of digital technologies on inequality, and they reach many contradictory results. We’ve got your cancer example, but there are also papers showing that simple processes such as digitizing medical records actually improve outcomes for poorer patients (because doing so mitigates the tendency of doctors to pay more attention to the more highly educated). So I think for every bad implication for reinforcing inequality, you can also find some rays of hope as well.

Will these new technologies, such as artificial intelligence, create widespread unemployment?

Tucker: While AI is a recent development — and the evidence is scattered — the current results from multiple studies on the implementation of machine learning suggest that it will be a human-augmenting technology rather than a displacing one.

Strain: I’m not worried about technological advances replacing human workers for the same reason I’m confident that all of these technologies are eventually going to be used to make the economy more productive: There’s just too much low-hanging fruit for businesses to pass up.

Sure, advances in technology might lead to some businesses and industries using fewer workers — that’s what productivity is — but that’s just going to create a bunch of human capital for businesses to use in new and different ways. The same basic process drives the adoption of new technologies: Businesses can just make too much money with these new technologies to ignore them.

But I am worried about the transition period. Advances in technology have been reshaping the labor market for decades, whether in the distribution of employment across occupations or in the methods of producing manufactured goods. As a result, there’s much less employment in middle-skill, middle-wage occupations than there used to be, which has had enormous consequences for our society. And I expect that process is going to continue as businesses learn how to integrate newer technology. That is very disruptive, even if it does not spell the end of human work.

Vollrath: Also, it’s a good thing if we work less. So it will be interesting to see if we take advantage of some of these technologies to find a new equilibrium where we all ramp down our effort and enjoy more of our time.

That said, the real disruptions come when the costs are felt very heavily by a small group of people who are put out of work. So while, in the long run, AI might result in us working less, it’s hard to tell somebody, “Well, you guys lost your job today, but don’t worry — 30 years from now, everybody will be working 10 hours a week.”

If, in 10 years, it’s concluded that the great stagnation has continued, what would be the reasons you would attribute that to?

Cowen: I would say it was because we had some emergencies and we responded pretty excellently, but then we sank back into our sloth.

Tucker: Right, I was thinking of a version of that: There’s been an amazing adoption of digital technologies due to the pandemic — including, for example, the number of small businesses that now have online storefronts. Let’s be clear: This is a technology that’s been available for 15 years, but it was only necessity that made it top of mind. So, for many of our advances . . . are they going to be top of mind in the next 10 years?

Vollrath: I wonder if we’re willing to take the risks associated with innovations and with implementing them. We’ve seen this in the EU with the vaccine, where, even when the benefit of a technology is pretty clear, a very risk-averse policy setting works against adoption.

Strain: Regarding public policy, I worry that it’s preventing the emergence of new technologies — whether by not doing enough to support research and development or by unnecessarily holding back entrepreneurship and business investment.

There’s also a demand-side threat to innovation and productivity: A bunch of new technologies could be invented but never get incorporated into business practices or show up in productivity statistics. If we have less demand in the economy than we need in order to spur businesses to do creative and inventive things, then that’s a problem.

James Pethokoukis is the Dewitt Wallace Fellow at the American Enterprise Institute, where he writes and edits the AEIdeas blog and hosts a weekly podcast, “Political Economy with James Pethokoukis.” Tyler Cowen is the Holbert L. Harris Chair of Economics at George Mason University, and he serves as chairman and faculty director of the Mercatus Center. Michael Strain is the director of economic policy studies at AEI, as well as the Arthur F. Burns scholar in political economy. Catherine Tucker is the Sloan Distinguished Professor of Management Science and Professor of Marketing at MIT’s Sloan School of Management. Dietrich Vollrath is a professor of economics and the chair of the Department of Economics at the University of Houston.