We seem to be living in the golden age of innovation. From quantum computing and gene editing to self-driving cars and blockchain, technology is increasingly pushing the limits of what is possible.
However, we are also living through a prolonged period of low growth in productivity, which economists define as how efficiently we create goods and services.
How is that possible, especially given all of our engineering prowess and whiz-bang technology? We can instantly find information about anything by Googling it. We have smartphones that use GPS to get us to a destination in the shortest amount of time. We can tell Alexa to turn on televisions and washing machines and turn off alarms and lights.
The falling cost of computing power alone has allowed tech startups to quickly disrupt industries and scale into unicorns worth billions of dollars in just a few years' time.
But the data is clear. From the fourth quarter of 2007 to the third quarter of 2016, labor productivity grew at an annualized rate of just 1.1 percent, according to data from the U.S. Bureau of Labor Statistics (BLS). For comparison, the average growth rate over ten business cycles dating back to 1941 was 2.5 percent.
For the United States to catch up with historical trends, we would need to generate annual growth rates of labor productivity that exceed 7 percent. That’s unlikely to happen.
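To make those growth figures concrete, here is a minimal sketch of how an annualized (compound) growth rate is derived from two productivity index values. The numbers below are hypothetical round figures chosen to illustrate the arithmetic, not actual BLS data.

```python
def annualized_growth(start_index, end_index, years):
    """Compound annual growth rate implied by two index values `years` apart."""
    return (end_index / start_index) ** (1 / years) - 1

# Hypothetical example: a productivity index rising from 100 to about 110
# over the 8.75 years between Q4 2007 and Q3 2016 works out to roughly
# 1.1 percent per year.
rate = annualized_growth(100.0, 110.05, 8.75)
print(f"{rate:.3%}")
```

The same formula shows why small differences in annual rates compound into large gaps over a decade: 2.5 percent a year turns 100 into about 124 over the same span, versus about 110 at 1.1 percent.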
Why does this matter? Low productivity means U.S. workers have to work more hours to produce the same level of goods and services. In other words, we have to work harder just to retain what we already have.
“The historically low rate of labor productivity growth during the current business cycle has limited gains in living standards for Americans during this period,” according to a report by the BLS. “In addition, low productivity growth has limited potential gains in worker compensation and in shareholder profits.”
Hard to measure innovation
Let’s start with the fact that we don’t really know how to measure innovation well or its economic impact on society.
The most obvious place to look is patents. Universities and companies normally protect their inventions by filing for patents with the U.S. Patent and Trademark Office (USPTO). In 2017, U.S. universities and colleges were granted 7,459 patents, a 30 percent jump from 2013, according to the latest data from the Association of University Technology Managers (AUTM).
Last year, the Bloomberg Innovation Index ranked the United States 4th out of 50 countries in terms of patents.
But patents don't necessarily tell you much. Universities and companies sit on a lot of patents without ever doing anything with them. And many patents cover only minor or incremental innovations, not the kind that really move the needle.
What about research and development? American corporations spend billions of dollars on R&D every year.
From 2005 to 2015, business R&D spending soared 78 percent to $355 billion, according to the National Science Foundation's National Center for Science and Engineering Statistics.
Yet research shows that companies are not getting as much bang for their buck.
Anne Marie Knott, a professor of strategy at Washington University’s Olin School of Business in St. Louis, analyzed the performance of publicly traded companies since 1972, measuring productivity by comparing increases in R&D spending with increases in annual revenue.
She found that corporate returns on R&D spending actually declined 65 percent.
“R&D has not been as productive as it was four decades ago,” Knott told the San Francisco Chronicle.
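Knott's actual measure, the "research quotient," comes from a firm-level production-function estimate. As a rough, hypothetical illustration of the underlying idea, one can compare the percentage change in a firm's revenue to the percentage change in its R&D spending; the figures below are invented for the example.

```python
import math

def rd_elasticity(rd_prev, rd_curr, rev_prev, rev_curr):
    """Toy measure: percent change in revenue per percent change in R&D
    spending, in log-ratio form. Values well below 1 mean revenue grew
    much more slowly than R&D spending did."""
    return math.log(rev_curr / rev_prev) / math.log(rd_curr / rd_prev)

# Hypothetical firm: R&D spending doubles, but revenue rises only 20%.
print(round(rd_elasticity(10.0, 20.0, 100.0, 120.0), 3))
```

A declining value of a measure like this over time is what "less bang for the buck" looks like in the data: each additional R&D dollar buys less revenue growth than it used to.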
Bronwyn H. Hall, a UC Berkeley professor of economics, wrote a paper that argued that while we can measure R&D’s impact on the revenue of individual companies, we haven’t yet figured out how to do the same thing with companies’ productivity as a collective group.
More importantly, she said, we don't know how to quantitatively measure how innovation impacts social good. In other words, how does technology change the way we distribute goods and resources so as to maximize the public good?
Businesses not always ready for innovations
A recent paper by the National Bureau of Economic Research offers a possible answer to our productivity problem.
Technologies like the Internet and AI are "the defining technologies of (our) times and can radically change the economic environment," the paper said. "They have great potential from the outset, but realizing that potential requires larger intangible and often unmeasured investments and a fundamental rethinking of the organization of production itself."
In other words, it takes much more than money for innovation to really impact the economy. It also requires companies to change their business models, hire and train workers, and rethink how they make and distribute products and services. These factors are harder to measure and require a longer period of time to show an impact.
But innovations of late have become so transformative that they often outpace companies’ ability to make the necessary adjustments to exploit these technologies.
Take the Internet. Amazon emerged in the 1990s, but traditional brick-and-mortar retailers are still struggling to figure out e-commerce. Witness the recent closings of Payless Shoes, Circuit City, and Borders.
Newspapers are another good example. The Internet severely reduced the amount of advertising dollars that go to printed publications. But newspapers and magazines have yet to figure out a business model to replace that lost income.
The biggest challenge for companies is arguably AI. How will automation impact the workforce? What skills will employees need? How will AI eliminate some professions but create new opportunities?
The bottom line: universities and companies create technologies that can do amazing things. But figuring out how to use them to boost productivity and the economy at large is an entirely different matter.