I’ve written a number of pieces now detailing how progress in technology is impacting our lives. The bottom line is that remarkable strides in recent years have turned things that not long ago would have been science fiction into current realities. Many of these developments will have a direct impact on employment and the workforce. This isn’t a new phenomenon. That is what technology does. As long as there has been technology, it has altered the workforce.
The challenges posed by this became clear in the industrial revolution. The term “Luddite Fallacy” in economics refers to the belief that technological progress causes net job losses. Economists call it a fallacy because, historically, technology has created more jobs than it has destroyed. The name comes from a movement of English textile workers who smashed the textile machines that were taking over the kinds of work they had been paid to do. I would argue that at some point in technological advancement, the Luddite Fallacy must cease to be a fallacy. If one imagines sufficiently advanced technology that can do everything a human can do, but cheaper, there is clearly no point in employing humans anymore. So at some point, technological advance will destroy more jobs than it creates. The question is, have we crossed that point? Or are we nearing it, and, if so, what should be done about it?
There is some evidence that we have already crossed some type of tipping point. A standard piece of evidence for this comes from plotting productivity and compensation over time. For decades, these two values rose together. Recently, they split. Exactly when they split depends on which measure of compensation you use, but under every reasonable measure they diverge somewhere between the mid-1970s and 2000. Correlated with this is the fact that labor costs as a percentage of GDP have been in decline since the 1980s. One argument for why this is happening is that in the age of computing, it is increasingly beneficial to put money into upgrading machines instead of paying humans to do things.
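For readers who want to see the divergence for themselves, the sketch below shows one way to reproduce the plot from public data. It is a minimal illustration, assuming the pandas_datareader package and two commonly used FRED series (OPHNFB for nonfarm output per hour, COMPRNFB for real compensation per hour); other measures of compensation will shift exactly where the split appears.

```python
# Minimal sketch: plot productivity and real compensation, indexed to a
# common starting point, so any divergence between them is visible.
# Assumes internet access and the pandas_datareader package.
import matplotlib.pyplot as plt
import pandas_datareader.data as web

start, end = "1948-01-01", "2016-12-31"
productivity = web.DataReader("OPHNFB", "fred", start, end)
compensation = web.DataReader("COMPRNFB", "fred", start, end)

# Index both series to 100 at the first observation.
prod_idx = 100 * productivity / productivity.iloc[0]
comp_idx = 100 * compensation / compensation.iloc[0]

plt.plot(prod_idx, label="Productivity (output per hour)")
plt.plot(comp_idx, label="Real compensation per hour")
plt.legend()
plt.title("Productivity vs. compensation, indexed to 1948 = 100")
plt.show()
```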
Thanks to the new capabilities of AI and large-scale data analysis, more and more areas are becoming automatable. It is important to realize that automation isn’t just about machines doing the same things humans could do, only cheaper. It is about opening new possibilities that aren’t options with human workers. An example of this that I have seen is a robotic construction worker. It might not be any better than a human at putting the structure together, but unlike a human, it can precisely catalog the location of every 2×4, nail, pipe and electrical conduit. This means that with recent advances in virtual reality and augmented reality, the homeowners can look around five years later, see where those things are in the wall, and know where it is safe to drive a nail to hang something.
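To make that concrete, here is a toy sketch of the kind of catalog such a robot could build. Every name in it is hypothetical, not taken from any real construction system; the point is simply that once placements are recorded as data, a later AR application can answer questions a human framer never could.

```python
# Hypothetical catalog of components placed by a construction robot.
from dataclasses import dataclass

@dataclass
class PlacedComponent:
    kind: str    # e.g. "2x4 stud", "nail", "pipe", "conduit"
    x: float     # position in meters, in the building's coordinate frame
    y: float
    z: float

class BuildCatalog:
    def __init__(self):
        self.components: list[PlacedComponent] = []

    def record(self, component: PlacedComponent) -> None:
        self.components.append(component)

    def near(self, x: float, y: float, z: float, radius: float) -> list[PlacedComponent]:
        """Components within `radius` meters of a point, e.g. where a
        homeowner wants to hang a picture."""
        return [c for c in self.components
                if (c.x - x) ** 2 + (c.y - y) ** 2 + (c.z - z) ** 2 <= radius ** 2]

catalog = BuildCatalog()
catalog.record(PlacedComponent("conduit", 2.1, 0.05, 1.2))
catalog.record(PlacedComponent("2x4 stud", 2.4, 0.05, 1.0))
print(catalog.near(2.0, 0.0, 1.1, radius=0.5))  # what is behind this patch of wall?
```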
Of course, by many metrics, the rise of automation doesn’t seem to be a problem because the economy is great. Unemployment is roughly 5 percent, which is generally considered full employment. The stock market is at all-time highs. We have nearly 80 consecutive months of private sector job growth.
However, not everyone is feeling the benefits equally, a fact that likely played a role in the most recent presidential election. We see this in other statistics, like flat median incomes and the fact that the labor force participation rate is under 63 percent, more than 3 percentage points below its value in January 2007.
The economy is strong for those who have skills that are in demand, skills that make them hard to outsource or to replace with machines, at least for now, but it is not strong for many others. This fact is visible in wages and unemployment levels broken down by level of education.
While the non-uniform economic recovery likely played a significant role in the election, that doesn’t mean that any of the proposals offered by any of the candidates will actually help. Shutting borders and reducing trade isn’t going to bring jobs back to the U.S., and nothing is going to stop the march of technological progress. Machines are going to replace more jobs, and the new jobs that are created are going to require higher-level skills, and likely more education, than the ones that are removed.
So what can we do? If we have passed, or are nearing, a tipping point where technology destroys more jobs than it creates, this poses a real problem for our current economic system. We are a consumption-driven economy, and you can only consume if you have income. Today, that income comes primarily from having a job. As the percentage of the population that isn’t able to participate in the workforce grows, it creates a headwind on consumption that will hurt the economy, even for those whose skills are in demand.
So how do we deal with this? The solution that I prefer is a Universal Basic Income, or UBI. The idea is to pay every adult U.S. citizen a modest amount every month. Unlike current welfare programs, a UBI isn’t degrading, because everyone gets it. Also unlike current welfare programs, a UBI doesn’t strongly discourage work, because it doesn’t scale down when you get a job. Means-tested benefits phase out as earnings rise, which acts like a steep tax on taking work; an unconditional payment has no such phase-out.
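A toy calculation makes the incentive difference clear. The sketch below uses made-up numbers, a hypothetical $1,000 monthly benefit with a 50 percent phase-out, rather than any real program’s parameters.

```python
# Toy comparison of take-home income under a means-tested benefit that
# phases out with earnings versus an unconditional UBI. All numbers are
# illustrative, not any actual program's parameters.

def means_tested_income(wages: float, benefit: float = 1000.0,
                        phase_out: float = 0.5) -> float:
    """Monthly income when the benefit shrinks $0.50 per $1 earned."""
    return wages + max(0.0, benefit - phase_out * wages)

def ubi_income(wages: float, ubi: float = 1000.0) -> float:
    """Monthly income when the benefit is paid regardless of earnings."""
    return wages + ubi

for wages in (0.0, 1000.0, 2000.0):
    print(f"wages={wages:7.0f}  means-tested={means_tested_income(wages):7.0f}  "
          f"ubi={ubi_income(wages):7.0f}")

# Under the means-tested program, earning the first $2,000 raises total
# income by only $1,000 (an effective 50 percent tax on working); under
# the UBI, every dollar earned is kept, so taking a job always pays.
```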
How big should it be, and how do you pay for it? Those are more challenging questions, but it is worth noting that a UBI can start off small and grow over time, and there are multiple models for paying for it, though discussing them would exceed the word limit of this piece. The key message that I want readers to take away is that we aren’t going to recreate the economy of the past no matter what we do. Instead, we need to look for ways to make the economy of the future one that works for everyone.
Mark Lewis is a professor of computer science. He’s also an avid roller skater.