While the number of available jobs is rising, workers' skills may not match what technology-driven jobs demand. Here's what societies could do to address the impact that automation and artificial intelligence may have on labor.
When Ken Jennings was defeated by IBM’s Watson on the TV show Jeopardy! some five and a half years ago, he remarked, half in jest, that he welcomed our new computer overlords. As someone with a degree in computer science, several million dollars in winnings and a still-considerable social media presence, he can perhaps afford to be sanguine. But what about the rest of us? Will advances in technology, automation and artificial intelligence threaten our jobs?
Some pundits certainly think so, and even some academics have weighed in with varying levels of alarm over how easily new technologies could automate away many of the tasks currently done by humans. It is undeniable that technology is changing the nature of many different kinds of work. While the effect of technology on middle-skill, blue-collar jobs—think automobile workers and robotic arms, or warehouse workers and Amazon’s Kiva robots—is often the focus of popular press articles and academic studies, technology is also reshaping high-paying and low-paying jobs. Lawyers, for example, increasingly use data analytics to mine important documents and case law in preparation for trials. Even food bank workers, in some cases, no longer just help clients with food security but use data analysis and mapping software to reach potential clients more proactively.
Automation thus can reduce the demand for some skills—nimbly fetching items from a shelf, bolting metal sheets together and navigating an airplane—while increasing the demand for others—analyzing data, communicating effectively with different audiences and understanding computer numerical control devices. Workers whose skills are in the first group tend to experience stagnant or falling wages or even job loss. Workers in the second group often see faster wage growth and more employment opportunities.
Technological change is not new, of course; it has been reshaping the demand for skills since at least the Industrial Revolution. The growing concern, however, is that this time is different: the pace of change is accelerating, perhaps faster than we as a society can adapt, and as a result many of us will be jobless. This anxiety is not entirely unfounded, but it is misdirected.
In the aggregate, the number of job openings has actually been steadily rising, not falling, and businesses perennially complain that they cannot find enough qualified job applicants. Firms clearly still want and intend to hire people, and not just machines, to do work. Artificial intelligence and robots are not going to replace human labor entirely—or even mostly—any time soon.
However, there is ample reason to worry that a large number of workers do not have—and are not learning—the skills that employers demand for decently paid jobs. Some argue this is because schools struggle to adapt quickly enough to teach the skills businesses want. Others argue it is because businesses are less likely to provide training than in the past, instead expecting many new employees to hit the ground running. Both arguments contain truth, but they sidestep the deeper policy problem: employers and schools have already begun working more closely to align skill supply and demand for new workers.
Rather, the more pressing issue is how to help incumbent workers whose skills are not keeping pace with employer demand, putting them at risk of losing their jobs. This risk has hit manufacturing workers hard over the past decade and a half, as well as office support workers, and—as computers continue to grow more sophisticated—it may reach many occupations that typically require a bachelor’s degree.
Compounding the concern, recent research has shown that the risk of technology-driven job loss is not gradual and steady but sharp and episodic, concentrated during and after recessions. Firms use these times, when demand for their products is relatively low, to replace their physical capital and to upgrade their demand for skills, substituting newer workers who have in-demand skills for displaced workers whose skills have become obsolete. As a result, many workers who lose their jobs struggle to regain employment because their skills are no longer in demand, contributing to jobless recoveries even as the economy improves and straining employment service agencies.
One possible solution—already practiced in some of the highest-skilled occupations—is lifelong continuing education, with periodic refresher training to keep skills up to date. Already routine among lawyers and doctors, who generally receive such training through their employers or are paid enough to afford it on their own, continuing education will likely become a necessary component of job security for a much greater swath of the workforce in the near future. While some businesses have begun expecting their workers to handle this retraining on their own, many workers—especially those in middle- and lower-paying jobs—will need help. Unfortunately, neither business nor government has taken much initiative to help retrain workers before it’s too late, and that is a shame.
Brad Hershbein is an economist at the W.E. Upjohn Institute for Employment Research. The opinions expressed in this article are his own and do not necessarily reflect the views of the Upjohn Institute.