We find that the long historical record of innovation shows technological change to have been overwhelmingly positive for productivity and surprisingly benign for employment. Job displacement has occurred in waves: first with the structural shift from agriculture to manufacturing, then with the move from manufacturing to services. Throughout, the productivity gains generated by new technology have been reinvested, and the resulting growth in GDP has eventually raised consumption and, on balance, increased the demand for labor.
In McKinsey’s research, we have sought to quantify this trend and its net impact in several ways. For example, we looked at the very big picture: what happened to employment as productivity grew over the past 50 years. We found no tradeoff between the two. In every rolling 10-year period in the United States, technologically driven productivity gains have gone hand in hand with rising employment. Even in one-year snapshots, productivity and employment have risen together four years in five.
The personal computer provides an iconic example of technological progress coexisting with job growth. We measured how many jobs were lost and how many were gained in the United States between 1980 and 2015 as a result of the diffusion of computer technology. Several hundred thousand bookkeepers, auditing clerks, secretaries, and typists did lose their jobs. But the overall balance was strongly positive: the desktop and laptop computer created more than 19 million jobs in industries ranging from computer hardware to enterprise software to online retail, against the loss of about 3.5 million jobs, for a net gain of 15.7 million. That figure amounts to 18 percent of all net U.S. employment created in the period, almost one in five jobs.
Something similar happened with the invention of the automobile, which destroyed the horse and buggy business but gave rise to a wealth of new, sometimes unimagined products, from petroleum-based synthetic rubber to motels.
That said, we ignore those left behind along the way at our peril. For the many workers who are displaced, painful transitions are a reality. In Britain during the industrial revolution, beginning around 1800, wages for the masses stagnated for almost half a century despite a strong surge in productivity that enriched the owners of capital, a phenomenon often referred to as “Engels’ pause” because it was first noted by Friedrich Engels, the German philosopher who co-authored The Communist Manifesto with Karl Marx.
Wages picked up again as the fruits of productivity gains eventually trickled down to workers. But “eventually” can be a long time: the Luddites’ fears were not entirely unfounded. And once again today, the link between productivity gains and wage growth seems bent if not broken. U.S. median hourly compensation rose only 11 percent from 1973 to 2016, even as hourly labor productivity grew by 75 percent.
While it is natural to focus on jobs displaced by technology, one element often overlooked in discussions of the future of work is the number of jobs that are neither lost nor gained but fundamentally changed, in ways that are hard to predict. History offers many such examples.
The challenges here for policymakers and business leaders worldwide are significant. We will need to retrain workers on a scale not seen for generations. We will need to rethink and adapt our social systems to help the hundreds of millions of workers affected by the new technologies, ideally handling the transition better than we handled globalization by providing both new forms of income support and job-transition assistance. We will need to find ways to tame the wilder, unpredictable side of AI. For their part, CEOs will need to rethink their organizational structures and processes to capture the performance-enhancing effects of these technologies, or risk ending up in the recycle bin of business history.
Given the scale of the occupational and skill shifts we see coming in the next decade, much more will need to be done, and not just by business. Governments and educational institutions, working together with philanthropic foundations, will of necessity be part of the mix. Foundations have the advantage of being able to pilot innovative programs, and, unlike companies, they need not worry that workers who gain new skills will subsequently jump ship for a new employer.
As a starter, governments will need to reverse the 20-plus-year decline in spending on labor-related services — no small task in light of tight government budgets and, in the United States, reflexive opposition to government intervention. They will also need to be more flexible in order to adapt to the rapidly changing workplace.
Mobility is one key, yet labor markets have become markedly less dynamic in recent years. In the United States, for example, job reallocation rates have declined by 25 percent since 1990. This is the time to test ideas such as “portable” benefits that are owned by workers rather than tied to particular jobs or companies. Technology itself can facilitate greater mobility, especially through AI-powered digital platforms that match people to jobs.
But we also need to rethink some basic assumptions. For changing jobs across sectors to become common practice, companies will need to agree on definitions of, and qualifications for, specific skills. And a major debate about credentialing is overdue. Today, higher education encourages the world to judge graduates by their subject knowledge rather than by their skills in problem-solving and creative thinking, skills that will be especially valuable in an AI-enhanced workplace.
More than anything, we need to prepare for this new era in which humans must work alongside smart machines far more intensively.