From the steam engine and mechanization of the Industrial Revolution to the rise of the knowledge- and service-based economy of the late 20th and early 21st centuries, increasingly defined by Information and Communication Technology (ICT), this question has been debated.
Important advances in technology are necessary for industrial and developing economies alike to sustain continued growth. It's also indisputable that such innovations invariably lead to some degree of job loss, displacement and obsolescence. However, that doesn't necessarily mean the "permanent" loss of available work and workers. Rather, it entails the retraining of workers and the reallocation of resources in a changing economy, and ultimately comes down to how well the workforce and industry are able to adapt to technological innovation.
That is the secret sauce that can (and does) prevent massive unemployment when new technologies or processes threaten the status quo. What has become evident over time is that national (indeed, global) conversations about vanishing jobs are more prevalent on the heels of a major financial crisis when unemployment rates spike to alarming highs – as happened most recently in the U.S. during and after the financial crisis and ensuing Great Recession.
Invasion of the “Body Snatchers”
Ever more sophisticated robots, driven by rapid advances in robotics, are often depicted as the new American worker.
From factory floors to operating rooms and households where they do domestic chores, robots seem at last to be realizing some of the futuristic fantasies imagined in "The Jetsons." And a robot doing the same job as a human employee represents a one-time cost (maintenance aside), wouldn't require health insurance or a retirement plan, and could work a 24-hour shift, according to a recent article from The Washington Post.
At the same time, today's robots can only assist an actual surgeon with microsurgery procedures, and still have to be programmed for the assembly line and monitored by a human supervisor. In other words, the growth of robotics promises to create a whole new industry dedicated to designing, building and maintaining automated workers, much as the automobile industry did in its time. And robots haven't (as yet) learned to reason the way only human brains can.
It is important to remember that technological innovations have inspired fear of job replacement for centuries. In early-19th-century England, mechanical looms were sabotaged by mill workers who feared they would be replaced, purportedly led by a man named "Ludd," which gave the lexicon the word "Luddite": someone who resists new technology or trends.
When post-war railroad and passenger-ship travel rapidly gave way to jet aircraft, which revolutionized travel and shipping processes while drastically reducing costs, railroad and shipping workers were no doubt put out of work in droves. At the same time, those who were able to transfer their skills or be retrained became part of an aviation industry that today provides jobs for hundreds of thousands of full- and part-time employees.
Similarly, in our knowledge-based economy, digitization isn’t yet replacing human workers so much as making their jobs easier while creating a demand for a workforce that’s more technically savvy.
Back to the Future
Much as the recent economic downturn led to the loss of millions of jobs in financial services (some of them permanently) and many other industries negatively impacted by the Great Recession, similar fears of displacement of workers by technology occurred a half-century ago.
A serious recession early in President John F. Kennedy's administration resulted in a sharp spike in unemployment. It was an era when technology was exploding, and the President himself called for a moon landing by the end of the decade, a goal that was in fact realized in 1969.
Many economists and observers at the time blamed the massive job losses on technological advances and automation, a claim that proved to be unfounded. On his website, economist Timothy Taylor revisits that period of history, when those fears prompted President Johnson to establish a "National Commission on Technology, Automation, and Economic Progress" to examine how the U.S. could adapt to the changing workplace.
Even though unemployment had returned to "full employment" levels (below 4%) by the time the commission published its findings in 1966, the report remains relevant to how our economy responds to innovation. It highlighted that the persistent unemployment of the post-Korean War Eisenhower years was due more to increased productivity, growth in the labor force, and inadequate demand than to a technological revolution.
Sound familiar? That report has striking echoes for today's economy, where the recessionary level of unemployment (near 10%, and higher in some parts of the country) is now back below 6%, with 11.5 million jobs created since 2010. Do all of those jobs simply represent the reinstatement of unemployed workers, some of whom chose to retire or dropped out of the workforce in sheer exasperation after searching unsuccessfully for several years? Of course not. Some employees were permanently displaced, unable to compete with younger, better-educated workers willing to work for lower salaries.
Interesting, too, are that commission’s recommendations on how the U.S. could best cope with an increasingly automated society, some of which are still very much at the fore of political and economic debate in 2015. Among these are:
- A “…floor under family income,” as the debate over raising the national minimum wage continues;
- “…14 years of free public education” (compare President Obama’s recent proposal to make community college free and available to all students);
- A “…program of public service employment… providing work for ‘the hard-core unemployed’…” (the ongoing clarion call for the federal government to fund badly needed infrastructure redevelopment and public works projects, especially in the early days of the Recession).
There are, of course, many additional ideas. According to Taylor, the report called for changes in public policy and actions by the government that, if they weren't realistic then, seem even less imaginable today than robots delivering the mail or serving coffee as baristas at Starbucks.
It’s a Small World, After All
Many of the technological innovations introduced to America at the 1964 World's Fair, concurrent with President Johnson's technology commission, may seem as quaint and outmoded in the 21st century as the commission itself, even though both events looked to the future and offered ideas that are still relevant to our lives. What also persists 50 years on is the real fear that shudders through the workplace when major changes are introduced and jobs are eliminated. A recent post on The Washington Post's blog puts it succinctly:
“Waves of technological advances have always left losers — people whose factories moved or shut; or whose skills became obsolete; or whose firms succumbed to new competition. Often, the new jobs aren’t where the old ones were and aren’t suitable for their workers.”
As the post's author indicates, if history is any guide, 21st-century innovation in our digital, knowledge-based economy will inevitably mean the loss, reassignment or rethinking of some jobs and skills, much as all technological innovations have. The real game-changer would come only if the U.S. economy and workforce were somehow unable to adapt, reeducate workers and reallocate resources where they are most needed.
In an increasingly global economy, it’s no longer a question of just protecting American jobs as developing economies challenge us; it’s about becoming more educated, more highly skilled, and more competitive overall to remain an industrial and ideological power.
Perhaps that’s ultimately the American “secret sauce”: our ability to accept change and reinvent ourselves and our workforce, again and again.