1. Youngstown, U.S.A.
The end of work is still just a futuristic concept for most of the United States, but it is something like a moment in history for Youngstown, Ohio, one its residents can cite with precision: September 19, 1977.
For much of the 20th century, Youngstown’s steel mills delivered such great prosperity that the city was a model of the American dream, boasting a median income and a homeownership rate that were among the nation’s highest. But as manufacturing shifted abroad after World War II, Youngstown steel suffered, and on that gray September afternoon in 1977, Youngstown Sheet and Tube announced the shuttering of its Campbell Works mill. Within five years, the city lost 50,000 jobs and $1.3 billion in manufacturing wages. The effect was so severe that a term was coined to describe the fallout: regional depression.
Youngstown was transformed not only by an economic disruption but also by a psychological and cultural breakdown. Depression, spousal abuse, and suicide all became much more prevalent; the caseload of the area’s mental-health center tripled within a decade. The city built four prisons in the mid-1990s—a rare growth industry. One of the few downtown construction projects of that period was a museum dedicated to the defunct steel industry.
This winter, I traveled to Ohio to consider what would happen if technology permanently replaced a great deal of human work. I wasn’t seeking a tour of our automated future. I went because Youngstown has become a national metaphor for the decline of labor, a place where the middle class of the 20th century has become a museum exhibit.
“Youngstown’s story is America’s story, because it shows that when jobs go away, the cultural cohesion of a place is destroyed,” says John Russo, a professor of labor studies at Youngstown State University. “The cultural breakdown matters even more than the economic breakdown.”
In the past few years, even as the United States has pulled itself partway out of the jobs hole created by the Great Recession, some economists and technologists have warned that the economy is near a tipping point. When they peer deeply into labor-market data, they see troubling signs, masked for now by a cyclical recovery. And when they look up from their spreadsheets, they see automation high and low—robots in the operating room and behind the fast-food counter. They imagine self-driving cars snaking through the streets and Amazon drones dotting the sky, replacing millions of drivers, warehouse stockers, and retail workers. They observe that the capabilities of machines—already formidable—continue to expand exponentially, while our own remain the same. And they wonder: Is any job truly safe?
Futurists and science-fiction writers have at times looked forward to machines’ workplace takeover with a kind of giddy excitement, imagining the banishment of drudgery and its replacement by expansive leisure and almost limitless personal freedom. And make no mistake: if the capabilities of computers continue to multiply while the price of computing continues to decline, that will mean a great many of life’s necessities and luxuries will become ever cheaper, and it will mean great wealth—at least when aggregated up to the level of the national economy. But even leaving aside questions of how to distribute that wealth, the widespread disappearance of work would usher in a social transformation unlike any we’ve seen. If John Russo is right, then saving work is more important than saving any particular job. Industriousness has served as America’s unofficial religion since its founding. The sanctity and preeminence of work lie at the heart of the country’s politics, economics, and social interactions. What might happen if work goes away?
The U.S. labor force has been shaped by millennia of technological progress. Agricultural technology birthed the farming industry, the industrial revolution moved people into factories, and then globalization and automation moved them back out, giving rise to a nation of services. But throughout these reshufflings, the total number of jobs has always increased. What may be looming is something different: an era of technological unemployment, in which computer scientists and software engineers essentially invent us out of work, and the total number of jobs declines steadily and permanently.
This fear is not new. The hope that machines might free us from toil has always been intertwined with the fear that they will rob us of our agency. In the midst of the Great Depression, the economist John Maynard Keynes forecast that technological progress might allow a 15-hour workweek, and abundant leisure, by 2030. But around the same time, President Herbert Hoover received a letter warning that industrial technology was a “Frankenstein monster” that threatened to upend manufacturing, “devouring our civilization.” (The letter came from the mayor of Palo Alto, of all places.) In 1962, President John F. Kennedy said, “If men have the talent to invent new machines that put men out of work, they have the talent to put those men back to work.” But two years later, a committee of scientists and social activists sent an open letter to President Lyndon B. Johnson arguing that “the cybernation revolution” would create “a separate nation of the poor, the unskilled, the jobless,” who would be unable either to find work or to afford life’s necessities.
The job market defied doomsayers in those earlier times, and according to the most frequently reported jobs numbers, it has so far done the same in our own time. Unemployment is currently just over 5 percent, and 2014 was this century’s best year for job growth. One could be forgiven for saying that recent predictions about technological job displacement are merely forming the latest chapter in a long story called The Boys Who Cried Robot—one in which the robot, unlike the wolf, never arrives in the end.
The end-of-work argument has often been dismissed as the “Luddite fallacy,” an allusion to the 19th-century British brutes who smashed textile-making machines at the dawn of the industrial revolution, fearing the machines would put hand weavers out of work. But some of the most sober economists are beginning to worry that the Luddites weren’t wrong, just premature. When former Treasury Secretary Lawrence Summers was an MIT undergraduate in the early 1970s, many economists disdained “the stupid people [who] thought that automation was going to make all the jobs go away,” he said at the National Bureau of Economic Research Summer Institute in July 2013. “Until a few years ago, I didn’t think this was a very complicated subject: the Luddites were wrong, and the believers in technology and technological progress were right. I’m not so completely certain now.”
2. Reasons to Cry Robot
What does the “end of work” mean, exactly? It does not mean the imminence of total unemployment, nor is the United States remotely likely to face, say, 30 or 50 percent unemployment within the next decade. Rather, technology could exert a slow but continual downward pressure on the value and availability of work—that is, on wages and on the share of prime-age workers with full-time jobs. Eventually, by degrees, that could create a new normal, where the expectation that work will be a central feature of adult life dissipates for a significant portion of society.
After 300 years of people crying wolf, there are now three broad reasons to take seriously the argument that the beast is at the door: the ongoing triumph of capital over labor, the quiet demise of the working man, and the impressive dexterity of information technology.
• Labor’s losses. One of the first things we might expect to see in a period of technological displacement is the diminishment of human labor as a driver of economic growth. In fact, signs that this is happening have been present for quite some time. The share of U.S. economic output that’s paid out in wages fell steadily in the 1980s, reversed some of its losses in the ’90s, and then continued falling after 2000, accelerating during the Great Recession. It now stands at its lowest level since the government started keeping track in the mid-20th century.
A number of theories have been advanced to explain this phenomenon, including globalization and its accompanying loss of bargaining power for some workers. But Loukas Karabarbounis and Brent Neiman, economists at the University of Chicago, have estimated that almost half of the decline is the result of businesses’ replacing workers with computers and software. In 1964, the nation’s most valuable company, AT&T, was worth $267 billion in today’s dollars and employed 758,611 people. Today’s telecommunications giant, Google, is worth $370 billion but has only about 55,000 employees—less than a tenth the size of AT&T’s workforce in its heyday.
• The spread of nonworking men and underemployed youth. The share of prime-age Americans (25 to 54 years old) who are working has been trending down since 2000. Among men, the decline began even earlier: the share of prime-age men who are neither working nor looking for work has doubled since the late 1970s, and has increased as much throughout the recovery as it did during the Great Recession itself. All in all, about one in six prime-age men today are either unemployed or out of the workforce altogether. This is what the economist Tyler Cowen calls “the key statistic” for understanding the spreading rot in the American workforce. Conventional wisdom has long held that under normal economic conditions, men in this age group—at the peak of their abilities and less likely than women to be primary caregivers for children—should almost all be working. Yet fewer and fewer are.
Economists cannot say for certain why men are turning away from work, but one explanation is that technological change has helped eliminate the jobs for which many are best suited. Since 2000, the number of manufacturing jobs has fallen by almost 5 million, or about 30 percent.
Young people just coming onto the job market are also struggling—and by many measures have been for years. Six years into the recovery, the share of recent college grads who are “underemployed” (in jobs that historically haven’t required a degree) is still higher than it was in 2007—or, for that matter, 2000. And the supply of these “non-college jobs” is shifting away from high-paying occupations, such as electrician, toward low-wage service jobs, such as waiter. More people are pursuing higher education, but the real wages of recent college graduates have fallen by 7.7 percent since 2000. In the biggest picture, the job market appears to be requiring more and more preparation for a lower and lower starting wage. The distorting effect of the Great Recession should make us cautious about overinterpreting these trends, but most began before the recession, and they do not seem to speak encouragingly about the future of work.
• The shrewdness of software. One common objection to the idea that technology will permanently displace huge numbers of workers is that new gadgets, like self-checkout kiosks at drugstores, have failed to fully displace their human counterparts, like cashiers. But employers typically take years to embrace new machines at the expense of workers. The robotics revolution began in factories in the 1960s and ’70s, but manufacturing employment kept rising until 1980, and then collapsed during the subsequent recessions. Likewise, “the personal computer existed in the ’80s,” says Henry Siu, an economist at the University of British Columbia, “but you don’t see any effect on office and administrative-support jobs until the 1990s, and then suddenly, in the last recession, it’s huge. So today you’ve got checkout screens and the promise of driverless cars, flying drones, and little warehouse robots. We know that these tasks can be done by machines rather than people. But we may not see the effect until the next recession, or the recession after that.”
Some observers say our humanity is a moat that machines cannot cross. They believe people’s capacity for compassion, deep understanding, and creativity are inimitable. But as Erik Brynjolfsson and Andrew McAfee have argued in their book The Second Machine Age, computers are so dexterous that predicting their application 10 years from now is almost impossible. Who could have guessed in 2005, two years before the iPhone was released, that smartphones would threaten hotel jobs within the decade, by helping homeowners rent out their apartments and houses to strangers on Airbnb? Or that the company behind the most popular search engine would design a self-driving car that could soon threaten driving, the most common occupation among American men?
In 2013, Oxford University researchers forecast that machines might be able to perform half of all U.S. jobs in the next two decades. The projection was audacious, but in at least a few cases, it probably didn’t go far enough. For example, the authors named psychologist as one of the occupations least likely to be “computerisable.” But some research suggests that people are more honest in therapy sessions when they believe they are confessing their troubles to a computer, because a machine can’t pass moral judgment. Google and WebMD already may be answering questions once reserved for one’s therapist. This doesn’t prove that psychologists are going the way of the textile worker. Rather, it shows how easily computers can encroach on areas previously considered “for humans only.”
After 300 years of breathtaking innovation, people aren’t massively unemployed or indentured by machines. But to suggest how this could change, some economists have pointed to the defunct career of the second most-important species in U.S. economic history: the horse. For many centuries, people created technologies that made the horse more productive and more valuable—like plows for agriculture and swords for battle. One might have assumed that the continuing advance of complementary technologies would make the animal ever more essential to farming and fighting, historically perhaps the two most consequential human activities. Instead came inventions that made the horse obsolete—the tractor, the car, and the tank. After tractors rolled onto American farms in the early 20th century, the population of horses and mules began to decline steeply, falling nearly 50 percent by the 1930s and 90 percent by the 1950s.
Humans can do much more than trot, carry, and pull. But the skills required in most offices hardly elicit our full range of intelligence. Most jobs are still boring, repetitive, and easily learned. The most common occupations in the United States are retail salesperson, cashier, food and beverage server, and office clerk. Together, these four jobs employ 15.4 million people—nearly 10 percent of the labor force, or more workers than there are in Texas and Massachusetts combined. Each is highly susceptible to automation, according to the Oxford study.
Technology creates some jobs too, but the creative half of creative destruction is easily overstated. Nine out of 10 workers today are in occupations that existed 100 years ago, and just 5 percent of the jobs generated between 1993 and 2013 came from “high-tech” sectors like computing, software, and telecommunications. Our newest industries tend to be the most labor-efficient: they just don’t require many people. It is for precisely this reason that the economic historian Robert Skidelsky, comparing the exponential growth in computing power with the less-than-exponential growth in job complexity, has said, “Sooner or later, we will run out of jobs.”
July–August 2015