We need to prepare for a new world of work before it’s too late
Several major reports came out during the past couple of months, notably a report from the Georgetown Center on Education and the Workforce, the OECD's 2019 Skills Outlook: Thriving in a Digital World, and the College Board's new initiatives to account for structural disadvantages. Taken together, they should be the tsunami warning bell that I have been trying to ring for more than a decade. The American dream has been on life support for a generation, and without intervention it may be unrecoverable.
We are at the edge of the chasm between the Third and Fourth Industrial Revolutions. In the Third Industrial Revolution, we needed workers trained in technical skills to build out the infrastructure for the Fourth. We lunged at STEM skills and focused on education that we could prove through standardized testing. In this era, the higher-education-to-industry factory pipeline worked for people with the required degrees. We pushed more and more people through that pipeline, with a focus on monolithic degrees at increasing cost. Meanwhile, we failed to pay attention to the ways in which technology was supplanting workers: fewer workers could produce more. Those without higher levels of education, notably men with only a high school diploma, dropped out of the workforce, many of them into mental health crises, and there have been more deaths of despair.
The Wall Street Journal just published an article entitled “Families Go Deep in Debt to Stay in the Middle Class.” Cars, college, houses, and healthcare have become considerably more expensive, with consumer debt, not counting mortgages, currently at $4 trillion—higher than it has ever been, even after adjusting for inflation. Simultaneously, incomes have been stagnant for two decades. Pew Research reports that between 1971 and 2011 we lost about 10% of the middle class.
I believe that if we act immediately, we may be able to resuscitate the American dream. The challenge is that what worked in past industrial revolutions will not work in the next one. We must think differently, both to prepare the next generation for the future of work and to triage current workers to determine the best courses of intervention to help them adapt to a work world for which they have not been prepared.
Origin of the Term “American Dream” and Building the Middle Class
In 1931, author James Truslow Adams coined the term “American dream” in his book The Epic of America, in which he wrote that “life should be better and richer and fuller for everyone, with opportunity for each according to ability or achievement” regardless of social class or circumstances of birth. We have long taken this to mean that the American dream is doing better than one’s parents. The American dream is securing a stable place in the middle class or higher. The American dream is home ownership or some other marker of stability. Relatively speaking, we have not had a middle class for all that long. The Second Industrial Revolution and the post-WW2 boom built our middle class as mass manufacturing changed the way we lived our lives and launched our consumer society. In that period, we also created work as a concept separate from home, and the fixed occupational identity was born.
Loss of Social Mobility
Recent research by Raj Chetty has found that if you were born in 1940, you had a 90% chance of doing better than your parents; if you were born in 1984, that chance dropped to 50%. While the 1940 cohort is perhaps unusual in that it enjoyed an unprecedented post-WW2 economic boom, even the 1950s cohort showed higher rates of mobility (see chart). The Organisation for Economic Co-operation and Development (OECD) recently released a report revealing that the middle class is shrinking across all developed countries, but even more so in the United States. While we have lifted developing nations out of poverty and established their middle classes, ours has eroded.
Rising Income Inequality in the United States; Declining Income Inequality Globally
The US economy is in new territory, and while most celebrate the new stock market milestones and the lowest unemployment rate in fifty years, a closer look reveals a more textured picture. Not everyone is doing well. According to research by Stanford Graduate School of Business professor and economist Paul Oyer, “Times are good if you are college educated and working in the right industries in the right locations. But the last 50 years have been terrible for people with lower skills. Adjusted for inflation, the average earnings of a man who didn’t go to college is lower now than it was 50 years ago. That’s unheard of.” And more specifically, Professor Oyer notes that we reduced income inequality globally while increasing it in the United States. “The low-wage jobs that left here are considered really good jobs in China. [Globally] we’ve lifted a billion, two billion people out of poverty over the past 30 years,” Oyer wrote. Erik Brynjolfsson and Andrew McAfee, faculty members at the MIT Sloan School of Management, have studied the relationship between technology and the economy for years. Their efforts culminated in a seminal book, The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies, in which they chronicle how digital technologies are contributing to the stagnation of wages. Specifically, Brynjolfsson explains in a 2015 Harvard Business Review interview, “In the 1980s, however, the growth in median income began to sputter. In the past 15 years it’s turned negative; once you adjust for inflation, an American household at the 50th percentile of income distribution earns less today than it did in 1998, even after accounting for changes in household size. Job growth in the private sector has also slowed—and not just because of the 2008 recession. Job gains were anemic throughout the 2000s, even when the economy was expanding. This phenomenon is what we call the Great Decoupling.
The two halves of the cycle of prosperity are no longer married: Economic abundance, as exemplified by GDP and productivity, has remained on an upward trajectory, but the income and job prospects for typical workers have faltered.” Indeed, according to the Economic Policy Institute, from 1945 to 1973 the top 1% captured 4.9% of all income growth, while from 1973 to 2007, more than 58% of income growth went to the top 1%, and since the global financial crisis the top 1%’s share of income growth has held steady at more than 40%.
Socioeconomic Status Is Now Not Earned but Rather Inherited
According to the Georgetown Center on Education and the Workforce report Born to Win, Schooled to Lose, “[an advantaged] kindergartener with test scores in the bottom half has a 7 in 10 chance of reaching high socioeconomic status among his or her peers as a young adult, while a disadvantaged kindergartener with top-half test scores only has a 3 in 10 chance.” Students who fall behind stay behind because “advantaged students have safety nets to keep them on track. Because less-advantaged peers do not, they are more likely to fall behind and stay behind. Among children who show similar academic potential in kindergarten, the test scores of economically disadvantaged students are more likely to decline and stay low during elementary, middle, and high school than the test scores of their high-SES peers.” Students with lower socioeconomic status (SES) complete higher education at much lower rates than their higher-SES peers. Social mobility is lost, and with it a tremendous amount of human potential. Professor Oyer weighs in on this as well: “The big payoffs are early interventions. Getting kids into preschool, having them not fall behind. There’s a lot of evidence that by third grade, some kids are far behind and never catch up. For a lot of children, when they’re 8 years old, it’s too late. If your parents aren’t engaged in the education system, in 10 years you’ll be competing in the labor market against people whose parents sent them to violin lessons and summer programs in the Dominican Republic. There’s inequality of opportunity.”
Massification of Higher Education: The Monolithic Degree
As we moved from the Second to the Third Industrial Revolution, we suddenly needed a skilled and trained labor force educated beyond high school. Simultaneously, we began the long task of creating a more perfect union through the women’s movement, the civil rights movement, and the GI Bill, which stimulated the supply side of the now more diverse and more educated labor force. In this phase, from 1960 to 1990, we doubled the number of higher education institutions in the United States. Following this expansion, everyone jumped onto the gravy train: magazines that offered university rankings; textbook publishers that could increase prices at double the rate of inflation; institutions of higher education that could raise tuition above inflation because student loan solutions emerged, pushing the pain down the road. The return on investment for higher education justified the costs until, somewhere along the way, that calculus came into question. During the massification period, higher education became myopically focused on monolithic degrees. “Pick a good major” was advice that, when followed, guaranteed a good, stable ride up the career escalator. Universities trained students in a single skill set for a single industry, and if graduates attained a decent starting salary, their schools could declare success. This all worked until somewhere between the dot-com bust and the global financial crisis, when the assets leveraged to fund that future salary depreciated and, simultaneously, technological capability extended the leverage of the knowledge worker such that fewer single-skill-set workers were needed. And now we have unmet needs for a highly educated workforce in fields like data analytics and cybersecurity, yet a huge number of students are graduating with debt and degrees they cannot monetize because the skills they were taught are outdated and irrelevant.
Further, we have focused almost exclusively on training people in technical skills, with the false promise that they could leverage that training for their entire career rather than treating it as the starting point in a long arc of lifelong learning. Additionally, the greatest skills gaps today are not technical but social, and we are not developing the uniquely human skills we need in the workforce. Research by David Deming found that jobs requiring high math skills and low social skills have been declining for decades, while jobs requiring high social skills and low math skills have been increasing in an almost inverse pattern. Our systems of education and development are not attuned to this need at all.
K-12 as Workforce Prep = Disengagement
In our efforts to improve education in this country, we may be shooting ourselves in the foot. We are myopically focused on proving learning, but that which can be proven is also that which is easy to automate. Our rush to push children through summative testing (by teaching to the test) is rapidly diminishing their engagement at a time when we most need young people to adopt habits that will enable lifelong learning. Gallup studies engagement, and it has found that as children progress through primary school, middle school, and high school, they become increasingly disengaged. This is a profound loss of human potential.
Potential Impact of the Fourth Industrial Revolution
According to Klaus Schwab of the World Economic Forum, we have now entered the Fourth Industrial Revolution, which is marked by the merging of biological, cyber, and physical systems. The First Industrial Revolution was fueled by the steam engine; the Second ran on electrification, mass production, and the division of labor; and the Third was marked by computerization and the automation of physical labor. Our systems of learning were formed to train farm workers to be factory workers in the First and Second Industrial Revolutions and have changed little since. The past two or three Industrial Revolutions required a trained and deployable workforce, and for them we built systems of learning that would codify and transfer existing knowledge and preselected skills. Through a combination of globalization and technological change, products, services, and business models are rapidly changing, requiring new skills, new talents, and, many conclude, new workers. The fill-and-spill model of hiring for past skills and experience is becoming truly Sisyphean, since many of the skills we need are just emerging, and we don’t have the luxury of the seven to ten years needed to develop training and transfer those skills and knowledge to a new set of workers. The Fourth Industrial Revolution may be most notable for the (predicted) waves of technological unemployment among knowledge workers, because anything mentally routine or predictable may soon be achievable by an algorithm. In this reality, the value of stored knowledge may diminish in favor of the ability to work in flows of information, form new knowledge, and create the most meaning from processes handled by technology. Computers are very good at answering specific questions and, thus far, very bad at deciphering intent, inferring meaning, and offering judgment.
It seems, then, that it is no mistake that almost every work-skills-of-the-future list is focused on nontechnical and uniquely human skills (sometimes called soft skills). These are exactly the skills left out of our current education-to-work factory pipeline, where we have favored lunging at rapidly expiring technology skills and focusing on only that which we can measure—which is also exactly that which we can automate.
Dunning-Kruger Effect Meets the Fourth Industrial Revolution
Often the people most vulnerable to being displaced (rendered unemployed) by technology are the least aware of their exposure. This brings to mind the famed Dunning-Kruger effect, described by then-Cornell psychologists David Dunning and Justin Kruger: the less competent you are, the less able you are to accurately assess your own abilities. This seems applicable to our current challenging shift in the workforce. We need to simultaneously up-skill our people to a baseline of digital fluency, orient them toward lifelong learning and adaptation, and encourage them to develop their uniquely human skills (soft skills), such as communication, creativity, empathy, and judgment. A study by Swinburne University’s Centre for the New Workforce in Australia found, “The more an industry is disrupted by digital technologies, the more that workers in those industries value uniquely human ‘social competencies.’” These are the exact skills we are not paying attention to right now in either education or the workforce. From collaboration, empathy, and social skills to entrepreneurial skills, these social competencies are less vulnerable to being displaced by AI and automation and thus more likely to enhance a worker’s resilience to the changing nature of the market.
A study commissioned by the freelancing platform Upwork, “Freelancing in America: 2017,” found that 65% of freelancers are updating their skills, but only 45% of full-time employees are doing the same. So those most affected by the winds of change, and those most in need of updated skills to do their daily jobs, or gigs, are, at the moment, the most prepared. This leads me to believe that a tremendous amount of work remains to prepare the rest of the workforce for the necessary changes, particularly since, according to McKinsey’s report Digital America: A Tale of the Haves and Have-Mores, we have digitized less than 20% of our economy. MIT Technology Review created a chart tracking all the predictions of job loss and change brought on by advances in technology, and the labor predictive analytics platform Faethm created a graphic of this data (below).
Enter Longevity and Accelerated Change
While income inequality is widening, social mobility is stalled, socioeconomic status is largely inherited, and we face intense pressure from accelerating technological change that will upend jobs and reshape work, there is one more change requiring adaptation: we are living longer—a lot longer. The last few decades have brought the most significant leaps in human longevity, and longer lives mean longer working careers that span more cycles of change.
My Thesis: Work to Learn
The future of work is both learning and adapting. Learning, framed as education, has long focused on “becoming educated,” which meant acquiring predetermined skills and existing knowledge such that students were able to secure a job and step on the career escalator. Or, said differently, we learned, once, in order to work. Now we are moving into a world of work, notably as we cross into the Fourth Industrial Revolution, where anything mentally or physically routine or predictable may be better achieved by an algorithm or some form of physical or knowledge-based automation. In this reality, my colleague Chris Shipley and I believe we need to learn how to learn so that we can “work to learn” and adapt for the rest of our—now much longer—careers. While it has long been believed that learning is best for the young because of brain plasticity, new research suggests that while some capabilities, such as fluid intelligence, peak in youth, other abilities, notably those required for the top skills of the future, peak after the age of 40.
Rebuilding the American Dream
A number of factors contributed to where we are today, but the question is, where are we going? I am not a policy expert, so I will leave the answers to others more qualified, but I do think we are framing our challenges with the rear-view mirror rather than the windshield. Somewhere in the Third Industrial Revolution, humans shifted from assets to develop to costs to contain, and in that period we dramatically underinvested in human capital. In that same period, higher education, for those lucky enough to have access to it, became “pick a good major, get a good job, and ride the career escalator up.” That promise, articulated in that way, is gone. At the same time, we became myopically focused on wealth preservation: keeping, in the present, as much as possible of the value created in the past. This has been called the shareholder value era. We focused on extracting and preserving value rather than creating it. Those with the right degrees and/or the right skills enjoyed returns. Those without the right skills, or without access to them, stagnated at best. Across the board, we stopped investing in humans. We allowed structural barriers to human potential to emerge. We stopped taking longer strides.
At this inflection point, we need our dreams to focus not right in front of us but on the horizon; in other words, we need generation-long or multi-generation-long dreams. We need to focus not on how we preserve wealth but on how we invest in the next generations to create the taxpayers of the future. Where do we invest, and how do we dream, if our goal becomes maximizing human potential? For this, I truly believe, the future of work is learning, which, to be realized, requires, among other things, early interventions, universal preschool, and systems that support lifelong learning, because evidence suggests that various cognitive capacities peak throughout the whole, now much longer, arc of our lives. This is a new deal.
This new deal will require corporate and government investment in lifelong learning and changes in our individual expectations. To learn is now everyone’s individual responsibility. To do anything less would be a loss of human potential and economic opportunity. To do anything less would be to stop aspiring to be a more perfect union. To do anything less would be un-American.
This piece was originally posted on Forbes.