How the "Job" Became the Center of American Life
Once upon a time, there were good jobs.
These jobs paid people enough money to live on, even enough to support a family. They provided health insurance so people could go to a doctor if they got sick. They even came with pensions so that once you’d worked for a certain number of years, you could actually stop working. You could rest.
But there was a problem.
These jobs weren’t for everyone. They were mostly for white men, and mostly in certain places, like a factory or an office. For everyone else, there were jobs that paid less, with fewer benefits — or no benefits at all. And over time, there were more and more bad jobs and fewer and fewer good jobs, and even the good jobs started getting less good, and everyone was very tired, and there was not enough money.
Then there was a plague.
While this little fable may oversimplify the history of work in America over the past century, it’s not that far off.
Since about the 1940s, Americans have been encouraged to look to their jobs for nearly all of life’s necessities: a living wage, health insurance, and retirement benefits, as well as intangibles like friendship, identity, and a sense of purpose. But these benefits were never universal, and they became less and less common as the years went by.
The pandemic has been a turning point for many workers, leading them to reevaluate their jobs in the face of new dangers and a realignment of priorities brought on by a once-in-a-lifetime public health disaster. Indeed, the pandemic has led to record numbers of people quitting their jobs, with 4 million doing so this April alone, a phenomenon so widespread it’s been called the Great Resignation. And it’s leading employers, policymakers, and society at large to rethink jobs and how they dominate our days.
“I think it’s changed everything, and I think it’s changed everything fundamentally,” James Livingston, a history professor at Rutgers University and the author of No More Work: Why Full Employment Is a Bad Idea, told Vox.
We’ll (probably) always have work, but could the job as the centerpiece of American life be on the way out?
To understand the question, you have to know how the country got to where it is today. The story starts, to some degree, with a failure. Much of American labor law — as well as the social safety net, such as it is — stems from union organizing and progressive action at the federal level in the 1930s, culminating in the New Deal. At that time, many unions were pushing for a national system of pensions not dependent on jobs, as well as national health care, Nelson Lichtenstein, a history professor at the University of California, Santa Barbara, told Vox. They did win Social Security, but with many people left out, such as agricultural and domestic workers, it wasn’t a full nationwide retirement system. And when it came to universal health care, they lost entirely.
So “the unions said, okay, we can’t get this on a national basis, which we think is the most equitable, rational, cheapest,” Lichtenstein said. “We’ll link it to the job.”
Job-linked benefits like health insurance rose during World War II, when wartime wage controls kept employers from raising pay, so they added benefits instead. “Perks” like health care were also a way to keep workers happy so they wouldn’t leave.
Meanwhile, in 1938, the Fair Labor Standards Act enshrined the 40-hour workweek in labor law, putting an end to six- and seven-day weeks for many workers and requiring employers to pay overtime for anything beyond 40 hours. For some, the American job became a one-stop shop where they could get many, if not all, of their needs met, all on a (relatively) reasonable schedule.