Temps, Consultants, and the Rise of the Precarious Economy
In 1967, the celebrated economist and intellectual John Kenneth Galbraith argued in his best-selling book The New Industrial State that “we have an economic system which, whatever its formal ideological billing, is in substantial part a planned economy.”1 Though postwar American politicians contrasted US free markets with the centrally planned economies of the Soviet bloc, Galbraith recognized that the two were more alike than one might have thought. Postwar American capitalism was defined not by markets but by the private planning of corporations, whose budgets were sometimes bigger than those of governments.2 Markets meant uncertainty, and postwar corporate planners eschewed risk above all else.3
After the chaos of depression and war, corporate planners had worked in conjunction with federal policymakers to make a world that promoted stability. None of the top 100 postwar corporations had failed to earn a profit.4 This profitability was not an accident. Nor was it the result of seizing every lucrative prospect. Rather, it had come from minimizing risk in favor of long-term certainty.
This postwar economy had allowed employees and employers alike to plan for the future, assuring them steady wages and steady profits. Big business had to be big to contain all the functions it would not entrust to the market. Through their own five-year plans, Galbraith argued, corporations “minimize[d] or [got] rid of market influences.”5 This American planned economy—which had appeared to be the natural future of capitalism in 1967—began to fall apart only two years later, in 1969, more than twenty years before the fall of the Soviet Union.
The collapse of this postwar economy came from the overreach of its new corporate form—the conglomerate—whose rise was legitimated by the belief in managerial planning. But its essential moral underpinnings—stability for investment and, especially, stability for work—took more effort to dislodge. Over the 1970s and 1980s, that effort succeeded: corporations began to embrace risk and markets, undoing the stability of the postwar period. By the 1980s, the risk-taking entrepreneur had displaced the safe company man as the ideal employee.
Today, scholars and critics are all abuzz about “precarious” work. Instead of a job for life with General Motors or AT&T, we now have many jobs, either in sequence or, increasingly, all at once. Freelancers in the United States are estimated to number around fifty-four million, as much as one-third of the work force.6 “Precarious” has become a catchall term encompassing everything from day labor to temp work to the gig economy; it denotes flexible work that is insecure, temporary, and generally poorly paid. If the worker in this flexible economy is something new, so too is the firm, which, instead of hiring employees, increasingly outsources its labor needs.
Economists especially like to explain this shift to flexible labor in terms of the reduction of “transaction costs” (the costs of finding and hiring someone).7 The classic argument, associated with Ronald Coase, proposes that firms arose only because it was too expensive for individuals to transact every obligation on the open market. By that logic, the arrival of the Internet—with sites such as Craigslist and Upwork—easily explains the displacement of stable, firm-based jobs by the gig economy. But that explanation is too easy. The origins of precariousness, in the rise of temp agencies and the fall of the postwar conglomerate, run much deeper than the advent of digital platforms. The shape of our economy is made possible by technology, but it reflects a choice determined more by beliefs about the corporation than by lower transaction costs. Before they outsourced their labor, firms had to overcome old-fashioned shibboleths, such as secure employment for their work force, and they did so during the 1970s, long before the Internet arrived. ...