How Work Became a Job

If you want to experience a meaning crisis, it’s hard to beat being an American white-collar worker today. To do menial labor as part of some great project is tolerable, which is why every Silicon Valley startup needs to say that it’s saving the world. But most of us are not saving the world. We’re manipulating derivatives, or tracking SEO metrics, or litigating, or pursuing some other end as mundane as the work itself.

To sour the deal even further, long periods of sitting and staring at screens, set schedules, and long commutes (back when most of us still commuted) conspire to minimize our private time—a term that itself reveals the degree to which life is divided into job versus non-job. When leisure does finally come around, we often still sit and stare at screens, only without the pressure to appear productive. Relief means loosening your collar—not regaining agency.

In response to the unhappiness in this job market, another market intervenes. The internet abounds with ads for courses that will supposedly teach you how to break out of the 9-to-5 world and start making money on your own—usually by doing something just as redundant and “fake” as what you’re already doing: retail arbitrage, or some kind of consulting. We can imagine a never-ending chain of online salesmen selling courses to other salesmen. This relationship to money rarely crosses paths with real material production. It gives rise to an ouroboric maze of exchanges with high monetary value but dubious real productivity. Life in the service economy and life in the online “hustle” look remarkably similar.

The corporate drive to overcome this meaning crisis is often misunderstood. Companies strive feverishly to tell you that what you are doing is in fact meaningful and that the people you’re doing it with are like a family. Pizza parties, company events, and saccharine emails about what a great job everyone did on the latest widget-process implementation report: rather than convey sincerity, the lady doth protest too much. How many employees have sat through such an affair wondering, well, if the company can afford the pizza and snacks and social events, why can’t it instead pay me enough so that my rent doesn’t eat half my income? Why is this company so concerned with making me feel comfortable and yet so uninterested in inspiring genuine loyalty?

Critics of this kind of corporate “work family” tactic often say that work and home need to be more separate, not less. One’s professional life should simply be a way to support one’s personal life, not a part of it. Forget “making a difference,” forget “team-building”—let’s just cut the nonsense, get the job done, and get the hell out. Compared to the paradigm of “playing ball,” brown-nosing, and faking a smile during redundant meetings, this may sound almost revolutionary—or at least like a stroke of good sense: we should stop pretending work is something it isn’t.

But really, it is a profound resignation. It is to say that one is content to devote forty, fifty, or sixty hours of one’s life every week to something that fulfills no higher calling, fosters no sense of community, and has no connection to the rest of one’s life beyond remuneration—the logic of cold hard cash substituting for what is, at least for now, only the pretense of a home or a community.

We could say that this speaks to a failure of imagination, but it speaks just as much to a lack of historical perspective. Traditionally, work has been almost inextricable from the rest of life. Seen from this point of view—the producer’s point of view, as opposed to the consumer’s—the call for coworkers to be like family and for one’s work at Acme Corporation to be important no longer seems cloying, stupid, or frustrating. It’s heartbreaking. It looks like the last and paltriest attempt at performing a vital and neglected function.

It is perfectly possible, of course, that there is no way for us to restore traditional conditions of work, or even to provide a moral equivalent for them. Maybe our best hope really is some kind of workaround, a sprawling patchwork of case-by-case next-best things. But before we can even resign ourselves to that, we need to get a better understanding of what the traditional situation of work really consisted of, how it was dismantled, and what people actually thought about it at the time.

What was lost as production mechanized and consolidated? Can it be restored, or can something at least be put in place that cultivates the same virtue? Or do we need a new ethic of work altogether, one that accepts work as inevitably dissociated from our sense of fulfillment? These questions have all been asked, and almost our entire discourse about work today lies within the space carved out by only one set of answers. If we want to understand work in the early 21st century, we need to learn from the perspectives of the 19th—from a world before the great wars—when these questions were still considered open.

The Breakdown of Pre-Industrial Labor and Life

First, let’s establish the baseline. Even well into the 19th century, most Americans did not work for a wage. They were typically farmers or artisans. Before advanced mechanization of industry, production had to take place on a scale determined by normal human capacities for technique, coordination, and social relations, instead of by the productive output of machines. In such an environment, a vast network of institutions mediated between the state and the individual. The family, the local church, and other associations performed the functions of raising children, tending to the sick, and generally making things work. The family typically participated or assisted in day-to-day production.

In this world of cottage industry, there was little in the way of “outside” institutions to which these things could be outsourced. The tanner, the tailor, the cobbler, the farmer, the printer, the shipbuilder were part of communities inseparable from these conditions of production. “Community” was not a consumer good that one could voluntarily return home to after a day of work away from home. It was where work and home alike had a shared context, with all the freedom and all the limitations that could only come along with it.

The Industrial Revolution allowed capital and labor to consolidate on a scale formerly impossible. And this is precisely when we see “capital” and “labor” begin to be reified as opposing interests. In an economy of yeomen and small producers, capital and labor are almost inseparable; under the new industrial conditions, there was a class of people associated with each. Former craftsmen and their descendants now found themselves competing with, or joining the ranks of, a new class of wageworkers and renters—people who expected to live indefinitely in a more or less unpropertied state. In other words, there was now a proletariat.

The early labor movements were characterized by their opposition to proletarianization. Craftsmen and small farmers wanted to preserve their way of life, and they formed associations dedicated to fighting for this cause. As the late historian and social critic Christopher Lasch notes in The True and Only Heaven, opposition to wage labor was widespread in the first half of the 19th century: “the general uneasiness about the new economic order found its most striking expression in the nearly universal condemnation of wage labor.” Lasch recounts how prominent American voices explicitly linked wage labor to the institution of slavery.

The notion that the decreasing value of labor would leave an entire class utterly dependent on capital owners was a noxious one. The famed New England writer Orestes Brownson proposed that if wage labor were tolerated, it could only be under the condition that laborers owned their own shop or land by the time they reached an appropriate age.

The insistence that wage labor could be cautiously managed and controlled was revealed as the great coping mechanism of the age. As the 19th century went on, it became impossible to deny the formation of a proletarian class and its rival, the bourgeoisie. Along with this development came the framework for class analysis that we still associate with these terms: Marxism.

Marxism shared with liberal capitalist ideology a dismissal of the early labor movements as backward and provincial—because they resisted not only material progress but moral progress too. Early trade associations fought the erosion of the artisan class and its transformation into a proletariat, and their members tended to be culturally and religiously conservative. This may seem puzzling in light of recent history, but labor then stood opposed to progressivism. For the struggling artisans of the 19th century, there was no contradiction between labor activism and cultural conservatism. They were decidedly opposed to any “progress” or “improvement” that rendered their way of life obsolete.

Marxism and liberalism, on the other hand, considered this kind of progress both desirable and inevitable. Marxism stood apart in that it insisted that the factory system—the system of wages and of large-scale production—would in turn furnish the infrastructure for the next stage in history, wherein the proletariat would seize these now consolidated means of production. Like liberalism, it had little concern for the virtue of small-scale proprietorship at the heart of what the early labor movements strove to defend.

But if there was no going back to the old economic order, if factories were here to stay, then at least there could be a moral equivalent to this kind of ownership. Lasch notes the thought of the French syndicalist Georges Sorel, who proposed that collective worker control would restore the craftsman’s pride that had existed under small-scale ownership. Sorel highlighted art as “an anticipation of the kind of work that ought to be carried on in a highly productive state of society.”

While conservatives fetishized the dignity of property as such, Sorel believed that it was the opportunity for invention and experimentation that ennobled smallholder life. He saw the displacement of capitalist profit-seeking by worker control as the means by which to restore this in factory life.

The Idyllic Consumerism That Failed

The liberal alternative to Marxism was to insist that while capitalism might dispossess certain producers, it addresses their needs and desires in their new role as consumers in a mass market—and has historically gotten better and better at doing so. In The True and Only Heaven, Lasch takes great pains to show that while Christian theological views of history as a prelude to the kingdom of God certainly influenced Western thought, modern secular narratives of progress rest on content that departs from this tradition.

According to Lasch, the truly distinctive thing about progressive ideology is “the exemption of the modern world from the judgment of time.” Previously, Christians and earlier pagans alike understood judgment “to hang like a sword over all man’s works.” That worldview held that our freedom is found in cultivating, within ourselves, limits upon the passions. After all, human existence and the tangible world we inhabit are inherently circumscribed, beset by limits. Time and again, this gives rise to what Lasch calls “the question that haunted our ancestors: how should nations conduct themselves under sentence of death?”

The ideology of progress stops us from asking this question, because it assumes—and after the Industrial Revolution, had no trouble summoning evidence to show—that the growth of human appetites does not effect moral decrepitude but instead fosters what Lasch calls “the indefinite expansion of the productive machinery necessary to satisfy” these increasing appetites.

We might say that the real kernel of our notion of progress is a kind of anti-Stoicism: human freedom consists in an indefinite expansion and proliferation of human wants and desires, and ultimately this is good. “A positive appraisal of the social effects of self-gratification” made possible the belief that our age no longer had to confront the “doom of threatened societies,” in Richard Niebuhr’s words.

Through this lens of progress as continued growth of appetites, it’s easy to see why it is the consumer, as opposed to the producer, who is so crucial to the anthropology of modern life. Indeed, in the 20th century it became established that consumption—the inconspicuous consumption of the common people even more than the extravagances of the wealthy—would pave the way for the uplift of workers. Reforms would be able to reduce socioeconomic extremes and accommodate the pursuit of happiness by bringing workers into the role of consumers who could exercise an expanded capacity for choice. To insist on finding meaning in dignified work was to pine for an abolished past; satisfaction and contentment could be found instead in an abundance of goods and services that more and more people could afford—and in a newfound room for leisure.

Lasch saw both the progressive movement and the New Deal as institutionalizing the consumer, with the goal of maximizing leisure rather than experimentation or property ownership. Technology would make labor increasingly efficient and decrease the amount of time that consumers had to commit to this unfortunate activity. New Dealers like Rexford Tugwell, drawing on economists like Simon Patten, saw the “nostalgic” desire for craftsmanship as tied to, in Patten’s words, the “philosophy of development through pain,” and the moral “art of wretchedness.”

The rise of Keynesianism in the later New Deal gave these economists an opportunity to “correct” the focus on production—characteristic of the early New Deal programs—in favor of consumption. The moral and economic ideal was a community of consumers which viewed work as a necessary evil and invested in ways to minimize it or banish it altogether.

In fairness to this vision of consumerism, we probably would not be as unhappy if work truly did occupy as vanishing a segment of our weekly lives as this company of experts would have wished. Who could oppose leisure as an opportunity for genuine play and for “labors of love” freely undertaken, rather than an occasion to “shop ’til you drop” or to zone out in front of a screen? But the rosy optimism of consumer paradise has not come to pass.

Something is missing from this picture of human happiness; something standing in the light of progress casts an incorrigible shadow. Human beings do crave comfort, ease, and abundance, but we also crave honor, struggle, and worthy causes. We might argue that the global success of the American project, and its remarkable bifurcation of domestic political discourse, can be summed up as its ability to channel this latter set of human needs in service of the former. In any case, the progressive centering of consumptive man reflects an inadequate appraisal of our psychological and spiritual anatomy. Lasch proposes that the consumerist vision’s most attractive aspect is also its greatest weakness: “its rejection of a heroic conception of life.”

Both in Christian pre-modernity and in totalitarian reactions to it, a conception of great moral consequences marks everyday actions with “cosmic significance.” Even American liberal democracy managed to imbue the life of the ordinary citizen with such significance, at least for a time. For a brief period, there really was something like a moral equivalent to small-scale proprietorship. In mid-20th century America, management was an activity engaged in by everyone in a company, from the CEO to production workers. Global corporate powerhouses like McDonald’s and Safeway were led by people who had started as ordinary service personnel. The social ties between different levels of corporations ensured collaboration between them, reflected in worker wage growth that even outpaced that of top executives.

What is most striking about this account of the mid-century corporate world is not just its respect for the value of pride in workmanship but also how alien this perspective seems today. To anyone who began working after the heyday of lifetime employment and widely distributed middle management roles, such a paradigm sounds almost too good to be true.

Instead, university graduates sift through job listings for “entry-level” positions that demand years of experience. A massive web of regulations and credentials—both of which are increasingly expensive to comply with or to obtain—blocks many people from entering professions that might fulfill the need for a genuine calling, or even the need for an adequate income. Wages, perhaps most importantly, have not kept pace with the rapid growth of prices in housing and education.

Managerial Betrayal Laid Bare

The rise of management consulting transformed the model of American corporate organization from one of widely distributed management functions into one that concentrates these functions in a small number of top-level roles. Meanwhile, many former managers and production workers are hired back in as much less secure subcontractors; union representation has plummeted; and part-time employment, with its attendant lack of benefits, has been normalized.

The concentration of management has also meant that its payoff in dollars, once diffused throughout the corporate chain, has been concentrated as well. The mid-century era of American dynamism and innovation was not one where the typical CEO of a large company made nearly 300 times as much as a production worker. Rather than being competent representatives of their particular industry, top executives have become structurally isolated from the workers they oversee. In place of a diverse set of goals based on their particular company, CEOs began to share the universal goal of making money for shareholders.

The much-parodied fakeness of employee engagement initiatives flows from this alienation of corporate life. It provides the fodder for movies like Office Space and shows like The Office. Much of the humor and relatability of these productions comes from the fact that office culture, and the atmosphere of the workplace generally, has become so uniform that one company’s version of “flair” is almost interchangeable with another’s. When top positions are filled by new hires with credentials but not experience—while those in lower positions work without real agency within the organization—attempts to “engage” workers cannot help but converge on these tropes.

A 2014 Gallup poll found that less than a third of U.S. workers were engaged in their work—perhaps unsurprisingly, the highest levels of engagement were found among managers. Even among managers the engaged were a minority, suggesting that the greater responsibilities and even greater incomes associated with management can only do so much to offset the overall condition of a consumptive way of life. A more recent survey that found record highs in engagement still put the share of engaged workers at less than half.

Efforts at personalizing the work environment do not make up for the emptying of genuine loyalty from institutions, and the trend of companies asking prospective employees to “go above and beyond” before even receiving compensation—while dismissing their demands for adequate compensation as a sign of a bad attitude—reads like a bad joke.

These changes in management structure have been justified on the basis of meritocracy. But what constitutes merit, and what counts as a just reward for it, depends on context. A better way to express what has happened is that loyalty and status have been decisively decoupled.

Certainly, we can consider the willingness of a company to fund workplace training, promote from within, and offer lifetime employment, and the willingness of a worker to undertake the corresponding efforts on the other end, to be reciprocal elements in a relationship of loyalty. The quasi-abolition of such an arrangement, then, can without difficulty be called betrayal. And those implementing this new ideological model of corporate governance—not even consistently having the pretext of profitability to justify it—could expect reprisal. When IBM implemented mass layoffs in the Hudson Valley in the 1990s, local officials asked gun stores to temporarily close as the community reeled from the impact.

This divorce between the payoffs of status and income on the one hand, and the reciprocal obligations of loyalty on the other, gives us a curious case of the high and low versus the middle. Those at the very top of the corporate hierarchy can afford to eschew loyalty—except to the shareholders’ bottom line—and those who perform unskilled labor have no choice but to do so, since they often have to maintain more than one job, or a succession of short-term jobs, in order to make an adequate income. At the top one is trained for rapaciousness; at the bottom, for precarity. In both cases, one is untethered.

But those in the middle, stuck in the pipeline of middling credentials and half-spurious requirements for qualification, in a sense have the fewest options of all. Being considered “skilled” offers them the potential for a greater income, but it also makes it much harder for them to change industries, since companies insist on “relevant experience” and often have highly specific criteria for hiring. Yet at the same time, the former incentives for staying put, the paid training and lifetime guarantees once enjoyed by workers, are long gone. Those in the shrinking middle class are in effect stuck planning for a future that neither those above them nor those below them have any incentive to invest in.

Life as a Service

The very structure of the working world plays into another effect of modern progress, what Lasch calls “the waning of the sense of historical time.” Images of the past and of possible futures are abundant, but are dealt with as just that—snapshots of aesthetic sensibilities and unfamiliar customs, presented for our entertainment or to make a moral point about the oppressive conditions of previous ages. We inhabit an increasingly narrow sliver of historical time; to look back even ten years is to look at a bygone world with which we find it difficult to identify. We find it as hard to empathize with the hope previous generations had for the future as with their respect for those that came before them.

Just as the past and future have been reduced to frozen images, so too have “ownership,” “hard work,” and “being in it for the long haul.” These have been reduced to memetic phrases by which companies sell their culture to workers. We might say this is the culmination of the reduction of the human spirit to a state of anxious passivity: that even the prospect of meaningful productive work is itself a consumer product to be traded on the market.

Forget the outsourcing of jobs overseas—we have outsourced not just the manufacture of goods but every function of human life, from the telling of stories and the transmission of culture, to the rearing of our children and the care for our elderly, up to and including the very sense of undertaking a meaningful task.

This is LaaS: life as a service. We are so far from being producers and so deep in the mental and social setting of the consumer that even the feeling that one is a producer is just one more service to be paid for. Our sense of reality is inextricable from our sense of agency. With so little to account for the latter, we are cornered: viewing the world and our very lives not with the high indifference of an enlightened being, but with the detachment of asymmetrical spectators. We have enough skin in the game to keep us locked in, but not in such a way as to enjoy many of the benefits.

And we find relief from being in this state of uneasy spectatorship in things that serve to reinforce it: endless scrolling through algorithmically generated interest, mass entertainment productions, and “going out” in order to satiate the appetites of consumption in a more directly physical sense. We are thoroughly incentivized to take these avenues of mental escape and nearly prohibited from virtuous presence.

Still more obstacles block what remains of the worker’s capacity to achieve productive agency, as manufacturers increasingly lock down the products they sell. In one famous case, this forced farmers to hack their own tractors’ software in order to make routine repairs. Once again, there is an asymmetrical relationship of dependence: you as a client or a customer may be treated as fungible by an institution, but the institution enforces non-interchangeability in the other direction. You may “own” a tractor, a smartphone, or an automobile, but as far as the ability to make upgrades or repairs is concerned, it is still owned by the manufacturer. The very institution of private property has been bifurcated: large corporations can be said to “own” property, but for the individual and for the family, what is referred to as ownership increasingly amounts to a protracted form of rent.

The Goodness of Life in the Face of Its Limits

The fakeness of work, and of daily life, is not a timeless fact. It is a product of historical circumstances and events: the rise of an ideology which champions the consumerist aspects of the human being over the productive; the Industrial Revolution, which required human labor to orient itself so as to imitate machine labor; the concomitant demise of small-scale craftsmanship and bifurcation of capital and labor; the rise and fall of the “company man,” a phenomenon which did provide for a time a moral equivalent of small proprietorship; the expansion and inflation of expensive credentials, qualifications, and proprietary dependence; the proliferation of entertainment media which provide a passive escape from the paradigm rather than a challenge to it; and the curiously anachronistic attachment to certain restrictions and incentive structures.

There is no quick fix for these issues. No one has the power—or, frankly, the incentive—to cut the Gordian Knot. A system with coherent structures of command and incentive, and which took an active interest in the health and success of its people, could pursue a number of options to address the crisis of loyalty. It could rebuild labor power and representation in corporate life, use federal contract requirements to diffuse managerial functions, gut regulatory and corporate ownership hurdles to small proprietorship, incentivize domestic manufacture, and so on.

Each of these includes its own set of bad incentives and political battles. But to pursue any or all of these goals properly, our state and society would have to jettison the ideal of the consumer-citizen for a more promising vision of human fulfillment. In response to these conditions, Lasch offers a way forward:

Progressive optimism rests, at bottom, on a denial of the natural limits on human power and freedom, and it cannot survive for very long in a world in which an awareness of those limits has become inescapable. The disposition properly described as hope, trust, or wonder, on the other hand—three names for the same state of heart and mind—asserts the goodness of life in the face of its limits. It cannot be defeated by adversity. In the troubled times to come, we will need it even more than we needed it in the past.

Aaron Jacob is a writer based in Texas.