The University System Isn’t Going Anywhere


The university is a strange creature. It’s a nexus of learning and power. It provides minds to fill elite positions, and yet eludes any kind of official political control in modern Western countries. It embodies both the establishment and every major wave of social rebellion. Why would such a thing even exist? What kind of patron would endow his own opponents and replacements?

The answer—a very brief one—requires a glance back to the 11th century. As the feudal world governed by illiteracy and duels began to evolve into a slightly less feudal world governed by canon and civil law, words scribbled on paper were acquiring moral and legal weight, and the literate arts were reemerging as mediums of power, as they had been in Rome and Greece.

The medieval upper classes, no fools to changing trends, realized that to retain power and status in this new world, their progeny would need to wield not just swords but words in the arenas of law, politics, and religion. To learn logic, rhetoric, and theology, the next generation would obviously require lessons from the current masters of these arts. Thus did the urban centers of medieval Europe witness the emergence of student-teacher guilds, called scholae.

Unorganized at first, these guilds quickly gained popularity, and with popularity came the demand to organize. Guilds gathered into a unified school—a university. In some places (such as the University of Paris), teachers gained control over curricular and degree-granting matters. In others (such as the University of Bologna), students had more control. In both cases, resources for buildings and the legal imprimatur upholding degrees came from the crown or the church, who wanted to own a stake in these new incubators of power and privilege.

Money for teachers came from the students directly, most of whom, to repeat, hailed from wealthy families. There was no set fee, so, in essence, the medieval professor lived on tips and the generosity of the rich. When we imagine a medieval scholar, we likely conjure a man wearing a large hooded gown (such as are still worn at graduation ceremonies). As the history goes, professors hung their gowns by the classroom door, and as students walked by, they dropped money into the gown’s hood. The more popular the professor, the more coins in the hood.

Twenty years ago, in November 1999, writer and professor Perry Glaser noted the emerging trend of online education and offered a tongue-in-cheek prediction about its effect on academia: it would return professors to a medieval state, he said, as they once more became academic entrepreneurs unshielded by tenure or a salary but nevertheless able to earn a high standard of living if they gained an audience. The wandering digital scholar, like the wandering scholar of old, could set up a “classroom” via whatever digital platform he liked, peddle his courses to interested students (a whole digital world of them), and charge a small fee for the course or request donations at the end of one. Whether this new twist on the old state of affairs would be a salutary or horrifying change to academia would be a matter of personal temperament.

Twenty years on, at the end of the 2010s, how has Glaser’s prediction fared? What can the prediction’s success or failure tell us about the next 20, 30, or 100 years of Western academia?

We needn’t look far to discover digital analogues to the wandering medieval scholar who peddled his intellectual wares to whoever wanted to pay money for them: monetized YouTube tutorials, Patreon accounts, video and photo editing courses, teacher resource apps, any number of all-purpose professional development workshops, Ribbon Farm’s Refactor Camps, Other Internet’s digital media workshops, and the New Centre for Research and Practice (which literally allows you to add courses to a cart before checking out with a course registration). To these examples, we might add the pay-nothing-until-you-get-a-job programs offered by Lambda School and Microverse; these online schools, helmed by tech professionals and not academics, charge fees only when students land a job with the tech-oriented skills learned through their programs.

In short, people are taking advantage of the internet’s ability to connect students with the masters of various arts—from the art of video editing to the art of elementary school curricular design to the art of digital media theorizing to the art of coding. (Indeed, an “influencer” who charges money for online books or courses is precisely what Glaser was getting at when he highlighted the internet’s entrepreneurial potential for academics.) Digital gowns hang by the digital door, and students are dropping coins into the hood via PayPal, Bitcoin, Venmo.

However, while the examples above (and many others) tap into the internet’s entrepreneurial potential, one obvious thing is missing—professors. Elementary school teachers, housewives, teenagers, IT professionals, Instagram celebrities, pick-up artists, and health and wellness experts can all be found leading online workshops or courses in the educational gig economy. But professors are few and far between. To be sure, one can find online courses taught by professors qua professors—for example, via MIT’s OpenCourseWare—but as a general rule, these online courses are free. What’s more, online university programs—free or not—have no bearing on Glaser’s prediction because they are helmed by professors and lecturers paid by the university itself, not by students via a direct monetary transfer, as in the middle ages. Jordan Peterson is the exception that proves the rule. He has amassed a large number of “students” online who pay Peterson directly, not under the auspices of an official university program. Tellingly, however, Peterson has more or less quit his post as a professor.

As a general observation, though, professors are largely absent from the educational gig economy. One reason for the absence is that most digital entrepreneurs/teachers—earning tips directly from students—sell what we can lump together as “practical” skills that hold value and bestow (potential) status without a concomitant degree or university stamp. How to Use Layer Masks in Photoshop; Video Editing Basics; Web Design Fundamentals; Getting Started with Git; Build Your Personal Brand; Project Management Practices; Web Application Development. Insofar as courses like these are the bread and butter of wandering digital teachers, they appeal to students wanting to learn specific skills that are marketable on their own or somehow beneficial within their career contexts. Academics, it seems, remain above it all.

The universities, on the other hand, do not derive the value of their programs from the intrinsic value of what they teach, but from the social status and official credentials they grant. They can afford a much larger gap between what they teach and what the market wants to learn. Interpret this positively or negatively; either way, it’s no surprise to see academic professors stay within the academy and avoid competing in a market that would not necessarily be interested in their wares.

Of course, Peterson’s books, courses such as Sonya Mann’s “Friends as Force Multipliers,” and courses at the New Centre offer traditional scholarly material and do in fact incarnate Glaser’s idea about the wandering medieval scholar reborn as the digital entrepreneur. Indeed, the New Centre’s “parallel academia” model—which, like the medieval university, does not employ professors but merely offers a space wherein professors earn whatever coin their courses bring in—would be exactly the sort of thing to generate a new version of Western academia, one that, ironically, as Glaser predicted, looks just like the medieval version. Nevertheless, for now, monetized tutorials on YouTube earn their creators far more coin than a New Centre course does for its instructors.

It seems Glaser’s prediction is half true: people are dubbing themselves teachers and using social media to connect with students and collect their fees. The educational gig economy has bloomed. Why aren’t academics getting in on the action? If the New Centre’s parallel academy or Other Internet’s boutique workshops prove anything, they prove that there is a potential audience for academic topics as such, taught online, outside the confines of a traditional university program. Why aren’t there as many parallel academies popping up as YouTube tutorials? Why don’t we see hundreds of professors attempting to copy Peterson, popularizing their specialized material for the masses?

The answer to those questions highlights a flaw in Glaser’s prediction, but more importantly, it highlights crucial features of the Western university that, in my view, have played and will play a larger role in its cultural evolution than the advent of digital media.

Recall the university’s medieval history. Once student-teacher guilds organized into single universities, a question that needed answering was the question of legitimacy—from where came the power to transform a slip of paper into a degree with rights and privileges? The imprimatur for the degree ultimately came from the church or the state, who granted charters to universities. In effect, they girded the university’s slips of paper with the powers of popes and kings.

As the centuries went by, universities that survived into the modern era slowly absorbed their medieval privileges into their own “brands,” as we might say today. Kings lost their powers; universities lost none of theirs. At some point, the universitas in Oxford became the University of Oxford and no longer required a throne’s backing to grant status via slips of paper. Its degrees retained and continue to retain all the status that once came with a king’s charter or a pope’s decree. Insofar as all universities share in this status of divine or kingly power—transmuted across the centuries into secular status inherent in the university itself—the professors who teach there get a slice of that status. Add to this a salary and tenure (a modern invention) and it’s no wonder that academics see no reason to operate a side hustle online, to become pedagogical entrepreneurs like their medieval predecessors. Leave that to the housewives, teenagers, IT professionals, and Peterson-esque popularizers.

Plato critiqued the sophists for charging money for their courses on rhetoric. Contemporary academics likewise behave as though peddling material online—for tips transferred directly from students—is beneath their office. Glaser neglected to consider the inherent elitism of academia. Professors who pursue anything that smacks of a digital side hustle—from the right-leaning Peterson to the left-leaning Zizek—are considered gauche at best, rogue at worst. Professors, myself included, can earn extra money if we need it by teaching summer courses through the imprimatur of our university, which, after all, pays our salary and provides our status.

This issue of academic status raises an obvious question, one that dovetails with concerns about a “higher education bubble”—if universities lose their status and if degrees lose their status-granting imprimatur, will professors finally get involved in the educational gig economy? Glaser’s prediction ultimately rises or falls not on educational technology as a disruptive force but on the university’s ability to maintain status amid the disruption.

Which, of course, it will. Universities will maintain not only their status but their monopoly on granting status via slips of paper. History indicates as much.

The system of higher education has maintained elite standing across nearly a millennium of disruptive history, as have the degrees it confers. Kings and popes have lost their power, but the prestige they once bestowed on universities (and ipso facto their degrees) has grown through countless social, political, demographic, and technological transformations. The case that the digital transformation will strike a fatal blow to Oxford, Harvard, UCLA, or even most Directional State Universities is shockingly poor.

How will they continue to exist? The key is their role as cultural centers that draw students seeking to gain money and power—which, after all, was the reason medieval student-teacher guilds formed in the first place. Their power to grant wealthy or upwardly mobile teenagers an economic and cultural edge on the competition, in associational credentials and relationships, is much harder to disrupt than mere course content.

Like the printing press, the internet has disrupted many industries and institutions, but scant evidence suggests that it will keep universities from continuing to attract students seeking an edge, any more than the printing press did. Indeed, universities have been quick to expand their reach through digital education, primarily by hiring more instructors and paying them less—often referred to as the “adjunctification” of higher education. These non-tenured lecturers and adjuncts may be wandering digital scholars in a sense, but they still receive their pay—such as it is—from universities.

The university has survived demographic transformations as well as technological ones. In previous centuries, the Western university survived multiple plague outbreaks that make current population concerns look like a bagatelle: 40–50% of the European population died in the 1300s, 25% of London’s and 14% of Italy’s population died in the 1600s, and 20% of Moscow’s population died in the 1700s. And on and on. To disease we can of course add war—for example, the Thirty Years War destroyed 30% of Germany’s population between 1618 and 1648. The West’s population may be aging and shrinking, but the pool of potential students is and will continue to be a thousand times larger than the tiny pools upon which universities once survived and even thrived.

Those bearish on higher education may ask: well, what if the population loses faith in the university’s ability to bestow status via a degree—a loss of faith never experienced by the admittedly plague- and disease-ravaged middle ages and early modern era?

This is a more challenging question. One reason Europe’s universities maintained their reputations as the world burned around them was that very few students went to or graduated from universities. To hold a university degree was a mark of high merit and—let’s be honest—familial wealth. In America, as recently as the late 1970s, universities awarded only about 850,000 bachelor’s degrees each year. Today, universities award nearly two million degrees each year. Doctoral degrees have likewise ballooned from 10,000 per year in the mid-20th century to 55,000 per year in the 2010s. As universities have decided to grant more slips of paper to more students, degrees have become less scarce and thus less able to bestow status and uphold reputation. Jobs that once required only a high school diploma—for example, retail management—now demand a B.A. or B.Sc. as part of entry-level hiring requirements.

To this diminution in a degree’s status we can add the related problem of student debt. Outside the Ivy League and other selective institutions, universities by and large decided to grant more and more degrees, which brought in not only more students but more dollars from tuition and/or state funding. The universities grew. And to keep up with the demand, subsidized loans, and cost of expansion, tuition also grew. Whether lax student loan practices or cuts to state funding for higher education are more to blame for rising tuition is a complex web to untangle, but the simple fact is that as tuition costs rose to absurd heights, students kept demanding access to universities, and universities were happy to let them in. Fast forward to 2019, and total American student loan debt stands at $1.4 trillion.

A university degree has lost much of its informal reputation as a status marker; at the same time, the cost of a degree has not only risen but become an unbearable debt burden. No wonder students are starting to question whether these degrees are worth the trouble. Surely there are less expensive ways to gain an economic and cultural edge? This is, indeed, a difficult challenge faced by the university system—a crisis of faith in its ability to grant status.

However, amid this status anxiety, many universities continue to grow and attract students. Places like Harvard and MIT don’t need to expand at all, so secure is their ability to grant status via their degrees.

Why? In the current institutional imagination, not having a degree at all means you are not smart enough to go to college, and therefore that you are last pick for jobs. This remains a more important distinction than the distinction between top colleges and everyone else, so on the margin, few are able to profitably opt out of the rapidly inflating degree race. But as financial and cultural pressure mounts against mass education, we may see this condition collapse. If enough high-quality people opt out entirely for financial and ideological reasons, and degrees become inflated enough, industries may stop considering them important.

That lack of faith will be directed primarily at small regional universities whose histories trace back no further than the late 1800s or early 1900s. Even the worst-case, doomsday scenario for higher education’s near future predicts bankruptcy only for the bottom 25% of each university category (research, master’s granting, liberal arts, and so on). That means, in other words, that most universities will indeed survive the coming demographic crunch and whatever status-anxiety issues emerge alongside it.

Even without the semi-artificial credential value of college versus no college, which is becoming more and more fake, the signal value of elite selective schools is a different and much more real effect, and it isn’t going anywhere. And in an ultimate sense, it’s really the top universities that matter, both in their academic work and in their effect on society. The modern trend toward universal college education is likely a temporary aberration from the much more solid underlying reality of a small core of top universities devoted to elite formation.

The future might well look less like a total collapse of academia and more like a hollowing out of the middle. Small niche or private schools and large global centers of academia will go on. But middle-tier schools that lack high enough status or specific enough differentiation will be hit. A relevant comparison might be made to the banking sector, where small and medium institutions have been eroded for years by both regulation and their inability to provide the huge pools of liquidity that many financial deals require.

This durability shouldn’t be a total surprise. The Western university has survived not only plagues and wars but many other social, political, religious, and technological upheavals. A thousand years of history have put higher education through the gauntlet, and the university has made it through every filter.

To be sure, particular universities did not survive history’s filters. For example, the University of Wittenberg—the center of Protestant reform—survived Professor Luther’s Reformation and the ensuing wars of religion but did not survive the Napoleonic Wars, after which it merged with a Prussian university. More recently, hundreds of American universities have also failed to survive the demographic and economic upheavals of the previous century. For example, the Detroit Institute of Technology—opened in 1891—did not survive the early 1980s recession, closing its doors in 1981. Its fate was likely sealed during the demographic hollowing out of Detroit that began in the 1960s. By 1979, with fewer Detroit natives to enroll, Detroit Tech had tapped the international student market: a third of its students were Iranian nationals, most of whom were suddenly forced to drop out when their visas were suspended during the Iran hostage crisis. This dual demographic and economic filter proved too much for Henry Ford’s alma mater.

Nevertheless, hundreds—thousands—of other universities have survived, thrived, and grown during America’s turbulent history. The university system’s internal ability to filter out weak performers is, if anything, a mark of its overall success. Indeed, I would be more worried about an “education bubble” if financially weak or fly-by-night schools—such as Mt. Sierra College—had not closed down.

To survive the current filters, transformations, and disruptions—particularly the question of small universities’ ability to grant status via their degrees—some schools will need to shift emphasis to those arts that, in the popular imagination, offer a doorway to status. In the middle ages, the humanistic arts were the ones in demand. Most people were illiterate or semi-literate; literacy was the rising medium of power; to become not only literate but skilled in literacy provided an economic and social advantage. From the 1100s to the late 1800s, knowing how to do things with words—how to compose a solid legal argument or to deliver a fiery sermon—ensured a degree of status and respect, or at least a shot at economic security.

Today, all the same can be said about the electronic and scientific arts. It’s no surprise that, for example, the University of Wisconsin-Stevens Point rolled out a plan last year to cut humanities majors and shift resources to career-oriented STEM fields. I expect many minor universities will adopt similar plans. However, this is an organic transformation, undergone in the face of changing economic trends, that will allow smaller institutions to survive the present economic filter. It is not at all an indication of a dying university system. Ironically, and to my larger point about the university’s stability in the face of disruption, even this lowly regional school recently decided not to pursue its plan to cut humanities programs.

And of course, the disruptive 2010s have shown that the ancient arts of persuasion and aesthetics will continue to play a role in competitions for status and prestige. As Samo Burja and others are beginning to note, humanistic arts are useful precisely in these moments of great demographic and economic transition. If anything, I expect an emergence of interdisciplinary university programs that emphasize the importance of both STEM and the humanities.

My bet, then, is on the survival and even the growth of the university system in the decades and century to come—even as that growth demands the disappearance of smaller, financially feeble institutions. As long as people want to increase status and gain an economic and social edge, the university will continue to be one of the best places to do so, just as it has been for a thousand years. Don’t expect professors—especially tenured ones—to participate in the educational gig economy online, any more than you’d expect Formula 1 racers to drive for Uber. Their status protects them from insecure, entrepreneurial work.

However, in fairness to people bearish on higher education, I want to offer an asterisk on my position. To do so, I’ll need to return to Jordan Peterson by way of another history lesson.

For a thousand years, amid all manner of political disruption, reformers and radicals have sought to capture the university, not to destroy or subvert it—a fact that applies to many important social institutions. Traditionally, after all, universities hold a monopoly on the “correct” interpretation of things. Just ask Copernicus. To capture the university means to capture an intellectual authority structure, to exert a persuasive force over society’s intellectual outlook. Of course, subverting sanctioned academic interpretations might be an early tactic adopted by reformers, but what they really want is to gain a foothold in the university, so that their own interpretations gain assent and political weight.

Before they win access to the university and other centers of power, radicals will often replicate what goes on in a university in their own shadowy educational spheres—all those meetings and struggle sessions in bohemian flats. The existence of this discourse poses a direct threat to the university’s monopoly on interpretation. This is why academics tend to belittle knowledge not produced in a university, and why radicals often rely on new media—new information technologies—to facilitate their parallel discourse. The Protestant Reformers provide the obvious template. Martin Luther and his followers used the printing press to spread a very different interpretation of the Bible than the one taught in orthodox universities and seminaries. One can imagine the world of printed Protestant tracts and illicit Protestant meetings as a sort of technologically enabled, parallel university system. However, as they gained political power, the Reformers did not stay outside the system but took control of those sanctioned intellectual spaces—the halls and towering buttresses of Europe’s universities and seminaries.

Since the Reformation, one can find any number of similar iterations—from the Copernican to the socialist revolutions—of this fight between Established Orthodoxy and upstart radicals over the “correct” way to interpret the world. Like Lutherans and their printing presses, the upstarts have often used new media technologies to spread their heterodox ideas. However, from the Copernicans to the Leninists, once the upstarts gained political ground, they also gained intellectual ground as they slowly captured the universities and ipso facto claimed status as the new established orthodoxies. Yet on either side of revolution—and this is the key point—the university remained the space where “correct” interpretations of reality were manufactured. Who did the manufacturing was simply a matter of who controlled the universities.

What the Lutherans did with the printing press, any number of heterodox outsiders—from Jordan Peterson to Richard Wolff to a whole host of non-academic intellectuals—are now doing with digital media: replicating university discourse outside sanctioned universities, thereby challenging the academy’s monopoly on reality interpretation. These projects regularly interface with wider parts of online discourse outside of academia, such as Peterson’s appearances on Joe Rogan’s widely viewed interview show, or the lively blogosphere and social media discussion groups. Granted, an anti-college position is part and parcel of the blue-collar ethos adopted by men like Rogan and other populist challengers to official interpretations.

However, as Wael Taji has noted, the American university is becoming less attractive not only to people for whom it was never attractive in the first place, but also to potential scientists and intellectuals. Over the past fifty years, the capture of university departments by social activist professors—especially in the humanities and social sciences—has coincided seamlessly with the capture of university administration by corporate technocrats. More and more departments are restructuring their programs to address issues of race, class, and gender from an activist perspective; at the same time, universities have been eroding tenure and outsourcing labor to adjuncts or lecturers, who are paid less and given few or no benefits.

Both the accelerating default leftism of academia and the attrition of full-time faculty jobs have driven some thinkers to adopt other means of exploring, expressing, and circulating their ideas—either by pursuing work in the rising Chinese academy, as in Taji’s case, or by abandoning academic life altogether, as in Jordan Peterson’s case. Either way, once bright minds start shunning the American academy and heading for alternative climes, it won’t take long for intellectual capital to sniff out the trend and begin flowing in another direction. Within perhaps no more than a generation or two, the American university could lose its global standing as an incubator of scientific and intellectual talent.

For the Western university system to decline and crumble, its status as a status-manufacturer would need to decline first. Intellectually or scientifically ascendant outsiders, who have not just the desire but the ability to challenge academic orthodoxy, would need to deem other spaces of information exchange as the better pulpits from which to interpret reality—better, at any rate, than a graduate assistantship or a tenured chair at a university.

But those currently on the bleeding edge of university opposition are still fundamentally populists. They are not trying to set up an alternate center with higher intellectual quality; they are defined by their oppositional underdog act against the current center. The university will remain central as long as its would-be competitors continue to act through the vector of populist resentment, rather than through an authoritative re-founding of those institutions which grant means and status. And if they did take that latter path, like past reformers, they would likely choose to capture the university rather than sweep it away altogether.

If we see outsiders challenging and winning against academic orthodoxies but showing no interest in taking over the actual grounds and systems of the university, then and only then would the university as such face its greatest existential threat—cultural irrelevance. But that is unlikely. Even in the most revolutionary scenarios, the gain from capturing the university will outweigh any desire to usurp its role. Much change is possible in the form, scale, and content of the university system, but through it all, these great institutions will endure.

Seth Largo holds degrees in linguistics and rhetoric. A native of Los Angeles’ suburban sprawl, he is now an English professor on the high plains.