There is a labor crisis in higher education.
The myth of the well-compensated, insulated, and out-of-touch professor has a powerful grip on the American imagination, but in fact applies only to a few people, even among those lucky enough to have a tenured position. (The real money, comparatively speaking, is in administration, unless you happen to be a coach.) Most professors, including those on the tenure track, are not well paid, particularly relative to their level of education. Setting aside that separate, albeit related, issue, the larger crisis is that courses are increasingly being taught by adjunct professors with too little pay, no benefits, and no job security.
This is not new. The old line was that you should inquire how much of the teaching at a school is done by graduate instructors, and adjuncts are the latest iteration of the same forces that cause schools to fill credit hours with cheap labor.
Many, though not all, schools have a two-part mission: teaching on one side and world-leading research from their (full-time) faculty on the other, and in that sense this split makes sense. As much as research influences teaching and vice versa, both take time to do well. In the humanities, moreover, research generally doesn't make money, but it remains a benchmark for the university in various external rankings, which, in turn, are part of the pitch to bring in students. The solution is generally to bring in cheap labor to fulfill the teaching mandate, thereby creating a surplus that can be paid to the full-time faculty in the form of salary and research support, including travel and reduced teaching loads. Simple.
Only not so much. With state divestment from higher education, the financial burden of operating a university is frequently passed on to the students, now branded as consumers, in the form of tuition, fees, and, eventually, solicitations for donations as alumni while they are still paying off loans for the initial investment. At the same time, significant teaching loads are passed to underpaid and overworked contingent faculty. This is not to say that contingent faculty are bad teachers (many are excellent), but that while the cost to the student goes up, the combination of financial precarity and insufficient resources impedes the ability of many of their teachers to help them reach their potential. Something like 75% of all faculty teaching in colleges now hold non-tenure-track positions, working under a range of titles and for a median pay of $2,700 per course.
These economic issues are fundamentally important to the future of higher education, a top-heavy system that on many days feels like it is teetering. It is a matter of when, not if, something is going to give.
But that is not what prompted this post.
In response to a recent report on the issues surrounding contingent labor and the finding that 79% of anthropology PhDs do not gain employment in tenure-track positions, I saw the inevitable response that the solution to this problem is to reduce the production of PhDs. The idea is that this is a crisis created by supply far outstripping demand, which is true enough, but it does not acknowledge the underlying structures that are shaping demand.
The optimistic, if morbid, line even when I started graduate school in 2009 was that it was just a matter of waiting for the rapidly aging generations of professors to give up their posts one way or another. Not that the job market would be easy, but that there would be a wave of jobs that would make it easier. Before long it became apparent that the great recession of 2008, which struck right as I was graduating from college, marked an inflection point for higher education. Many of those older faculty members were clinging to their jobs not out of malice, selfishness, or obliviousness, but because they believed that their positions would not be replaced when they left. They were right. Their courses are taught by contingent faculty and the tenure lines largely boarded up and forgotten. This is the new normal.
These systemic changes are not unique to higher education, I should add. I’ve recently been reading Sarah Kendzior’s A View From Flyover Country where she talks at length about the seismic changes to the American economy after 2008 as companies looked for ways to remain profitable to stockholders. Universities are a little bit different because many schools are among the institutions most affected by government divestment, but there are many broad similarities.
Nevertheless, I am not in favor of a widespread slashing of graduate programs.
First, reducing the number of PhDs is not going to solve the labor crisis. There is already a long line of qualified candidates. In 2012, two schools, Harvard University and the University of Colorado, received backlash after stating in job ads that candidates more than a few years past graduation need not apply. Moreover, cutting positions in graduate programs does nothing to address the structural factors underlying the decline of tenured positions. In fact, cuts to graduate programs could conceivably accelerate the cuts to full-time positions, because graduate programs are one of the justifications for keeping tenured faculty.
Second, the remaining graduate programs would invariably exist in a handful of elite schools, which already produce most of the graduates who win the tenure-track job lottery. This list of elite schools is not immutable, but it tends to favor those that already have large endowments. As is true elsewhere in American society, fluctuations in financial fortune tend to be much larger for schools without these inheritances.
In theory, limiting graduate education to wealthy schools would create a more ethical environment in terms of pay for graduate students, as well as provide them adequate research support, but it would also develop scholars and teachers in an environment radically different from the one in which most professors work, not to mention the one their students will be coming from. As with my comments about adjuncts above, this is not meant to denigrate people who go through elite institutions, many of whom are deeply concerned with issues of precarity and austerity and do not come from privileged backgrounds. At the same time, reducing spots reduces the opportunity for people who have not already been introduced to academic life, either during their undergraduate education or through individual mentorship, usually by someone with connections to those schools. Similarly, for as much scholarship as comes out of top-tier programs, they cannot cover everything. As in any number of fields, visibility and representation matter. A retreat toward the proverbial ivory tower reinforces the perception of a barrier between the intellectual elite and everyone else.
There are deep ethical issues with how graduate programs in the humanities approach training, regardless of what the future of the professoriate looks like. There needs to be greater acknowledgement of and preparation for so-called alt-ac jobs, and a support system in place to help people find employment with livable wages. That is, there needs to be a reconsideration of the purpose of graduate school, with teaching in college being just one potential outcome.
(To be fair, this is easier said than done and I see programs coming to grips with this reality and beginning to implement changes, but too little and too slowly, and without enough action to counteract the emotional trauma of the current system.)
But there is also a larger point. People pursue advanced degrees for all sorts of reasons, including interest. This is a good thing. I may sound impossibly, naively idealistic, but I want to live in a society that supports and values education not out of a desire for credentialism but because these opportunities are where creative innovation is born. Eliminating graduate programs beyond those in well-funded schools makes sense if you look at the problems facing higher education as a simple supply-and-demand numbers game, but it in fact threatens to realize some of the worst stereotypes about academia.