More than a numbers game

There is a labor crisis in higher education.

The myth of the well-compensated, insulated, and out-of-touch professor has a powerful grip on the American imagination, but in fact it applies only to a few people, even among those lucky enough to have a tenured position. (The real money, comparatively speaking, is in administration, unless you happen to be a coach.) Most professors, including those on the tenure track, are not well paid, particularly relative to their level of education. Setting aside that separate, albeit related, issue, the larger crisis is that courses are increasingly taught by adjunct professors with too little pay, no benefits, and no job security.

This is not new. The old line was that you should ask how much of the teaching at a school is done by graduate instructors, and adjuncts are the latest iteration of the same forces that lead schools to fill credit hours with cheap labor.

Many, though not all, schools have a dual mission: teaching on one side and world-leading research from their (full-time) faculty on the other, and in that sense this split makes sense. As much as research influences teaching and vice versa, both take time to do well. In the humanities, moreover, research generally doesn’t make money, but it remains a benchmark for the university on various external rankings, which, in turn, are part of the pitch to bring in students. The solution is generally to bring in cheap labor to fulfill the teaching mandate, thereby creating a surplus that can be paid to the full-time faculty in the form of salary and research support, including travel and reduced teaching loads. Simple.

Only not so much. With state divestment from higher education, the financial burden of operating a university is increasingly passed on to the students, now branded as consumers, in the form of tuition, fees, and, eventually, solicitations for donations as alumni while they are still paying off loans for the initial investment. At the same time, significant teaching loads are passed to underpaid and overworked contingent faculty. This is not to say that contingent faculty are bad teachers (many are excellent), but that while the cost to the student goes up, the combination of financial precarity and insufficient resources impedes the ability of many of their teachers to help them reach their potential. Something like 75% of all faculty teaching in colleges now hold non-tenure-track positions, working under a range of titles and for a median pay of $2,700 per course.

These economic issues are fundamentally important to the future of higher education, a top-heavy system that many days feels like it is teetering on the edge of a precipice. It is a matter of when, not if, something is going to give.

But that is not what prompted this post.

In response to a recent report on the issues surrounding contingent labor, and to the finding that 79% of anthropology PhDs do not gain employment in tenure-track positions, I saw the inevitable suggestion that the solution to this problem is to reduce the production of PhDs. The idea is that this is a crisis created by supply far outstripping demand, which is true enough, but it doesn’t acknowledge the underlying structures that are shaping demand.

The optimistic, if morbid, line even when I started graduate school in 2009 was that it was just a matter of waiting for the rapidly aging generations of professors to give up their posts one way or another. Not that the job market would be easy, but that there would be a wave of jobs that would make it easier. Before long it became apparent that the Great Recession of 2008, which struck right as I was graduating from college, marked an inflection point for higher education. Many of those older faculty members were clinging to their jobs not out of malice, selfishness, or obliviousness, but because they believed that their positions would not be replaced when they left. They were right. Their courses are taught by contingent faculty, and the tenure lines have largely been boarded up and forgotten. This is the new normal.

These systemic changes are not unique to higher education, I should add. I’ve recently been reading Sarah Kendzior’s The View From Flyover Country, where she talks at length about the seismic changes to the American economy after 2008 as companies looked for ways to remain profitable for stockholders. Universities are a little bit different because many schools are among the institutions most affected by government divestment, but there are many broad similarities.

Nevertheless, I am not in favor of a widespread slashing of graduate programs.

First, reducing the number of PhDs is not going to solve the labor crisis. There is already a long line of qualified candidates. In 2012, two schools, Harvard University and the University of Colorado, received backlash after stating in job ads that candidates more than a few years past graduation need not apply. Moreover, cutting positions in graduate programs does nothing to address the structural factors underlying the decline of tenured positions. In fact, cuts to graduate programs could conceivably accelerate the cuts to full-time positions, because graduate programs are one of the justifications for keeping tenured faculty.

Second, the remaining graduate programs would invariably exist in a handful of elite schools, which already produce most of the graduates who win the tenure-track job lottery. The list of elite schools is not immutable, but it tends to favor those that already have large endowments. As is true elsewhere in American society, fluctuations in financial fortune tend to be much larger for schools without these inheritances.

In theory, limiting graduate education to wealthy schools would create a more ethical environment in terms of pay for graduate students, as well as provide them adequate research support, but it would also develop scholars and teachers in an environment radically different from the one where most professors work, not to mention the one most of their students will be coming from. As with my comments about adjuncts above, this is not meant to denigrate people who go through elite institutions, many of whom are deeply concerned with issues of precarity and austerity, and many of whom do not come from privileged backgrounds. At the same time, reducing spots reduces the opportunity for people who have not already been introduced to academic life, either during their undergraduate education or through individual mentorship, usually by someone with connections to those schools. Similarly, for as much scholarship as comes out of people working in top-tier programs, they cannot cover everything. As in any number of fields, visibility and representation matter. A retreat toward the proverbial ivory tower reinforces the perception of a barrier between the intellectual elite and everyone else.

There are deep ethical issues with how graduate programs in the humanities approach training, regardless of what the future of the professoriate looks like. There needs to be greater acknowledgement of, and preparation for, so-called alt-ac jobs, and a support system in place to help people find employment with livable wages. That is, there needs to be a reconsideration of the purpose of graduate school, with teaching in college being just one potential outcome.

(To be fair, this is easier said than done and I see programs coming to grips with this reality and beginning to implement changes, but too little and too slowly, and without enough action to counteract the emotional trauma of the current system.)

But there is also a larger point. People pursue advanced degrees for all sorts of reasons, including interest. This is a good thing. I may sound impossibly, naively idealistic, but I want to live in a society that supports and values education not out of a desire for credentialism but because these opportunities are where creative innovation is born. Eliminating graduate programs beyond those in well-funded schools makes sense if you look at the problems facing higher education as a simple supply-and-demand numbers game, but in fact threatens to realize some of the worst stereotypes about academia.

The Bagel, Maria Balinska

Sometimes when Amazon reviewers give low marks to a book, the comments indicate that the book is not good. Sometimes the comments reveal that the Person Angry on the Internet didn’t actually read the same book that the author wrote. Sometimes the reader understood the book but is just angry that it isn’t the book he or she wanted. The last scenario holds for Maria Balinska’s The Bagel, which one reviewer lamented was principally a history of Jewish labor rather than a history of the eating of bagels. This is a valid observation, though Balinska does her best to lay out what evidence there is for how bagels were consumed, too.

Balinska starts with an overview of what she considers to be related breads from China to Italy, all wheat breads (distinct from rye, barley, oat, etc.) made into dense loaves that go stale quickly, are usually eaten by dipping in tea or other hot liquids, and are baked into rings. One of the closest relatives of the bagel is the pretzel, with its three holes taking on religious significance. Balinska traces the bagel to medieval Poland, where it emerged from obwarzanek, a Polish wheat ring bread that was a luxury in a region that mostly produced rye flour, but one eaten as a Sunday food because it was associated with purity. The bagel separated from the Christian version by being boiled, after the Polish monarchy issued restrictions against Jewish bakers making obwarzanek.

The story crosses the Atlantic in the 1880s with the waves of Jewish immigrants and is wrapped up in the labor politics, food safety standards, and anti-immigrant sentiments of the subsequent decades. Despite the complaint lodged in the Amazon review, this was the most interesting and strongest part of the book, and one that I want to use should I ever find myself teaching the second half of US history. The stories about the conditions in these bakeries make me thankful for food safety standards, and the labor upheavals mirror those in better-known industries. The 1905 Supreme Court case Lochner v. New York, which ruled that the government could not limit the hours people worked, was brought by a bakery. Balinska argues that at the height of the New York bagel bakers’ union, it was the shape and density of the dough, which defied mechanization, that gave the union its power.

Balinska concludes the story by recounting how mechanization and big business in the form of Lender’s Bagels led to the Jewish bread conquering the United States. Freezing made bagels last longer (fresh bagels had earlier had a tendency to go stale in a matter of hours), and they became a readily available convenience food for homes and hotels alike.

The Bagel is an engaging read, though Balinska’s specific narrative, how a special Jewish food in Poland became ubiquitous in America, gives short shrift to the story of bagels in Montreal and is somewhat reductive in order to trace that arc. For instance, the bagel traditions in Florida and Buffalo, and the New York operations run by organized crime, are accounted for only in terms of the challenge they presented to the proliferation of New York-style bagels. Being more comprehensive would be impossible in a book so short, but what does appear hints at a larger, richer, and more complex story out there. The Bagel was published in 2008, and I was left wondering whether, as with other consumable products, there is an addendum to the big-business, moderate-quality climax: one where a decentralized, artisanal bagel movement has emerged.