More than a numbers game

There is a labor crisis in higher education.

The myth of the well-compensated, insulated, and out-of-touch professor has a powerful grip on the American imagination, but in fact it applies only to a few people, even among those lucky enough to have a tenured position. (The real money, comparatively speaking, is in administration, unless you happen to be a coach.) Most professors, including those on the tenure track, are not well paid, particularly relative to their level of education. Setting aside that separate, albeit related, issue, the larger crisis is that courses are increasingly taught by adjunct professors with too little pay, no benefits, and no job security.

This is not new. The old line was that you should inquire how much of the teaching at a school is done by graduate instructors, and adjuncts are the latest iteration of the same forces that cause schools to fill credit hours with cheap labor.

In the sense that many, though not all, schools have a bipolar mission of teaching on one side and world-leading research from their (full-time) faculty on the other, this split makes sense. As much as research influences teaching and vice versa, both take time to do well. In the humanities, too, research generally doesn’t make money, but it remains a benchmark for the university on various external rankings, which, in turn, is part of the pitch to bring in students. The solution is generally to bring in cheap labor to fulfill the teaching mandate, thereby creating a surplus that can be paid to the full-time faculty in the form of salary and research support, including travel and reduced teaching loads. Simple.

Only not so much. With state divestment from higher education, the financial burden for operating a university is frequently passed on to the students, now branded as consumers, in the form of tuition, fees, and, eventually, solicitations for donations as alumni while they are still paying off loans for the initial investment. At the same time, significant teaching loads are passed to underpaid and overworked contingent faculty. This is not to say that contingent faculty are bad teachers (many are excellent), but that while the cost to the student goes up, the combination of financial precarity and insufficient resources impedes the ability of many of their teachers to help them reach their potential. Something like 75% of all faculty teaching in colleges now hold non-tenure-track positions, working under a range of titles and for a median pay of $2,700 per course.

These economic issues are fundamentally important to the future of higher education, a top-heavy system that many days feels like it is teetering on the brink. It is a matter of when, not if, something is going to give.

But that is not what prompted this post.

In response to a recent report on the issues surrounding contingent labor and the finding that 79% of anthropology PhDs do not gain employment in tenure-track positions, I saw the inevitable response that the solution to this problem is to reduce the production of PhDs. The idea is that this is a crisis created by supply far outstripping demand, which is true enough, but it doesn’t acknowledge the underlying structures that are shaping demand.

The optimistic, if morbid, line even when I started graduate school in 2009 was that it was just a matter of waiting for the rapidly aging generations of professors to give up their posts one way or another. Not that the job market would be easy, but that there would be a wave of jobs that would make it easier. Before long it became apparent that the Great Recession of 2008, which struck right as I was graduating from college, marked an inflection point for higher education. Many of those older faculty members were clinging to their jobs not out of malice, selfishness, or obliviousness, but because they believed that their positions would not be replaced when they left. They were right. Their courses are now taught by contingent faculty, and the tenure lines have largely been boarded up and forgotten. This is the new normal.

These systemic changes are not unique to higher education, I should add. I’ve recently been reading Sarah Kendzior’s A View From Flyover Country, where she talks at length about the seismic changes to the American economy after 2008 as companies looked for ways to remain profitable for stockholders. Universities are a little bit different because many schools are among the institutions most affected by government divestment, but there are many broad similarities.

Nevertheless, I am not in favor of a widespread slashing of graduate programs.

First, reducing the number of PhDs is not going to solve the labor crisis. There is already a long line of qualified candidates. In 2012, two schools, Harvard University and the University of Colorado, received backlash after stating in job ads that candidates more than a few years past graduation need not apply. Moreover, cutting positions in graduate programs does nothing to address the structural factors underlying the decline of tenured positions. In fact, cuts to graduate programs could conceivably accelerate the cuts to full-time positions, because graduate programs are one of the justifications for keeping tenured faculty.

Second, the remaining graduate programs would invariably exist in a handful of elite schools, which already produce most of the graduates who win the tenure-track job lottery. This list of elite schools is not immutable, but it tends to favor those that already have large endowments. As is true elsewhere in American society, fluctuations in financial fortune tend to be much larger for schools without these inheritances.

In theory, limiting graduate education to wealthy schools would create a more ethical environment in terms of pay for graduate students, as well as provide them adequate research support, but it would also develop scholars and teachers in an environment radically different from the one where most professors work, not to mention the one their students will come from. As with my comments about adjuncts above, this is not meant to denigrate people who go through elite institutions, many of whom are deeply concerned with issues of precarity and austerity, and many of whom do not come from privileged backgrounds. At the same time, reducing spots reduces the opportunity for people who are not already introduced to academic life, either during their undergraduate education or through individual mentorship, usually by someone with connections to those schools. Similarly, for as much scholarship as comes out of people working in top-tier programs, they cannot cover everything. As in any number of fields, visibility and representation matter. A retreat toward the proverbial ivory tower reinforces the perception of a barrier between the intellectual elite and everyone else.

There are deep ethical issues with how graduate programs in the humanities approach training, regardless of what the future of the professoriate looks like. There needs to be greater acknowledgement of and preparation for so-called alt-ac jobs, and a support system in place to help people find employment with livable wages. That is, there needs to be a reconsideration of the purpose of graduate school, with teaching in college being just one potential outcome.

(To be fair, this is easier said than done and I see programs coming to grips with this reality and beginning to implement changes, but too little and too slowly, and without enough action to counteract the emotional trauma of the current system.)

But there is also a larger point. People pursue advanced degrees for all sorts of reasons, including interest. This is a good thing. I may sound impossibly, naively idealistic, but I want to live in a society that supports and values education not out of a desire for credentialism but because these opportunities are where creative innovation is born. Eliminating graduate programs beyond those in well-funded schools makes sense if you look at the problems facing higher education as a simple supply-and-demand numbers game, but it in fact threatens to realize some of the worst stereotypes about academia.

First day fragments

My fall semester begins in earnest today, with the first session for both of my classes. I don’t have a single back-to-school post idea, but rather a bunch of loosely connected ones, so I decided to go with a fragmentary format.

“I didn’t get everything done” is a standard lament for academics come late August, bemoaning some combination of the cult of productivity, human limitations, and the difficulties of researching during the school year. I am no exception. I set an ambitious schedule for reading scholarship beyond my immediate research, but only managed to read a handful of books and articles, and a couple of books on teaching.

There are a couple of explanations for this failure. One is that the summer quickly became very busy, with multiple family trips that had less down-time than anticipated, meaning that there was neither opportunity for reading nor a chance to deeply recharge my batteries. Another is that I taught an intensive summer World History course in June, so much of my spare reading went toward preparing for class. A third is that seemingly every spare moment around these time commitments was sucked up by revising my dissertation as a book. My goal was to have it under review by the start of class, but I missed that deadline, too. At least I am in a position to meet my revised goal of August 31 for that one…

ΔΔΔ

There has been a movement in recent years to normalize failure, particularly in academia, leading people to share their failures on Twitter over the last week. I mentioned there that I respect the movement, and appreciate the baseball analogy where, if you’re a batter who only “fails” (makes an out) at the plate six out of every ten times, you belong in the hall of fame. (There are obviously other statistics from baseball that could make that more or less extreme. If you’re a pitcher and batters swing and miss just 20% of the time, you’re incredible, but if that is the percentage of the time you throw strikes, then you probably quit playing in Little League.) I respect the impulse to normalize failure because it is inevitably going to happen, regardless of how generous and kind the academy becomes. Everyone is going to experience article/grant/abstract/job/proposal rejections for a host of reasons. Sometimes those reasons are good (the project needs more work), sometimes they are petty, and a lot of the time it is a simple numbers game that has almost nothing to do with what was proposed.

My shadow CV includes all of these: four article rejections, two more revise-and-resubmits that were later accepted, at least seven rejected paper abstracts that I can think of offhand, and too many funding applications for fellowships and travel grants to count. And I am only a little more than a year removed from graduating with my PhD.

At the same time, I found the push to normalize, share, and celebrate failure on social media hard to handle. The main reason is that while failure is normal in the academy, and rejections can be handled deftly with an eye toward improving the project for the next time around, it is also a sign of privilege to be able to reflect on a shadow CV. It comes from someone still “in the game,” as it were, and with every round of shares I heard “this is what you *should* have been applying for.” As in, your failures themselves are inadequate because the “stars” fail bigger and better.

Then pair this with the part I left out of my shadow CV: all the jobs I’ve applied to without making the long list. The shadow CV is meant to normalize failure so that people can better overcome their natural fear of it and thereby reduce anxiety, but when mixed with too few academic jobs to go around and the sheer amount of time that applying for them takes, it just exacerbated mine.

ΔΔΔ

I’m looking forward to teaching both of my classes this semester. One I am teaching from my own syllabus for the second time; the other I am teaching as the sole instructor for the first time. I had the chance to teach on my own a little during graduate school, but this is my second year of continuously teaching my own courses and reading up on pedagogy, so I am now trying to synthesize some principles for my classroom.

First Principle: Learning, not grades. I do not care about grades beyond making sure that I have created a reasonable and achievable grade scale for the class. My goal as a teacher is to help students develop practical skills such as writing and the ability to understand the world through critical analysis and synthesizing information. Toward that end, I believe that many common assessment tools that are built for scale are next to useless in actually assessing learning. I design my classes around assignments that require students to develop arguments through writing and that build on each other so that students can show improvement in tasks that are not easy.

Second Principle: Empathy. Students are adults who have more demands on them than even I did when entering school fifteen years ago. I aspire to treat them like adults with responsibilities, just one of which is my class. College is “the real world,” where students are on their own for the first time, and I want to be a mentor/coach/guide. This means having empathy, and encouraging them to take ownership of their education by talking with me when they have a conflict or need help.

Third Principle: Engagement. “Meaningful learning experiences” is a hot topic, though my mother assures me that this has been the key phrase for many decades now. Every class is going to be selective in the material it covers, so I see my job as giving students the tools to learn more and piquing their curiosity to want to do so. This means developing activities and assignments that require engagement, through games, debates, and projects where students take ownership of the material. This has not come easily to me, as someone who found history books thrilling in high school, but it is something that I am committed to improving in my own teaching.

There are others, but these are my first three.

ΔΔΔ

Without further ado, let the semester begin!

Small Teaching – James Lang

Small Teaching is another book that people recommended to me earlier this year when I was looking for resources on how to improve my teaching. Previously I read Jay Howard’s Discussion in the College Classroom and Mark Carnes’ Minds on Fire.

Let me start by airing a beef with James Lang. Small Teaching derives its name from the baseball philosophy of “small ball,” which says that you don’t need to hit a lot of home runs to win games if you take small actions (singles, not striking out, good base-running) that manufacture runs. These are the fundamentals every coach preaches to a youth team that doesn’t have the same raw strength as its opponents, and Small Teaching opens with the story of how the Kansas City Royals recently had a two-year run of success by employing small ball.

The Royals make for a good story, and the team and national media certainly gave credit to small ball, but Lang’s version of the narrative underplays how much of the Royals’ success either predicted the direction baseball would go (a lights-out bullpen) or zigged while other teams zagged (they struck out far less often than any other team in the league in both years). In other words: small ball helped, but it didn’t tell the whole story.

In fact, this is an apt metaphor for Small Teaching.

Small Teaching is a book born from Lang’s years of giving pedagogy workshops, with the stated purpose of providing brief classroom activities, one-time interventions, and small modifications to course design that a) require minimal preparation or grading and b) improve the classroom experience. Lang’s intent is to make the book simultaneously worthy of reading in full and of keeping around as a reference work.

Spread across three sections, eight of the nine chapters are organized in the same basic structure. First Lang provides the theoretical and scientific bases for the chapter; then he offers models from his own classroom experiences and those of others; finally he concludes with the general principles that synthesize the theories and models.

There are a lot of good ideas in Small Teaching, including studies that confirm what I’ve observed in the classroom (e.g., the inefficiencies in a lot of assessment methods that are disconnected from both course goals and previous assignments) and techniques I’ve employed over years of tutoring but hadn’t considered bringing to the classroom (the value of predicting and self-explaining for getting students to the “a-ha” moment). I was particularly taken by the first chapter on retrieving, which argues that while long-term memory is effectively unlimited, it is the ability to retrieve that information that improves with practice, and by the chapters on motivating and growing (7 and 8), which focus on treating students as human beings who need to be stimulated and encouraged. The research Lang cites in these sections suggests that some of these issues are out of the professor’s hands, but there are still compelling reasons not to compound the problems.

I learned something in every chapter, whether about the science of learning (which is in the subtitle) or a new idea, and I frequently found myself jotting down the quick tips for later reference. Lang says that he is all for big changes like those Carnes proposed in Minds on Fire, but he is more interested in easy but practical solutions. As with small ball, the idea here is to maximize the resources at one’s disposal rather than calling for radical change. It is in this vein that chapter 9 (Expanding) breaks the mold by offering ways to transcend small changes and listing additional resources, suggesting that people commit to reading one new pedagogy book per year and one article per week from one of the suggested sites. Overall, the combination of practical recommendations with evidence from studies that demonstrate why these suggestions are beneficial made it a compelling read.

In sum: the greatest sign of this book’s success is the disconnect between what I thought while reading it and my notes. While I was reading Small Teaching the suggestions seemed profound; looking over my notes, I found myself wondering why I hadn’t thought of these things earlier. Small Teaching is not a straightforward “how-to” book, but it was an immensely useful book to think with now that I am starting to put together my course schedules for the fall semester.

Minds on Fire – Mark C. Carnes

Earlier this year I crowd-sourced a list of teaching materials. Now that the fall semester is imminent, I am finally getting a chance to sit down with the list again in order to prepare for my courses.

The subtitle of Minds on Fire is its mission statement: “how role-immersion games transform college.” The book itself is a manifesto for Reacting to the Past, serving to defend and justify the games developed by the consortium.

Carnes’ core contention in Minds on Fire, and the underlying principle behind Reacting to the Past, is that students are engaged in “subversive world[s] of play” that range from video games to Zombies v. Humans to fraternity events. On the other end of the spectrum, “all classes are kind of boring.” The solution, Carnes argues, is to harness the subversive worlds of play toward academic ends; that is, give students competitions and games that tap into their natural inclination for this subversive behavior and get them to do more work without thinking about it as work. Teachers facilitate the games, but then step back and empower the students to take the reins.

After setting out these principles, Carnes dedicates much of the book to laying out the advantages and countering the criticisms of using games in the classroom. There are chapters on how Reacting games teach morality and leadership and spontaneously produce community, things which are often touted as the purpose of a humanistic education or baked into college mission statements. Another section rejects the positivist contention that the past is a fixed stream and, with it, the idea that opening the possibility of changing the past undermines history education. In each instance, the philosophical and pedagogical ideas are buttressed by excerpts from interviews with students who went through Reacting courses.

Minds on Fire is a convincing read, though I should say that I went in predisposed to agree, as someone who has always balanced a fascination with history books with hours of subversive play. Carnes acknowledges, but also skims past, that some courses are not going to be suitable for Reacting games and that not every Reacting exercise will be a raucous success. Nor is there much acknowledgement that Reacting is a radical proposal that seeks to achieve a fairly standard aim: significant learning experiences. Reacting classes, by not seeming like school work, give students ownership over their education and “trick” them into having experiences that cannot be faked or cheated.

There are other means to this same end, but there are also numerous classes where Reacting is a particularly effective way to grapple with issues, and I think it is no coincidence that some of the success stories came from freshman seminars or “great ideas” sorts of classes. I also think that long-running games could be particularly successful in discussion sections as a complement to lectures.

In sum: there were times when this book was too much of a manifesto, but while not every course needs to be a Reacting game, every course can take lessons from Minds on Fire.

AP World History (Ancient)

The College Board received pushback a couple of weeks ago when it announced changes to the AP World History curriculum, making the course begin in 1450. Critics online gnashed their teeth about a number of things, raising the legitimate concerns that this would further marginalize the pre-modern world and that the chosen date, insofar as it meant anything, would default to a Euro-centric world view. The board responded this week by announcing that the new date is 1200, not 1450. Critics gnashed their teeth again, albeit also in befuddlement at the seemingly arbitrary date.

(For what it is worth, the College Board’s stated explanation for the date, that it will allow “a study of the civilizations in Africa, the Americas, and Asia,” does check out. Now Genghis Khan, Mansa Musa, and the rise of the Aztecs all fall within the range. 1200 starts the course in medias res, but that was inevitable once you put a start date on a course.)

Reading the College Board’s announcement about the changes, I am of two minds. First, I am sympathetic when they say, based on feedback from teachers, that the current model is unsustainable because they are trying to do too much:

The current AP World History course and exam attempt to cover 10,000 years of human history—from the Paleolithic Era to the present. In contrast, colleges manage the unique breadth of world history by spreading the content across multiple courses.

The announcement is a little misleading when it says what college courses do and do not do (I did take a beginning-of-time-to-1960 World History course in college), but, from the other side of the table, these broad-strokes courses are incredibly hard to teach. Even “just” teaching world history or Western Civilization before 1500 means covering a laughably enormous swathe of time, which is also the reason I have no pity for the US history professors who complain that there needs to be another mitotic division of their survey sequence, taking it from two courses to three. It has, however, been my position for a while that if I were made Grand Poobah of History Curriculum, I would invert the current paradigm and only teach the survey courses after students had been exposed to historical methods in specialized courses. The idea is that by going from the specific to the general rather than the reverse, students are better prepared to appreciate historiographical arguments and big themes.

The problem with this approach is that it is not scalable as part of a standardized test scheme meant to grant college credit. The College Board finds itself in quite a bind. It needs a single survey course to stand in for a college course, because fragmenting the courses would neuter the test’s ability to serve as a standard and therefore undermine its credibility. At the same time, the sheer scope of the course makes it difficult to teach the material in a way that prepares students to succeed at the work asked of them on the test. I don’t love this incentive structure with regard to student testing, but given the incentive structure in college courses I at least understand it. Moreover, looking at old sample questions, the test itself isn’t bad in terms of the skills it is designed to measure. But it is also a lot. Obviously something needs to change.

This does not, however, mean that I agree with the changes the College Board made. My main issue is the decision to prioritize modernity, particularly because AP European History, which starts in 1450, has already done the same thing. At this point I could start declaiming, ridiculing the absurdity of teaching a world history course that takes for granted the miracle that is agriculture, that conveniently forgets that Rome laid the groundwork for modern Europe, or that doesn’t bother trying to understand the founding and development of *any* of the world’s major religions. Yes, there are a couple of things that happened after 1200 (or 1450) that are important, but these are all built on precedents and developments that came before.

The College Board has indicated that it is open to creating a second AP World History (Ancient) course, which, despite the awkwardness of the name (congratulations, Richard I of England, you’re an ancient king!), is a fine ambition. But here is the thing: I am skeptical of how quickly yet another AP course can be developed and instituted, let alone how widely it would be picked up. Things have changed since I was in high school, but when I was coming up I didn’t even have access to one AP World History course, let alone two. I got my start with classes on all things ancient through my Latin teacher. Now I am fortunate enough to teach ancient history to college students and am consistently impressed by how many students from all sorts of disciplines come out to take my classes.

Maybe I am wrong and this interest will prompt dedicated high school teachers to make the second course come to fruition, but in the meantime I cannot help but think of this as a missed opportunity on the part of the College Board. There had to be a change to the AP World History course, but instead of even temporarily erasing antiquity, it should have kept the earlier portions, perhaps as an AP World History (Foundations), and developed (Modern) as the secondary offering.

Discussion in the College Classroom – Jay Howard

A couple of weeks ago I crowd-sourced a reading list on teaching, with the aim of getting better at my job. As much as I trust the people who contributed to the list, it wouldn’t be worth much if I didn’t then start reading, so I have decided to write up some of my notes and observations, posting them here and on Twitter.

First up is Jay Howard’s Discussion in the College Classroom.

The short recap is that I found this book useful.


Howard starts by making the case for the value of discussion in the classroom, with the caveat that not all conversation is created equal and that the job of the instructor is to lead students past superficial observation toward deeper meaning. His advice falls into two interconnected categories: best practices for communication in the classroom and structuring courses to encourage and reward active participation.

Both categories are designed to overcome the prevailing social norm in the college classroom, “Civil Attention”—defined as the appearance of attention regardless of how tuned in the student actually is—a norm that is reinforced by over-reliance on lecture and a reluctance to ask direct questions (which Howard notes may be mistaken for hostility by the professor).

In order to change these norms, Howard calls for instructors to start on the first day of class by communicating an expectation of participation and what that participation entails. The latter will vary by class, but it is important to convey what counts, in order to avoid a misunderstanding between a professor who wants students to talk all the time and students who believe they “participated” by doing the reading and showing up.

Howard addresses a number of issues, from how to avoid the trap where one or two students take on all the responsibility for participation, to grading discussion, to how to run an online discussion board, but some general principles stand out:

  • Large class size inhibits conversation, so it is often useful to subdivide a class into groups of six or eight, even in large lectures, and encourage students to exchange information and ideas.
  • It is easy to forget that students are not subject matter experts who have been thinking about these issues for years. Give students time to formulate answers to difficult questions.
  • Ask good questions. Avoid factual questions or questions with yes-or-no answers; instead, ask opinion questions that can be supported from the text.
  • Positively reinforce the behavior you want to see by acknowledging student contributions, questions, and risks.
  • Give students peer-to-peer obligations that prepare them to engage in discussion.
  • Engage with students before and/or beyond the classroom, such as requiring a two-minute visit to office hours to say hi. This gets students comfortable engaging with the instructor.
  • Above all: be aware of what is going on in the class. This includes body language and what the syllabus says, the physical distance between instructor and students, and whether the course structure is facilitating student participation or erecting barriers to it.

Howard’s advice is based on a combination of extensive personal experience and research studies on student participation, but he is careful to note that not only will these suggestions not be a one-size-fits-all solution, but also that what works with one set of students won’t necessarily work with a different set of students the next time the same course is offered, let alone with a different instructor. Nor does he dismiss the utility of a content-based lecture format, all the while offering ways to blend the two formats to maximize student engagement.

There are too many specific suggestions even to begin listing them, but they make this book worth reading. There may be a point of diminishing returns in reading books on pedagogy (unless that is your field of study specifically), but Discussion in the College Classroom is a useful place to start.

Pedagogy in the Humanities – a reading list

On the list of things I don’t really have time for, but want to do anyway, is spending more time reading about the mechanics and craft of teaching. I am particularly interested in issues of course development and planning, active learning, student engagement, and assessment. I sent out a tweet asking for book suggestions, and in the first couple of hours after it was posted more people boosted the signal through retweets than suggested bibliography, though suggestions did begin to trickle in.

It has been about twenty-four hours since I sent out that request; here is my reading list so far:

  • Ken Bain, What The Best College Teachers Do (Harvard 2004)
  • Peter Brown et al., Make It Stick (Harvard 2014)
  • James M. Lang, Small Teaching (Jossey-Bass 2016)
  • Mark C. Carnes, Minds on Fire (Harvard 2014)
  • Jay Howard, Discussion in the College Classroom (Jossey-Bass 2015)
  • L. Dee Fink, Creating Significant Learning Experiences (Jossey-Bass 2013)
  • Susan Ambrose, How Learning Works (Jossey-Bass 2010)
  • bell hooks, Teaching to Transgress (Routledge 1994)
  • Nancy Sorkin Rabinowitz and Fiona McHardy (eds.), From Abortion to Pederasty (OSU UP 2015)
  • John Gruber-Miller (ed.), When Dead Tongues Speak (Oxford 2006)
  • Jay Dolmage, “Universal Design: Places to Start,” Disability Studies Quarterly 35 (2015)
  • BU Proseminar in Classical Pedagogy, resources curated by Dr. Hannah Čulík-Baird

This list will be updated. Additional suggestions are welcome in the comments.

College and Industry

LMS tech support, freelance construction contractor, camp counselor, grocery store cashier/stocker, quick service restaurant manager, QSR assistant manager, history/classics/political science tutor, adjunct instructor, teaching assistant, research assistant/editorial work, furniture mover, visiting assistant professor.

I think that is every job I’ve held since I was 18. Going back further, I could add data entry, housekeeping at a resort, and some other odds and ends. This is something that some people on academic Twitter have been posting in response to this Times Higher Education opinion piece. In short, the author declares that “Too many academics have spent most, if not all, their professional lives within universities,” and therefore:

  1. All potential professors should be required to undergo a year-long internship before they begin teaching.
  2. All academics should be required to return to work in industry every three to five years as part of their professional development and career advancement.

My Twitter feed was abuzz with outrage at this article, I think for good reason. Scholars in the humanities responded by pointing to their work experience and then, in so many words, asking in what industry the author proposed they take their rotations. That said, I wanted to unpack some assumptions about higher education, because I also don’t disagree with the top-level idea: that it is necessary to find ways to support and improve college education.

First, there is a set of assumptions in contemporary discourse about college, if not in the article explicitly, that being a professor is not “real work,” assumptions which encompass several broad categories that all come back to the cult of amateurism surrounding college. I am obviously poaching my core idea here from the debate over whether college athletes ought to receive greater compensation for their labor, but this cult extends beyond NCAA rules about amateurism. There is a perpetual cycle of hand-wringing about how college students are spoiled and insulated from the “real world” that they will face after graduation, whether in the service of lamenting “kids these days” or the failures of higher education. And if college is not the “real world” for students who are set adrift in their “Odyssey Years” (as David Brooks called them in 2007), then it cannot be the “real world” for their professors, either.

About those professors. There is a persistent myth of overpaid and unfireable professors who are detached from the goings-on of that mythical real world. Compounding this problem is that many, if not most, people with advanced degrees have made sacrifices for their field by spending years on meager stipends in graduate school; a common explanation is that their research amounts to a passion project. Even glossing over the fact that most professors, myself included, are contingent employees with limited benefits, most tenured professors are not overpaid, either for their level of education or for their time. Professors are expected to be experts in their field, prepare, teach, and grade for classes, mentor students, perform world-class research in their field, develop outreach programs, and serve on institutional and professional committees, just as a baseline.

And yet there is also a bias underlying this op-ed, namely that there is a distinction between “doers” and “teachers.” In the 2000 film Finding Forrester (featuring Busta Rhymes), a gifted young writer (Rob Brown) is persecuted by his teacher (F. Murray Abraham) and accused of plagiarizing the work of William Forrester (Sean Connery), until it is revealed that the teacher is the bigger failure as a writer. The argument, then, is that teachers are people who couldn’t hack it in their particular field. (The film makes no concession to the fact that most authors have a day job that may or may not involve writing.)

The author doesn’t go so far as to call professors failures, but she strongly suggests that there is industry on the one hand and higher education on the other. In this telling, “professor” should not be a career but a position to be cycled through, because staying in it leaves professors out of the loop. This model might be viable for some positions in fields that rely on industry connections, but, at the same time, universities and colleges often already work in tandem with industry in those fields, with the schools providing cheap labor and resources. Where the model doesn’t work at all is in the humanities, where so much of the research is performed by scholars in higher education. In these cases, mandatory years off would not only fail to improve students’ education, but would actively hurt it.

Higher education is an industry. It employs all sorts of people from maintenance staff to food service professionals to fundraisers and secretaries, but there are two groups without which it cannot exist: students and professors. Work as a professor is not manual labor and has its own schedule, but it is a form of modern white-collar employment.

Of course, the valorization of “real work” cuts both ways. There are plenty of examples of academics who simultaneously look down upon and feel nostalgia for labor that they would never do.

While we’re here, many students are employed, either by the university or in the vicinity, and juggle those responsibilities alongside their coursework and professional development opportunities. College has its own set of rules and expectations, but thinking about it as something other than “the real world” is a lazy trope long past its expiration date.

Finally, a word about the point of education. The author concedes that “higher education is not all about career advancement,” but her basic thrust is nevertheless that disrupting the status quo for professors is the only way to ensure students “find their professional niche, alongside the robots.” The humanities and a liberal arts education that teaches citizenship are given barely a sentence in the conclusion, without any recognition that these are the disciplines that teach the sorts of analytical thinking and communication skills that perhaps correlate best with coexisting with an increasingly automated economy.

By all means, increase resources and opportunities for pedagogical training alongside research support, and find ways to ensure professors stay abreast of the latest developments in their fields. As for the internship, there are already years of graduate school, so finding a way to work more pedagogical training into the curriculum ought to be doable. We should not excuse those professors who are oblivious to the difficulties facing students, but the rest of this proposal is a one-size-fits-all solution that frames the virtues of the liberal arts as incidental, and therein lies the bigger problem.

Preparing for class and my undergraduate experience

The process of preparing for class has me trying to remember my own undergraduate courses. In terms of specifics, not much comes back. Obviously I absorbed a good deal of content that I am now able to speak about with varying levels of confidence, but much less stands out about the actual classes.

Take, for instance, the equivalent of the course that I am now teaching—a survey of Greek history. I remember my professor’s opening spiel about the etymology of history and how it comes from a root that has to do with judgement, I remember bantering with a friend of mine who also went on to get a PhD in ancient history, and I remember one of the other students making a diorama from wax sculptures after taking the wax from individually wrapped cheese “cuties.” And some of those memories could easily be from other classes with this professor.

Most of all, though, I remember loving the class (and other classes like it) because the professor gave us room to explore long sections of ancient sources, even to the extent of seeming disconnected and disorganized. In fact, I remember having an argument with a fellow student in a class in another department altogether because this student hated the disorganization, feeling that it meant that she wasn’t learning anything. I vehemently disagreed at the time, which was something of a running theme in a course that had us working in a group for most of the semester. Believe it or not, we actually worked pretty well as a team.

Before laying blame on the professor, though, reflection shows that this limitation of my memory holds true even in courses with amazing lecturers. For instance, I have clearer memories of my favorite college lecturer declaring that blue exam booklets were the ideal form for writing lectures in, of the fact that the Anatolian peninsula is, north to south, the international measurement unit “one Kansas,” and of his apologies for the boring but necessary excursuses on medieval agriculture. Or that in the last week of class he never failed to take a photograph with a disposable camera and that I invariably left class every day with an aching hand. That pain and some later sweat ensure that I can go back to my notes if necessary, but, once again, I don’t remember much at any given moment.

I could go on, but there is one particular exception: language classes. The memories are almost certainly just as flawed, but I remember the act of being there, the feel and the look of the book chapters, and all of the things Homer taught to his brother. More to the point, my memories of language courses are clearer regardless of whether I liked or disliked the teaching styles of the professors. I don’t know why, exactly. Maybe I found languages more difficult, so the classes left a deeper impression, or maybe the way that I learned the languages was tied to the classroom in a way that history never was. Either way, the division in my memories is real.

Obviously I learned facts from these courses that, ten years later, have been baked into the collection of knowledge tucked into the dusty corners of my mind or else that I have forgotten. I also learned note-taking skills, research habits, a critical eye for source criticism, and something of writing. (Less by way of common sense, however, even if one of the professors mentioned above did try to warn me off of graduate school.)

I think about all of this when I am preparing for my own class. My class is just too large to toss the textbook in favor of embracing the glorious confusion of reading sources together, and I feel some responsibility to cover a certain number of topics in a survey of Greek history. I tend, therefore, to err on the side of structured lectures with a PowerPoint presentation, modeled on the US history survey courses that form a large portion of the teaching styles I have seen in recent years. There is only so much that can be covered, so, in this sense, I look to give students a taste along with some tools to learn more.

At the same time, though, I think back with great fondness to being encouraged to engage in forms of source analysis and informal, seminar-style debate. Unstructured though those classes may have been, they also reflected active learning at its finest. As much as this form of class worked for me, ironically, it often takes a leap of faith for me to try it from the other side of the table (so to speak). I will probably never abandon lectures altogether in a class like this, where there are details that I hope will encourage students to go out and learn more, but at the same time I am always looking for new activities where the students can grapple with the primary material together or on their own because, more than the lectures, that is often what I remember being most useful from my undergraduate experience. This experience didn’t do me any favors in terms of downloading and debating historiography for graduate school, but in the more universal tasks of evaluating how a source presents the world and challenging its prevailing biases, it was absolutely essential.

(Re)visions and Assignments

Every student paper should be revised. More than once. In an ideal world, that is; in the real world there are problems of scale and deadlines.

Periodically I receive a request from a student to revise a paper in return for extra credit. In the past, when teaching surveys of American history with up to a hundred students at a time, I felt obliged to reject these requests. I would love for students to revise their papers, but extra credit is not something I can extend to just one student in good conscience, and there isn’t enough time in the semester to let every student do this unless it is built into the course. On the one hand, I feel bad about rejecting some of these requests, since I am acutely aware of the challenges facing the current generation of college students; on the other hand, the requests are framed in terms of getting a higher grade, not in terms of education.

This disparity comes in part from the nature of these assignments. I suspect that nobody has looked at a survey-level essay on the changing conceptions of race in America from 1865 to 1925 as an opportunity to write a brilliant and incisive critique of race in America. Even if the author has a fiery passion for the topic, the prompt and supporting materials don’t lend themselves to it. The disparity also speaks volumes about how courses like this one are treated. They are a grade, not an opportunity to learn about American history or learn practical skills such as writing or rhetoric.

Returning to the nature of the assignments: one-off submissions that are returned marked up and assigned a grade lend themselves to thinking about the assignment in terms of the grade instead of in terms of process. I understand the counterargument that history classes are for teaching history and not for teaching writing, particularly in these large survey courses. And yet, history is fundamentally discursive.

This fashioning of history, along with how we remember history, is going to be a point of emphasis this fall when I teach a survey of archaic and classical Greek history. I am going to do this not only because of the recent and not-so-recent appropriations of antiquity for political agendas, but also because I hope that pushing people to think about these issues in a Greek context will make it possible to think about them in our contemporary context.

I am also planning some opportunities for my students to revise their work, made possible in large part by a smaller class size. As of right now, the idea is to give students the option to revise at least one of their assignments for a higher grade, as well as to have that type of assignment recur later in the semester in order to maximize the benefit for the student. The plan is for revisions to take place in two phases: first, students come meet with me to discuss the assignment; then they make revisions based on both the written comments and our conversation. My hope is that, in addition to setting assignments that push the students to write a decent amount, adding this (optional) revision stage will meet the students halfway in their thinking about assignments qua grades. That is, it maximizes the students’ opportunity to earn a higher grade while underscoring that writing (and thinking) is a process that doesn’t happen simply by vomiting words onto a page.