With apologies to Bill Caraher, who offers regular reflections on teaching under this title, the borrowing seemed appropriate: today I read two pieces about teaching worth engaging with, including one by Bill.
The first came in Keith Law’s newsletter, where he talked about his experience teaching a course on communication. Klaw is one of my favorite sportswriters, and his blog is one of the reasons I write about books in this space. He described the course this way:
It’s turned out to be one of the hardest, most anxiety-inducing things I’ve ever chosen to do.
Teaching is not rocket science, but that doesn’t make it easy. It takes continual adjustment and the ability to adapt on the fly when the best-laid plans fail to survive first contact. This reality is hard to appreciate when your only experience with teaching is as a student taking classes that you may or may not have enjoyed.
I am all too familiar with the fear that Keith Law says is haunting him:
I’m constantly plagued by the fear that I’m not doing enough for the students – that maybe what I’m teaching them isn’t useful enough, or that I’m just not giving them the information or insight they’ll need.
Any time a class goes well, I am reminded of a sign that I read about on the internet and posted on the inside of the door of my first post-PhD office. It read:
You are only as good a teacher as your next class.
No class is going to be perfect; there are only so many hours in the day, and only so many of those are spent in the classroom. All you can do is reflect, adjust, and move forward. I suspect that Klaw, a reflective person who I think came into the experience with a healthy appreciation of teachers, is doing fine. Nevertheless, I couldn’t help but laugh knowingly when I read the entry.
College teachers who want to improve could do a lot worse than reading Bill Caraher’s regular Teaching Thursday column.
Today’s entry was the latest in a long series of posts that document how he has adapted his World Civilizations course over the years.
World Civ can be a hard course to teach and a frustrating one to take. In the first “half,” it is a course that tackles several thousand years of human civilization spanning the entire globe, taught as a general education course to first-years and non-majors. There is no chance at comprehensiveness, and I find that approaching it through a series of themes and broad connections easily becomes abstract to the point of uselessness, except to students who are already passingly familiar with the specific examples that illustrate each theme.
(I often think that World History would be a more valuable course at the senior level, but this is not the system we have and I can see an argument that such a radical inversion would hurt enrollment.)
In this series, Bill has talked about his solution to the problems of a World Civ class, namely flipping the class and challenging the students to produce material together in class, as well as the reasons he made the change. I wonder a little bit about how this assignment would work in a class that meets multiple times per week rather than in long blocks, but I also recognize wisdom in how he has developed the class. It can be a surreal feeling to walk around a classroom where the students are working in groups and I am just eavesdropping. Bill calls it “boring;” I can’t disagree. But if the students are engaged with the historical material while I am bored, doesn’t that mean that I have done my job?
I am not sure that I will — or even should — go quite as far as Bill has in moving the class material to class time, but our courses also have different demographic contexts. And yet, in my first time teaching World Civ at my current institution and in its current iteration, I find myself thinking about almost all of the same issues. There is only so much that I can change mid-stream, but I have a lot to consider for next time. In addition to the mechanics of a World Civ course, Bill’s post engages with outcomes and ungrading more generally, and both are worth considering. Check it out.
Last week I attended my first in-person conference since January 2020, when I attended the AIA-SCS annual meeting in Washington, DC.
It was a surreal experience.
On Wednesday afternoon, after I finished teaching for the day, I hopped in the car and drove to Eau Claire, Wisconsin to attend a regional history conference where I would be chairing one panel and presenting on another. Both the venue and the conference acknowledged the ongoing pandemic with signs requesting or requiring masks (depending on where one was at a given time) and seats conspicuously distributed about six feet apart around the presentation spaces. Most people abided by the mask guidelines, as far as I could tell, but this only served to make me more frustrated with those who weren’t, whether or not they had the pretense of food or drink nearby.
How much I like in-person conferences under normal circumstances depends a lot on my headspace. I get quite nervous about public speaking and go through frequent bouts of imposter syndrome, but I also find these events invigorating. For every time I have stood awkwardly at a reception, I have made two friends by putting aside my hangups and just getting into a conversation. After all, the attendees are (generally) there to make new contacts. Likewise, I am now at a place in my career where I can pull aside graduate students after a talk to give them positive reinforcement and suggestions, much as was done for me a decade ago.
I have loved the accessibility that accompanied the pivot online during the pandemic, but there is a tradeoff. I have attended more conferences than usual, as well as workshops hosted out of Winnipeg, Rio de Janeiro, Oxford, Chicago, Oregon, and Athens (to name just a few), but I have not found the virtual experience nearly as conducive to networking, at least as someone who was not already connected to the host networks.
In this respect, I found myself glad to be back at a brick-and-mortar conference where there could be fortuitous encounters in line at the coffee shop or where I could grab dinner with conference attendees (on a patio).
By the same token, the decision to make this an in-person conference led to a significant amount of chaos. Many people—myself included—had applied to the conference with the understanding that it would be held virtually. When this turned out not to be the case, I was fortunately still able to attend, but many attendees required virtual accommodations. To their credit, the conference organizers did provide a Zoom option for these attendees, but we were still working out the details the day before the conference started. The format made it easier to present online than to watch papers online, but when it worked, things went smoothly enough. However, the time crunch put the onus on panel chairs (rather than tech volunteers) to manage the Zoom feed, so when it went poorly things went haywire, whether because the organizer and panel chair couldn’t reach a presenter (who had likely sent a pre-recorded talk that went unnoticed) or because a nervous presenter closed out a Zoom room and no one noticed until it was too late to bring the attendees back.
The reality is that we are still in the middle of an ongoing public health crisis. I was willing to take the risks of exposure because I am fully vaccinated (still <6 months since my second dose) and could afford to take many precautions in how I travelled. Still, if we are going back to meatspace conferences, I think that they will have to include a hybrid or virtual option for the foreseeable future.
If anything, this experience reminded me that saying there will be a virtual option is one thing, but executing it is something else entirely. Suffice it to say that I am even more pleased that the AIA-SCS has been planning a virtual event for months already even though the conference is in January.
I enjoy the ritual of setting aside my daily routine for time spent engaging with colleagues. This time I just also spent this conference thinking about how this was all premature. Wishful thinking won’t make the pandemic go away. I understand the desire to win back some of what has been lost over the past year and a half, particularly if most attendees are already vaccinated, but it is too soon to return to the pre-pandemic status quo, if that should even be the goal.
Each of the past three years I have written a post celebrating the start of the fall semester with quick hits on various topics that I’m mulling over going into the new academic year. You can find the earlier posts in the archive: 2020, 2019, 2018.
One recent addition to my podcast lineup is Self-Compassionate Professor, hosted by Dr. Danielle De La Mare. The podcast as a whole is aimed a little over my head (I’m neither a mid-career nor a recovering academic yet), and I am somewhat ambivalent about the emphasis on entrepreneurialism, which often leans toward career coaching. But I like listening to people talk about how they navigated their academic careers, if only because that is something I’m just starting to do.
In one episode, Dr. Simula talked at length about pandemic burnout and suggested that professors identify which tasks need to be done at 100%, which ones can be done at 70%, and which ones can be let go. I was already feeling exhaustion tug at the corners of my awareness when I listened to this episode, but I also know that my students are living through the same world events at an even more tumultuous time in their lives.
A useful reminder going into this semester.
I have spent a lot of time going back and forth about what I want to do for pandemic contingencies this semester and how much I need to front-load that information in my syllabuses. I am working at a university that made it through last year without interruption, is requiring masks indoors, and has a good rate of vaccination, but the Delta variant is all but guaranteed to take some students out of class this semester. Right now I am encouraging students who are experiencing symptoms or who have tested positive to remain out of class, and I will offer alternate assessments for them to make up what they have missed.
Zoom is a wonderful tool for some purposes and I am looking to offer virtual options for office hours, but trying to teach students in the classroom and on Zoom simultaneously was so exhausting last year that I would like to avoid a repeat of that experience if at all possible.
On a recent road trip I found myself working through the archive of the Fake Doctors, Real Friends podcast. In one episode, Zach Braff dropped a line attributed to the eminent screenwriter Lawrence Kasdan:
Being a writer is like having homework every night for the rest of your life.
This is a catchy aphorism, though I couldn’t find a source in a few minutes of poking around online. I think its spirit is correct — writing often gets done on top of other employment because writing alone won’t pay the bills, for one thing, but writing also involves hours spent thinking about writing and deadlines that superficially resemble school. But I also think that comparing writing to homework trivializes writing in some ways. Writing is work, full stop. The fact that it takes place at home and sometimes off-hours doesn’t change that.
I am a little bit behind on my work right now, but I’m hoping that the structure of a semester will help motivate me to sit down at the computer and plug away at my projects. This is work that I want to do, I just don’t like the implication of calling it homework. Then again, what I really need to be more disciplined about is reading.
I am officially one week into my new position at Truman State University — a full-time non-tenure-track position teaching world and ancient history. As a result, today brought a little more than the usual first-day jitters.
So far, the students are enthusiastic and I like my colleagues, but, selfishly, I am most looking forward to having all of my classes at one institution that is invested in me as a teacher. This is a welcome change after several years of hustling for classes at a bunch of different institutions, all of which made their decisions on different timelines.
These are all good changes and I am thrilled to be here. I am just also still in a little bit of disbelief and for this year at least feeling a measure of survivor’s guilt.
I’m not as prepared for this semester as I would like to be, but I am looking forward to it quite a lot. For now, that will have to be enough.
You have just spent the last ten minutes doomscrolling through Twitter. Some of the posts made you laugh. Some made you anxious over the state of the world. Some made you insecure about what you are or are not doing. A couple made you think. Maybe you responded, but probably not. You might have clicked through a link, but, again, probably not. It is time to work. You close the Twitter app. Then, without so much as putting your phone down, you reflexively open the Twitter app and check out what is happening — if you’re anything like me, you didn’t even open another app in between.
Or maybe you went from the Twitter app on your phone to Twitter on a browser, or vice-versa.
Or, maybe, TikTok or Facebook is more your speed. Or maybe Snapchat or a game. The specifics don’t matter because the end result is the same: people flit from one thing to another, drawn like moths to a flame by advertisements, social media, and a host of other distractors carefully designed to harvest our attention.
This ubiquitous feature of modern life naturally leads to waves of hand-wringing over the pace of life and how modern technology has entirely ruined the ability of people, particularly young people, to focus for any length of time.
In an educational context, these fears have led to the question of how best to eliminate distractions from the classroom, whether through draconian technology bans or by trying to convince students to treat class like a sanctuary where they leave their concerns at the door for the duration. According to James Lang, however, these well-meaning impulses are asking the wrong questions. We can never eliminate distractions. Beyond the simple fact that our monkey minds are calibrated to look for distractions, it is too much to expect that students will be able to put out of mind a sick loved one, or a relationship problem, or a bodily pain, or any of an infinite variety of other concerns for a class that may or may not be all that important to them. If this wasn’t obvious before, it should be now, given the ongoing COVID-19 pandemic.
That’s the bad news. The good news, as Lang points out in the first chapter, is that the latest round of laments for the prelapsarian days before distraction is strikingly myopic. That is, there was never a golden age when people were free from distraction; laments about its loss merely get updated to account for new technology. In his posthumous novel “The City and the Mountains” (A Cidade e as Serras) from 1901, the Portuguese novelist José Maria de Eça de Queirós includes a dream sequence in which the narrator is appalled by the frivolity of modern life:
“Leaning His super-divine forehead, which conceived the world, on the super-powerful hand which created it—the Creator was reading and smiling. I dared, shivering with sacred horror, to peep over His radiant shoulder. The book was a popular edition, paper-covered. The Eternal was reading Voltaire in the new, three-franc, cheap edition, and smiling.”
Or one could look to the quotes on the subject collected by Randall Munroe in XKCD:
In other words, to be distracted is to be human. Even as I write this, I am distracted by a kitten who doesn’t understand it is a problem for her to repeatedly leap onto my desk, chew on books, papers, and pens, and nuzzle my hands while I type. She is also fascinated by my fingers when I am touch-typing.
Lang’s thesis in Distracted is thus that we should not pursue the quixotic aim of eliminating distraction, but that we should lean into strategies that cultivate attention. Sometimes this requires temporarily eliminating distractions — when I am doing my academic writing, for instance, I set a length of time during which I turn off my email and won’t check social media — but, more frequently, the strategies involve finding ways to redirect and renew attention when it flags over the course of a class and a semester. Learning is hard work, and if you’re anything like me, your attention span dips precipitously when you’re tired. The same applies to students.
This thesis might be simplicity itself, but actually pulling it off in a classroom setting requires practice and attention.
Like his earlier book Small Teaching, Distracted is not prescriptive. Lang mentions several times that he is generally agnostic about a lot of teaching methods because good teaching can take many forms. What works for one teacher — or student — won’t necessarily work for another. Rather, he lays out current research into the science of attention and uses numerous examples of activities and practices to establish principles that any teacher can adapt to their class.
I concluded of Small Teaching that its simplicity was the greatest sign of its success. Distracted tackles thornier issues: Lang dedicates the entire third chapter (~35 pages) to the tech-ban debate and couches his suggestions in the awareness that his own policies have changed quite dramatically over the years. This and other portions of the book take a more process-oriented approach that encourages the teacher to be conscientious of how policies affect the classroom atmosphere.
Other portions of Distracted are more like Small Teaching. The book’s second part offers six “practices” of attention and shows how they can help draw students toward the material you have to offer. These range from the simple — cultivating a community through the use of names, and modeling the behavior you want to see by leaving your phone in your office — to engaging student curiosity, to techniques for focusing attention by switching between activities or deploying quick attention-renewal devices, as in his example of a preacher asking an audience for an “amen” when they start to drift. Lang also makes the case that assessments are a critical component of attention because they direct students toward the material that you believe is important in the course. Sometimes this means crafting assessments with attention in mind, since many students will never be more focused on your material than when writing a big test, but other times it involves no- or minimal-grading on repeated assignments that ask the students to connect what they’re learning in the class to life today. Students might find these practices unfamiliar at first, but with practice and attention on the part of the teacher they can pay dividends in the classroom.
Much of what Lang writes in Distracted echoes the direction I have been moving my courses over the past few years in terms of building community and keeping the classroom fresh, particularly on low-energy days. It doesn’t always work, of course, but each of the chapters in Part 2 offers a wealth of ideas to help draw students back in. For this reason I fully expect to return to Distracted for inspiration, and I found it an ideal book to read while putting together my courses for the semester. In fact, I often read something that inspired me to put down the book mid-chapter to modify language in a syllabus or tweak an assignment. It is possible to quibble with an individual observation or policy or suggestion, and I did at times, but for every one of those, two more land home.
Distracted is not necessarily where I would start for a new teacher looking for tips on teaching (my current recommendation is David Gooblar’s The Missing Course), but it is both one of the two books I would suggest after that (along with Jay Howard’s Discussion in the College Classroom) and a book with a lot worth considering for even the most experienced teachers.
One of my most vivid memories from my middle school days involves my keyboarding class (or whatever it was called). Somewhere along the way, probably in at least some small part in that class, I did learn how to touch-type, but I clashed with the teacher on a number of points. For one, we had to type from a script, but that sheet had to be kept so that we could not look at the monitor. Even now, at a time when I sometimes type with my eyes closed or while staring off into space, I prefer to be able to see the screen. For another, this teacher demanded that we include two spaces after periods.
This might have made sense with the specific program we used in the class, which may not have had proportional fonts, but she justified the demand by insisting that every business manual required double spaces after periods. This is, of course, nonsense. Most manuals regard the convention as a relic of the typewriter era, something my father, a printing and graphic design teacher, pointed out before offering me the sage advice of doing what she asked for the class grade and then ignoring it going forward.
Admittedly, this teacher was close to retirement and I could not have been an easy student since I’ve always had a hard time bending when the instructions ask me to do something that I know is wrong — this difficulty reared its head again in another context when I was a senior and the two principals called me into their office to question me about some award essay where I had asserted that Bill Clinton had been impeached (he was; I didn’t get the award). However, I thought about this keyboarding teacher again when I saw a teacher on Twitter give his policy about spaces after periods.
Setting aside that his assertion doesn’t even hold up according to MLA standards, I can’t imagine having a policy that is this punitive over something so small. I can understand why some professors want to be demanding when it comes to grammar and syntax since those are elements that can (sometimes) have a direct impact on the clarity of a student’s argument. By contrast, this is a severe penalty for a formatting error.
Since I am starting to prepare for the fall semester, though, his (bad) policy has me thinking again about my own policies when it comes to written work. I have always followed two guiding principles:
I care about students developing as thinkers. Writing, as John Warner explains, is thinking. This means helping students develop as writers.
I’m somewhat ambivalent about grades because I think that they often warp incentives, but it is my obligation as an educator to give students the tools and opportunities to earn the grade that they want.
Toward these ends, the details of my assessments have evolved to reflect what I want students to take away from the class. Since writing is fundamentally iterative, for instance, I added optional revisions that allowed students to earn higher grades on their written work. The most recent versions of these assignments include a small portion of the grade dedicated to “Grammar, Syntax, and Style” in order to provide a measure of accountability, but an equal portion of the grade is wrapped up in a metacognitive reflection paper about the process of completing the assignment. The single biggest component of the grade comes from the argument, and every assignment guide for prompt-driven papers comes with this advice:
You are not expected to answer every part of the prompt since these are questions that you could write an entire book about. The best papers take ownership of the prompt in order to make an argument based on information that includes, but goes beyond, the material assigned for the class. Since there is no “right” answer, I will be looking to see how you approach the question and how you use the sources to defend your argument.
Although I have gradually added guidelines and suggestions on the assignment sheet, my assignments are, if anything, still too open ended. When it comes to citations, for instance, I have traditionally told students:
There is no assigned citation style guide, but you must cite all relevant information, following a citation style of your choice (I prefer Harvard or Chicago, personally, but you can follow MLA or APA). Please include a works cited page for all citations.
While I should be clearer about what information is relevant, this particular policy reflects my own ambivalence about citation styles (my personal house style is a slightly modified Harvard system) and a conviction that policing the format of a citation distracts students from content. And yet, my laissez-faire attitude toward style might be equally problematic, both creating unintentional anxiety through a lack of guidance and leading some students to not follow a style at all.
By contrast, I am reminded of a policy of one of my college professors: your citations must be in Chicago style, and failure to comply will result in a penalty. I found this requirement frustrating at the time, but I see some wisdom in it now. The penalty sounds severe, but she laid out the expectations from the start, explaining that it wasn’t that Chicago was the best style but that Chicago was a style, and part of the assignment was to turn in work that followed it.
I am not sure that I want to go quite this far in my courses, if only because doing so would make grading papers even more like copy-editing than it already is and I’m not sure that that is in anyone’s interest. However, I am strongly considering choosing a “house” style guide (with a handout) that I can point students to as a default option. My thought here is that having a house style might provide some guardrails that remove the pressure of choosing “the right” style, thereby allowing students to focus again on the content.
At the same time, my inclination is to still allow students to follow other style guides if they so desire but ask that those who do reflect on the choice in the metacognitive portion of the assignment. The trick will be crafting a policy that provides flexibility and student agency while also putting in place limits that guide students to spend their energies on the parts of the assignment that matter most.
I like blogging for a lot of reasons. In part, I use this space as an outlet for all sorts of topics that I would not otherwise get to write about — book reviews, pop culture discussions, thinking out loud about teaching or academia or random historical tidbits. It also encourages me to write a lot, which I firmly believe is how one learns to write well. I also like how, at least on a personal blog, it can be done quickly. My process involves writing a piece, a quick editing pass, and then hammering the publish button. Sometimes, if I think the piece might receive a lot of blowback, I will ask a trusted reader for feedback first.
This is how I have published more than 536,000 words on this site, some of them excellent, some of them bad, most of them just okay.
I also like the ephemerality of blogging. Certain posts routinely get traffic — my review of the novel Basti is perennially popular among what I assume are Indian students who had to read it for school, for instance, and apparently people liked my review of The Fifth Season — but most posts get all of their traffic within the first week of going up unless I do something to promote them later and, even then, that tends to be much lower than the initial burst. (Even a week is generous; I’m lucky to get three days.) These are the same trends that lead to concern about the future of blogs, but I like using the space to think through issues with the reassurance that I am not writing a κτῆμα ἐς αἰεὶ, a possession for all time, as Thucydides characterizes his history, but rather something of the moment with a slightly longer residue.
I cringe when I read my earliest posts, which I imported from an earlier iteration of this blog, and I have considered purging them altogether on more than one occasion. They are very different from what I write now, and I don’t agree with everything I wrote back then, though these tend to be issues of historical interpretation rather than moral stances. Two things stop me from purging the record: those posts almost never receive visitors, and anyone who compares what I wrote then with what I write now will find clear evidence of maturation as a writer and thinker.
Which brings me to the title of this post. It comes from a conversation I had with a friend a couple weeks back about the importance of self-promotion. I like writing things and putting them out into the world, but I also do very little by way of promotion and struggle to do it even in applications where it is absolutely essential. The line was hyperbolic, a play on managing expectations downward.
The single biggest factor behind my reticence to self-promote is that when I put something out into the world I immediately become anxious about how it will be received — no matter how proud I am of the work.
Some fear is normal. Academic reviews are frequently sharp, cutting pieces apart with analytical skills honed through years of training. I have done pretty well passing my scholarship through the peer-review process, but deeply negative reviews still hurt and so I get a flutter every time I send something out. What’s more, I also recognize that the peer-review process is imperfect such that even a piece that passes muster there can meet with a negative reception once it goes out into the public.
Other aspects of my anxiety are more idiosyncratic. Imposter syndrome, the feeling of being a fraud about to be exposed at any moment, is rampant in higher education, and I am no exception.
I have struggled with feeling inadequate since my time as an undergraduate at Brandeis, where I was at the “best” school I applied to but surrounded by people saying that they should have been at Harvard. When I went to graduate school, I ended up at a lower-ranked institution because I was turned away from the better programs. Multiple times. The Shadow-CV movement of a few years back, ostensibly meant to de-stigmatize “failure,” had the same effect because it made me feel like I was failing wrong. Just recently, the discourse around Princeton changing its requirements for the Classics major once again reignited these insecurities. I went on to receive a Ph.D. in ancient history, but my B.A. was “only” in Classical Art and Archaeology and Ancient History, and I distinctly remember being informed by an otherwise very nice individual that I wasn’t a “real” Classics major.
Then there is an aspect of self-assessment. While I have become a significantly better writer than I once was, I still don’t consider myself a good writer. I like to think that I am a good historian, but others are better — stronger linguists, more creative researchers, more clever thinkers. Comparison is not a useful exercise, but I am perpetually in awe when I read the brilliant work of my colleagues and a little voice whispers that this thing is better than anything I can hope to do. I would like to keep the appreciation for other people’s work, but ditch the little voice.
One thing I have done well is produce. The same habits that led to half a million words published here have helped me put out a steady stream of articles and reviews despite heavy teaching loads, limited institutional support, and contracts without an incentive to publish.
Self-promotion will probably never be my forte. I’m good for a tweet and blogpost promoting my work, and I recently recorded what will be my first podcast talking about some of my research, but much beyond that my sense of reserve starts to kick in. What I need to remember is that there is a difference between promoting what one has done and promoting one’s own brilliance. The latter is self-indulgent vanity, but the former is normal, expected, and not incompatible with wanting to craft an academic persona based on being a dedicated teacher, a generous and supportive mentor, a kind colleague, and, yes, a scholar.
I am proud of the work I have done and think that the pieces currently in the pipeline are better than what has already come out. There are a few pieces in the works, but the biggest one is this: a little over a week ago, I sent a complete manuscript to my editor for a book based on my dissertation. There is a long way to go yet, including another round of reader reports, copy-editing, indexing, and all of the little things that turn a manuscript into a book, but this also marked a major milestone in the project. The butterflies of anxiety immediately began to flutter, but I am immensely excited to be one step closer to seeing this project into the world.
Email is a brilliant tool. It takes virtually no effort or time to send an email that conveys a bit of information to one or more recipients almost anywhere in the world. They can then respond at their own pace, creating a thread that records how the conversation unfolded.
But email is also awful, a never-ending stream of small bits of information that can cause important tasks to get lost in the deluge.
I receive a relatively small amount of email compared to a lot of people, but I realized a few months ago that one of the great hidden costs of adjunct teaching at several different schools is that it dramatically increases the amount of necessary email management. For the past year or so, I managed three or four professional accounts on top of my personal one that I use for work unrelated to my academic employment. Each message requires only a quick review, a decision about whether it demands a response, and then deletion, but now repeat that process for multiple accounts several times a day.
Then there are the email conventions. Email should allow for intermittent correspondence, but it has become practically an extension of instant messenger, and the group-think of lengthy email threads encourages people to write lengthier and lengthier responses that often defer the responsibility for actually making decisions. When the chair of a committee I am serving on needed to finalize a proposal, she skipped the email threads and asked several people who had responded to a pre-circulated draft to just sit down in a Zoom meeting and iron out our submission. In an hour, the three of us finished what could have dragged on indefinitely over email.
These are exactly the problems that Cal Newport tackles in his A World Without Email. His basic argument, which is an extended version of his “Is Email Making Professors Stupid?” from 2019 in the Chronicle of Higher Education, is that email and other “hive-mind” technologies like Slack are sapping the productivity of knowledge workers in nearly every sector.
The argument goes as follows: these hive-mind technologies were designed with the premise that more, easier communication is always better. You can better stay in touch with clients and customers; managers can better keep tabs on what is happening; workers can quickly get answers to questions. The technologies succeeded. They revolutionized the workplace and offices became increasingly streamlined. And then something happened. Email started to interfere with the smooth functioning of an office. Workers started spending less time doing what Newport terms “deep” work and more time handling managerial tasks like responding to emails and writing lengthy memos. Email allowed more immediate responses to clients, so clients began demanding more access, transparency, and immediate responses. Workers who could now check with a manager before making any decision did so, further bogging down processes, and anxiety increased.
According to Newport, a computer science professor at Georgetown University, the problem is that these hive-mind technologies are actually too efficient. It is too easy to fire off an email, passing off responsibility for a decision or keeping everyone in the know. But that ease comes with an asymmetric cost: it usually costs little for the sender to send an email, but a lot for the recipient to wade through dozens of low-effort emails.
(Cases where there is a wide power differential and the sender is unsure of how their missive will be received are, of course, an exception.)
The flood of emails or other messages is likewise as distracting as the never-ending stream of updates from social media, taking our eminently distractible minds away from whatever it is we are working on.
Newport’s solution for these woes is not quite a world without email — that is a utopian impossibility — but to get as close to that as possible by putting in place systems that allow for asynchronous collaboration and communication without requiring an immediate response. Email will continue to exist and serves some important functions, but it should be dramatically cut back in both volume and length.
A lot of Newport’s ideas come from and are tailored to the startup world, but they have a lot of crossover applicability to higher education (which is still my field).
For instance, Newport gives examples of employers who shortened the workweek contingent on the employees dedicating their entire time on the clock to actually working, or who structured schedules so that some or all employees are not responsible for email until after lunch. The key, he argues, is setting and holding to expectations. If a project manager is the contact person for an entire project, set things up so that there simply is no way to reach them by email. Better yet would be a centralized project board where anyone who needed an update on what was happening could simply look. If the system uses short daily (or weekly) in-person meetings to give updates, then the query can wait until that meeting. Any such system, Newport argues, would require empowering workers to make decisions within their purview, but would create better outcomes in the long term.
I don’t do most of my work in a collaborative workspace like the ones Newport describes here, but many of these same principles apply. Take my daily writing time. I can tolerate minor distractions (animals, the bustle of a café, music), but nothing narrative, no discussions, and certainly no digital updates. For those blocks of time, usually an hour but sometimes longer, I turn off my social media, close my email, and tune out the world. Anything that arrives while I’m writing can wait.
Other suggestions in A World Without Email are more directly applicable.
One example: the “scrum” status meeting. These meetings happen several times per week and are held standing up to encourage brevity. At each meeting, the team members answer three questions: (1) what did you do since the last scrum?; (2) do you have any obstacles?; (3) what will you do before the next scrum? If a team member needs a longer meeting, it can be scheduled at this time. Newport describes the scrum as an ideal way to manage an ongoing project in a company, but I could see using a modified version (maybe twice a week instead of daily) with students working on theses and independent projects. These projects are usually developed through long, regular one-on-one meetings, but the result silos the educational process and adds significant time commitments to a weekly schedule. By contrast, a scrum might show the students that they are not working on these things in isolation, the regular contact builds low-stakes accountability, and making these standing meetings cuts down on scheduling emails.
Newport also argues for automating and outsourcing as many processes as possible in order to save time that could be better spent doing deep work — or no work. Sometimes this requires money, such as his suggestion to hire a scheduler or administrative assistant to handle tasks that might not be in your wheelhouse. I appreciated this suggestion, even if it struck me as analogous to how many basic necessities in life are cheaper if you can afford to spend a larger total amount up front by buying in bulk.
More relevant to my position was the suggestion to automate as many tasks as possible.
At the end of the most recent semester I floated an idea to use flex due-dates for major assignments in my classes, but had been thinking about how to actually administer the policy without a flood of emails. The answer, I think, is creating automated systems. My current thought is to create a Google Form for every major assignment, with a link embedded in the assignment guide and on the course website. To receive an extension on that assignment, all a student has to do is fill out the form before the due date, answering just a couple of questions: name, assignment, a multiple-choice question for how long an extension they want, and maybe a brief explanation if they selected “other.” Rather than collecting however many emails to respond to, I will have all of the information for each assignment in one place. Likewise, even if I return to grading physical papers, I will request two submissions: an online back-up that counts for completion and a physical copy that can be turned in the following day for grading. Each of these policies requires a small additional step at set-up, but could streamline the actual process, and I hope to find other processes to similarly automate in my day-to-day job, and in any committee I find myself leading.
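As a sketch of what processing those form responses might look like, here is a minimal example. It assumes the responses are exported from Google Forms as a CSV file, and the column headers (“Name”, “Assignment”, “Extension length”, “Other (explain)”) are hypothetical placeholders for however the form’s questions end up labeled:

```python
import csv
from collections import defaultdict


def extension_requests(csv_path: str) -> dict:
    """Group extension requests by assignment.

    Assumes a CSV export of the (hypothetical) Google Form with
    columns "Name", "Assignment", "Extension length", and
    "Other (explain)". Returns a dict mapping each assignment to
    a list of (name, extension length) pairs.
    """
    requests = defaultdict(list)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            requests[row["Assignment"]].append(
                (row["Name"], row["Extension length"])
            )
    return dict(requests)
```

Run against the exported file, this would yield one dictionary keyed by assignment, so all pending extensions for, say, the first paper can be checked at a glance instead of hunting through an inbox.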
My only major criticism of the book is mostly a function of the intended audience. My issue was with how Newport framed productivity as an abstract but ultimate ideal, which led to consequences in the text that run crosswise to what he is actually arguing. At one point Newport talks glowingly about an obsolete office setup where secretaries handled mundane tasks like scheduling meetings, transcribing memos, and handling routine communications. His point is that removing these tasks frees the knowledge worker to do the deep work they are being paid for, but the value to that worker is given significantly more space than the mechanics of hiring someone at a fair wage to do the job. He believes the latter (or says so in the text), but mentions it only in passing. Likewise, the value of deep work, Newport argues, is that you can reject the pressure to work exceedingly long hours, but the focus is on how to produce more. I understand why he wrote the book this way, but given the long-term trends that show how productivity has vastly outpaced wages, I’m not convinced that productivity ought to be the primary objective, and thus found the evidence for improved workplace satisfaction to be a much more compelling case for cutting back on email use.
A World Without Email is a manifesto, but a timely one that has given me a lot to think about going into my new position since a new beginning is a great time to implement the new processes and protocols that he suggests.
This post flitted between one where I think about academia and where I write about books, so I might as well continue here. I just finished Andrea Stewart’s excellent debut novel, The Bone Shard Daughter, and am looking forward to starting Patrick Radden Keefe’s Empire of Pain next, an investigation into the Sackler family and the opioid crisis.
The semester just ended, which means it is time to review course evaluations. These feedback forms are notoriously problematic, but I encourage students to give me feedback and take what they say seriously. Beyond the fact that what students write can end up in a document I use for job applications, these are formative evaluations that can help me refine my practice. That is probably why a single negative review can cause such a sharp sting.
Reader: I got one of those this semester.
But I don’t want to write about that. Instead, I want to share something that happened this semester: a student wrote down what I said. Not note-taking during a lecture, but writing down specific things that I said and then handing me a copy of this list on the final day.
Here’s a taste:
“If a monster’s just out there fishing or something, is it a monster?”
“Moose are big and scary as opposed to brown bears who are big and stupid.”
“Remember people: only barbarians wear pants.”
“Cassandra’s curse is that she’ll never be believed. I’m sure many ladies can relate.”
[speaking of Theseus] “He prays to Big Ocean Daddy.”
Sometimes discussions turn into improv and I say things specifically to prompt a response, in this case in a seminar on monsters, monstrosity, and classical mythology. Given that this is the sort of thing I did in classes with my favorite professors in college, this was immensely flattering — if also momentarily terrifying.
My problems this school year were mild when compared to most people.
I felt like a zombie for much of the last month because of the grind.
The spring semester felt a little easier in some ways. I mostly adhered to my resolution to KISS, which led to a more regular schedule and had noticeable benefits, particularly in my online class. I was also doing much less course building than in the fall, which allowed me to focus more on best practices and maintain flexibility without the course devolving into a haphazard mess.
At the same time, a lot of these improvements were offset by the simple fact that both I and my students came into the semester already battered by an exhausting fall. Right around the time when we would have received our usual spring break I noticed a dip in everyone’s energy levels. I prescribed several “mental health days” late in the semester to try to account for this, but they were just a drop in the bucket. We kept going because that is what we had been told had to happen, but more than one student explained to me that whatever work they were giving me was perfunctory because they just needed the semester to be over.
I had a lot of sympathy for that position.
But I also had a ton of students who kicked ass this semester. Grades are in no way a reflection of personal worth — a good friend of mine aptly describes them as a professional evaluation of performance within a narrowly circumscribed realm — but I had some students earn among the highest grades I have ever seen by embracing not merely the grade rubric, but also the spirit of a class. Some really improved their writing over the course of the semester while also thriving in the open-ended discussion boards. Some threw themselves into their unEssay projects, like the student who melded the topic of my monsters course with the scientific literature review of her own major and produced a 17-page review of representations of mental illness as monstrosity in popular media, even turning the project in early. Others did really excellent work while parenting in a pandemic. I was disappointed that some students got lost along the way, but am unspeakably proud of everyone for making it to the finish line.
In light of these challenges, I have been wrestling with what, if anything, I want to carry forward from this year. The problem I have been facing is that most of what worked well were things that I had already incorporated into my courses. I am endlessly tinkering with tools and ideas to increase engagement and reduce cost. I was already using OER platforms and distributing my materials through the learning management system. I experimented with Discord without much success, but that is something I would have done under other circumstances anyway (and probably will again).
Other changes, like going entirely paperless for grading, I abhor. There are no words to express how much I hate grading online. Quick quizzes where most of the work can be automated are one thing, but I struggle to give papers I grade on screen the attention they deserve. Using an Apple Pencil on the iPad was, in theory, a step closer, but my clumsy handwriting just got messier, when the platform would even save what I wrote. Then there were the times when the platform refused to load documents. I understand the accessibility issues with asking students to provide hard copies of their work, but I see enough advantages that I am going to return to grading on paper as soon as safely possible.
I have only been able to come up with two pandemic changes that I want to make permanent.
First, Zoom options for office hours. Unlike my aversion to digital submissions described above, I see only advantages to having virtual options for office hours. This is not to say that I’ll eliminate the in-person sessions, but also having virtual options opens up possibilities for students who have other demands on their time and builds in flexibility that just doesn’t exist otherwise. This one will just require some thought as to the logistics (what happens when someone gets stuck in the virtual waiting room while I’m working with someone in person?) and boundaries.
Second, flex deadlines. This one is more of a work in progress based on the approach that I developed for deadlines this year. Basically, a course needs to have specific deadlines so that work is spaced out over the course of the semester and everyone roughly stays on track without falling too far behind.
In past years, I included on my syllabus a draconian late penalty, not because I wanted to enforce it but because it used fear to get students to turn work in on time. In recent years I kept that policy, but added a once-per-semester freebie three-day extension. With the pandemic this year, I didn’t feel comfortable limiting that extension to a one-off and felt obliged to accept all late work. This meant significantly more book-keeping on my end, but it worked in general.
What I want to do going forward is formalize a policy that was ad hoc this semester. My rough draft:
This course uses flex-deadlines for all assignments except presentations or those due on a weekly, recurring schedule (e.g. quizzes). All major assignments (papers, take-home exams, projects) have a checkpoint in the syllabus. This date reflects where the assignment should fall in the semester based on the material we cover in class, giving adequate space between this and other assignments. By the time of this checkpoint you must either a) turn in the assignment, or b) request an extension of, e.g., 3 or 5 days. Longer extensions are possible on a case-by-case basis.
Late assignments submitted without communication will receive the following deductions: 0–24 hours late: no deduction; 25–48 hours: −5%; an additional 5% deduction for each subsequent 24 hours, to a maximum of 50% off at 11+ days late.
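For what it’s worth, the deduction schedule in the draft above can be expressed as a small function. This is just a sketch of the arithmetic; the function name and the choice to count a partial day as a full 24-hour period are my own assumptions:

```python
import math


def late_penalty(hours_late: float) -> int:
    """Percent deduction for a late submission without communication.

    Per the draft policy: 0-24 hours late incurs no penalty;
    25-48 hours incurs 5%; each further 24-hour period (counting
    partial periods as full ones) adds 5%, capped at 50% for work
    11 or more days late.
    """
    if hours_late <= 24:
        return 0
    # Full-or-partial 24-hour periods beyond the first (free) one
    periods = math.ceil(hours_late / 24) - 1
    return min(5 * periods, 50)
```

Encoding the rule this way also makes it easy to sanity-check the edge cases, such as confirming that the cap kicks in at exactly eleven days.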
There are some wrinkles I need to iron out here. This policy is harder to enforce with physical assignments (in the past, I have counted it complete with an emailed copy while requesting a hard copy for grading), for instance, and I’m not in love with the percent deductions (I like the 24-hour grace period and capping how much a student can lose for late work — if I give an assignment, I think there is value in the student completing it). I also ran into a problem where students were surprised when the end of the semester introduced a hard deadline. This one may be as simple as setting an earlier due-date for the final submission so that there is a cushion before I am up against the date I have to turn in grades.
However, I see four major advantages to this policy or something like it.
It gives students more agency over their schedules. If college ought to be treated as a job (an unrealistic standard, in my opinion), it is better described as students managing four, five, or six (or more) part-time jobs simultaneously. Time management skills are important to cultivate, but I could say as much for myself.
These are not open-ended extensions, but function something like a contract in that the students have to look at their schedule and tell me how long it will take them to get the work done.
I am not putting any burden of proof on the student. I don’t need a doctor’s note/notarized letter/obituary. You need more time, I give you more time. The only requirement is communication, which, I hope, will improve outcomes overall by making other communication more likely.
It better corresponds to how I grade than insisting that students must have their work in on time. I rarely sit down to grade big assignments as soon as they come in, so short extensions still mean that everything arrives before I have finished.
The best thing I can say about the 2020–2021 school year is that it is over. I am excited to start the next chapter of my career at Truman State University in August, but a part of me is going to miss these students with whom I went through so much. Now we get to celebrate:
Every couple of weeks it seems something sets academic Twitter buzzing. Yesterday it was a well-established professor with a light (2–1) teaching load who shared three secrets to having put out 75 publications since 2005 and invited her readers to respond with which of her strategies were the hardest for them. I quote:
I sleep 8 hours a night.
I write for 1–2 hours every weekday
I don’t get in my own way.
I don’t think that the author meant anything malicious by her tweet, but the self-congratulatory framing seemed tone-deaf at a time when a lot of people are struggling. Many academics I follow on Twitter pushed back, challenging the privilege of such a small teaching load and secure employment, debating whether we ought to measure our academic worth by simple volume of publications—to say nothing of how disciplines count different publications—and still others cast side-eye at what exactly “not getting in one’s own way” means.
When I saw the tweet I mostly just felt tired.
I’m not going to rehash my CV here — I keep a public version on this site that I update every few months if anyone cares. Suffice it to say that since graduating four years ago I have published more than some people, but less than others, while also teaching a whole bunch of courses on part-time contracts at multiple schools.
I exercise daily, make sure to read outside of work (because it is something I enjoy), and try to sleep 8 hours a night. I’ve even had more success with the sleeping since the start of the pandemic and have started actually taking one day entirely off each weekend!
(Okay, fine. Most weekends.)
I also write for about an hour almost every weekday. The exact time changes, but I try to carve out an hour or two, usually in the morning, where I turn off email and social media in order to just wrestle with words.
It wasn’t always like this. When I started tracking the time I spend writing a few years ago I was in a very different position than I am now. Fresh off my dissertation and only teaching one course a semester, I had time to write and wanted a way to keep myself accountable. As my teaching load snowballed, I found it harder to find time to write and the amount of time I gave my writing plummeted. About the same time, I discovered that I missed that time I spent writing in much the same way that I miss physical exercise when I go more than a couple of days without doing anything. My recent writing sessions have been motivated in part by the terror of several deadlines that just passed for projects I committed to delivering, but I also find peace in the daily practice separate from those commitments.
I want to do good research and to have it taken seriously, but I also can’t define my academic existence by my publication record. My post-PhD life has been defined by teaching positions, often without support for research or publication. I have continued to do both, but approaching them as a second job demands finding other measures of academic success. I can block off time for writing, but the fact that my teaching contracts demand a lot of the time I would otherwise dedicate to focused reading means that I haven’t had the brain space recently to fan the spark of an idea into fully-realized papers. At the moment this isn’t much of a problem given that I am in the final stages of completing projects, but it does mean that my research pipeline will (temporarily) run dry.
But guess what? I’m okay with this! I have jotted down notes for a couple of articles that I would like to dig into, to say nothing of ideas for three more books. None of these things are actually in a research pipeline right now so much as sitting on a shelf collecting dust. Inevitably some of these will never amount to anything, whether because I get distracted by other shiny objects (projects) or because I will take them down to find that the idea half-formed years ago just doesn’t work, but others will eventually enter the pipeline and emerge sometime down the line.
The reason I felt tired when I saw the original post is that I momentarily felt the pressure that comes with using the raw number of publications as a metric of academic success. I’m tired enough as it is; I don’t need any more pressure.
As I wrote above, I don’t think the author meant anything malicious by her comment — she may even have believed she was contributing to a sort of self-help productivity discourse that operates in some abstract space where the real world doesn’t apply. This discourse operates in a space of extreme privilege, but it also both responds to and reinforces an academic culture of publication where the goalposts are forever just out of reach. Whatever you demonstrate to be your pace becomes an expectation, and however fast you publish you could have put out one more. After all, should we not always strive for maximum efficiency and ever greater production?
Of course we shouldn’t. Fast scholarship isn’t the same as good scholarship.
Now fast scholarship is not actually what the original tweeter called for, but by setting the volume of her publications as the metric of success she has nevertheless implied that we ought to bow to the pressure to produce more, and more quickly. I might be able to reach 75 academic publications (including reviews), but I also may never publish 75 academic pieces in my career. Not only would either of these outcomes be fine with me, but it is also critical to resist the simple quantification of academic production.
Working in higher education has enough challenges already. Rather than focusing on someone’s prodigious output and trying to replicate their method, every discussion of academic productivity needs to start with sustainability, support, and the academic communities we want to create.