First day fragments

My fall semester begins in earnest today, with the first session for both of my classes. I don’t have a single back-to-school post idea, but rather a bunch of loosely connected ones, so I decided to go with a fragmentary format.

“I didn’t get everything done” is a standard lament for academics come late August, bemoaning some combination of the cult of productivity, human limitations, and the difficulties of researching during the school year. I am no exception. I set an ambitious schedule for reading scholarship beyond my immediate research, but only managed to read a handful of books and articles, and a couple of books on teaching.

There are a few explanations for this failure. One is that the summer quickly became very busy, with multiple family trips that had less downtime than anticipated, meaning that there was opportunity neither for reading nor for a deep recharge of my batteries. Another is that I taught an intensive summer World History course in June, so much of my spare reading went toward preparing for class. A third is that seemingly every spare moment around these time commitments was sucked up by revising my dissertation into a book. My goal was to have it under review by the start of class, but I missed that deadline, too. At least I am in a position to meet my revised goal of August 31 for that one…

ΔΔΔ

There has been a movement in recent years to normalize failure, particularly in academia, leading to people sharing their failures on Twitter over the last week. I mentioned there that I respect the movement, and appreciate the baseball analogy: if you’re a batter and only “fail” (make an out) at the plate six out of every ten times, you belong in the hall of fame. (There are obviously other statistics from baseball that could make that comparison more or less extreme. If you’re a pitcher and batters swing and miss just 20% of the time, you’re incredible, but if that is the percentage of the time you throw strikes, then you probably quit playing in little league.) I respect the impulse to normalize failure because it is inevitably going to happen, regardless of how generous and kind the academy becomes. Everyone is going to experience article/grant/abstract/job/proposal rejections for a host of reasons. Sometimes those reasons are good (the project needs more work), sometimes they are petty, and a lot of the time it is a simple numbers game that has almost nothing to do with what was proposed.

My shadow CV includes all of these: four article rejections, two more revise-and-resubmits that were later accepted, at least seven rejected paper abstracts that I can think of offhand, and too many unsuccessful funding applications for fellowships and travel grants to count. And I am only a little more than a year removed from graduating with my PhD.

At the same time, I found the push to normalize, share, and celebrate failure on social media hard to handle. The main reason is that while failure is normal in the academy, and rejections can be handled deftly with an eye toward improving the project for the next time around, it is also a sign of privilege to be able to reflect on a shadow CV. It comes from someone still “in the game”, as it were, and with every round of shares I heard “this is what you *should* have been applying for.” As in, your failures themselves are inadequate because the “stars” fail bigger and better.

Then pair this with the part I left out of my shadow CV: all the jobs I’ve applied to without making the long list. The shadow CV is meant to normalize failure so that people can better overcome the natural fear of it and thereby reduce anxiety, but when mixed with too few academic jobs to go around and the sheer amount of time that applying for them takes, it just exacerbated mine.

ΔΔΔ

I’m looking forward to teaching both of my classes this semester. For one, I am teaching my own syllabus for the second time; the other I am teaching as the sole instructor for the first time. I had the chance to teach on my own a little bit during graduate school, but this is my second year of continuously teaching my own courses and reading up on pedagogy, so I am now trying to synthesize some principles for my classroom.

First Principle: Learning, not grades. I do not care about grades beyond making sure that I have created a reasonable and achievable grade scale for the class. My goal as a teacher is to help students develop practical skills such as writing and the ability to understand the world through critical analysis and synthesizing information. Toward that end, I believe that many common assessment tools that are built for scale are next to useless in actually assessing learning. I design my classes around assignments that require students to develop arguments through writing and that build on each other so that students can show improvement in tasks that are not easy.

Second Principle: Empathy. Students are adults who have more demands on them than even I did when entering school fifteen years ago. I aspire to treat them like adults with responsibilities, just one of which is my class. College is “the real world,” where students are on their own for the first time, and I want to be a mentor/coach/guide. This means having empathy, and encouraging them to take ownership of their education by talking with me when they have a conflict or need help.

Third Principle: Engagement. “Meaningful learning experiences” is a hot topic, though my mother assures me that this has been the key phrase for many decades now. Every class is going to be selective in the material it covers, so I see my job as giving students the tools to learn more and piquing their curiosity so that they want to do so. This means developing activities and assignments that require engagement, through games, debates, and projects where students take ownership of the material. This has not been the easiest task for me as someone who found history books thrilling in high school, but it is something that I am committed to improving in my own teaching.

There are others, but these are my first three.

ΔΔΔ

Without further ado, let the semester begin!

Minds on Fire – Mark C. Carnes

Earlier this year I crowd-sourced a list of teaching materials. Now that the fall semester is imminent, I am finally getting a chance to sit down with the list again in order to prepare for my courses.

The subtitle of Minds on Fire is its mission statement: “how role-immersion games transform college.” The book itself is a manifesto for Reacting to the Past, serving to defend and justify the games developed by the consortium.

Carnes’ core contention in Minds on Fire, and the underlying principle behind Reacting to the Past, is that students are engaged in “subversive world[s] of play” that range from video games to Zombies v. Humans to fraternity events. On the other end of the spectrum, “all classes are kind of boring.” The solution, Carnes argues, is to harness the subversive worlds of play toward academic ends; that is, give students competitions and games that tap into their natural inclination for this subversive behavior and get them to do more work without thinking about it as work. Teachers facilitate the games, but then step back and empower the students to take the reins.

After setting out these principles, Carnes dedicates much of the book to laying out the advantages and countering the criticisms of using games in the classroom. There are chapters on how Reacting games teach morality and leadership and spontaneously produce community, things which are often touted as the purpose of a humanistic education or baked into college mission statements. Another section rejects the positivist contention that the past is a fixed stream and that opening up the possibility of changing the past undermines history education. In each instance, the philosophical and pedagogical ideas are buttressed by excerpts from interviews with students who went through Reacting courses.

Minds on Fire is a convincing read, though I should say that I went in predisposed to think so, as someone who has always balanced a fascination with history books with hours of subversive play. Carnes acknowledges, but also skims past, that some courses are not going to be suitable for Reacting games and that not every Reacting exercise will be a raucous success. Nor is there much acknowledgement that Reacting is a radical proposal that seeks to achieve a fairly standard aim: significant learning experiences. Reacting classes, by not seeming like school work, give students ownership over their education and “trick” them into having experiences that cannot be faked or cheated.

There are other means to this same end, but there are also numerous classes where Reacting is a particularly effective way to grapple with issues, and I think it is no coincidence that some of the success stories came from Freshman Seminar or “great ideas” sorts of classes. I also think that long-running games could be particularly successful in discussion sections as a complement to lectures.

In sum: there were times when this book was too much of a manifesto, but while not every course needs to be a Reacting game, every course can take lessons from Minds on Fire.

AP World History (Ancient)

The College Board received pushback a couple of weeks ago when it announced changes to the AP World History curriculum, making the course begin in 1450. Critics online gnashed their teeth about a number of things, raising the legitimate concerns that this would further marginalize the pre-modern world and that the chosen date, insofar as it meant anything, would default to a Euro-centric world view. The College Board responded this week by announcing that the new date is 1200, not 1450. Critics gnashed their teeth again, albeit also in befuddlement at the seemingly arbitrary date.

(For what it is worth, the College Board’s stated explanation for the date, that it will allow “a study of the civilizations in Africa, the Americas, and Asia,” does check out. Now Genghis Khan, Mansa Musa, and the rise of the Aztecs all fall within the range. 1200 starts the course in medias res, but that was inevitable once you put a start date on a course.)

Reading the College Board’s announcement about the changes, I am of two minds. First, I am sympathetic when they say, based on feedback from teachers, that the current model is unsustainable because teachers are trying to do too much.

The current AP World History course and exam attempt to cover 10,000 years of human history—from the Paleolithic Era to the present. In contrast, colleges manage the unique breadth of world history by spreading the content across multiple courses.

The announcement is a little misleading when it says what college courses do and do not do—I did have a World History from the beginning of time to 1960 course in college—but, from the other side of the table, these broad-strokes courses are incredibly hard to teach. Even “just” teaching a world history or Western Civilization course before 1500 is covering a laughably enormous swathe of time, which is also the reason I have no pity for the US history professors who complain that there needs to be another mitotic division of their survey sequence, taking it from two to three courses. It has, however, been my position for a while that if I were made Grand Poobah of History Curriculum, I would invert the current paradigm and only teach the survey courses after students had been exposed to historical methods in specialized courses. The idea here is that by going from the specific to the general rather than the reverse, the students are better prepared to appreciate historiographical arguments and big themes.

The problem with this approach is that it is not scalable as part of a standardized testing scheme meant to grant college credit. The College Board finds itself in quite a bind. It needs a single survey course to stand as a substitute for a college course, because fragmenting the courses would neuter its ability to function as a standardized test and therefore undermine its credibility. At the same time, the sheer scope of the course makes it difficult to teach the material in a way that prepares students to succeed at the work asked of them on the test. I don’t love this incentive structure with regard to student testing, but given the incentive structure in college courses I at least understand it. Moreover, looking at old sample questions, the test itself isn’t bad in terms of the skills it is designed to measure. But it is also a lot. Obviously something needs to change.

This does not, however, mean that I agree with the changes the College Board made. My main issue is the decision to prioritize modernity, particularly because AP European History, which starts in 1450, has already done the same thing. At this point I could start declaiming, ridiculing the absurdity of teaching a world history course that takes for granted the miracle that is agriculture, that conveniently forgets that Rome laid the groundwork for modern Europe, or that doesn’t bother trying to understand the founding and development of *any* of the world’s major religions. Yes, there are a couple of things that happened after 1200 (or 1450) that are important, but these are all built on precedents and developments that came before.

The College Board has indicated that it is open to creating a second AP World History (Ancient) course, which, despite the awkwardness of the name (congratulations, Richard I of England, you’re an ancient king!), is a fine ambition. But here is the thing: I am skeptical of how quickly yet another AP course can be developed and instituted, let alone how widely it would be picked up. Things have changed since I was in high school, but when I was coming up I didn’t even have access to one AP World History course, let alone two. I got my start with classes on all things ancient through my Latin teacher. Now I am fortunate enough to teach ancient history to college students and am consistently impressed with how many students from all sorts of disciplines come out to take my classes.

Maybe I am wrong and this interest will prompt dedicated high school teachers to make the second course come to fruition, but in the meantime I cannot help but think of this as a missed opportunity on the part of the College Board. There had to be a change to the AP World History course, but instead of even temporarily erasing antiquity, it should have kept the earlier portions, perhaps as an AP World History (Foundations), and developed (Modern) as the secondary offering.

Pedagogy in the Humanities – a reading list

On the list of things I don’t really have time for, but want to do anyway, is spending more time reading about the mechanics and craft of teaching. I am particularly interested in issues of course development and planning, active learning, student engagement, and assessment. I sent out a tweet asking for book suggestions, and in the first couple of hours after it was posted more people boosted the signal through retweets than offered bibliography, though suggestions did begin to trickle in.

It has been about twenty-four hours since I sent out that request; here is my reading list so far:

  • Ken Bain, What The Best College Teachers Do (Harvard 2004)
  • Peter Brown et al., Make It Stick (Harvard 2014)
  • James M. Lang, Small Teaching (Jossey-Bass 2016)
  • Mark C. Carnes, Minds on Fire (Harvard 2014)
  • Jay Howard, Discussion in the College Classroom (Jossey-Bass 2015)
  • L. Dee Fink, Creating Significant Learning Experiences (Jossey-Bass 2013)
  • Susan Ambrose et al., How Learning Works (Jossey-Bass 2010)
  • bell hooks, Teaching to Transgress (Routledge 1994)
  • Nancy Sorkin Rabinowitz and Fiona McHardy (edd.), From Abortion to Pederasty (OSU UP 2015)
  • John Gruber-Miller (ed.), When Dead Tongues Speak (Oxford 2006)
  • Jay Dolmage, “Universal Design: Places to Start,” Disability Studies Quarterly 35 (2015)
  • BU Proseminar in Classical Pedagogy, resources curated by Dr. Hannah Čulík-Baird

This list will be updated. Additional suggestions are welcome in the comments.

Preparing for class and my undergraduate experience

The process of preparing for class makes me try to remember what I can about my undergraduate courses. In terms of specifics, the answer is: not much. Obviously I absorbed a good deal of content that I am now able to speak about with varying levels of confidence, but much less stands out about the actual classes.

Take, for instance, the equivalent of the course that I am now teaching—a survey of Greek history. I remember my professor’s opening spiel about the etymology of history and how it comes from a root that has to do with judgement, I remember bantering with a friend of mine who also went on to get a PhD in ancient history, and I remember one of the other students making a diorama from wax sculptures after taking the wax from individually wrapped cheese “cuties.” And some of those memories could easily be from other classes with this professor.

Most of all, though, I remember loving the class (and other classes like it) because the professor gave us room to explore long sections of ancient sources, even to the extent of seeming disconnected and disorganized. In fact, I remember having an argument with a fellow student in a class in another department altogether because this student hated the disorganization, feeling that it meant that she wasn’t learning anything. I vehemently disagreed at the time, which was something of a running theme in a course that had us working in a group for most of the semester. Believe it or not, we actually worked pretty well as a team.

Before I lay blame on the professor, though, reflection shows that this limitation of my memory holds even in courses with amazing lecturers. For instance, I have clearer memories of my favorite college lecturer declaring that blue exam booklets were the ideal form for writing lectures in, of the fact that the Anatolian peninsula is, north to south, the international measurement unit “one Kansas,” and of his apologies for the boring but necessary excursuses on medieval agriculture. Or that in the last week of class he never failed to take a photograph with a disposable camera, and that I invariably left class every day with an aching hand. That pain and some later sweat ensure that I can go back to my notes if necessary, but, once again, I don’t remember much at any given moment.

I could go on, but there is one particular exception: language classes. The memories are almost certainly just as flawed, but I remember the act of being there, the feel and the look of the book chapters, and all of the things Homer taught to his brother. More to the point, my memories of language courses are clearer regardless of whether I liked or disliked the teaching styles of the professors. I don’t know why, exactly—maybe I found languages more difficult and so the classes left a deeper impression, or the way that I learned the languages was tied to the classroom in a way that history never was—but the division in my memories is real.

Obviously I learned facts from these courses that, ten years later, have been baked into the collection of knowledge tucked into the dusty corners of my mind or else that I have forgotten. I also learned note-taking skills, research habits, a critical eye for source criticism, and something of writing. (Less by way of common sense, however, even if one of the professors mentioned above did try to warn me off of graduate school.)

I think about all of this when I am preparing for my own class. My class is just too large to toss the textbook in favor of embracing the glorious confusion of reading sources together, and I feel some responsibility to cover a certain number of topics in a survey of Greek history. I tend, therefore, to err on the side of structured lectures with a PowerPoint presentation, modeled on the US history survey courses that form the largest portion of the teaching I have seen in recent years. There is only so much that can be covered, so, in this sense, I look to give students a taste along with some tools to learn more.

At the same time, though, I think back to being encouraged to engage in forms of source analysis and informal, seminar-style debate with great fondness. Unstructured though those may have been, they also reflected active learning at its finest. As much as this form of class worked for me, ironically, it often takes a leap of faith for me to try it from the other side of the table (so to speak). I will probably never abandon lectures altogether in a class like this where there are details that I hope will encourage students to go out and learn more, but at the same time I am always looking for new activities where the students can grapple with the primary material together or on their own because, more than the lectures, that is often what I remember being most useful from my undergraduate experience. This experience didn’t do me any favors in terms of downloading and debating historiography for graduate school, but in the more universal tasks of evaluating how a source is presenting the world and challenging its prevailing biases, it is absolutely essential.

(Re)visions and Assignments

Every student paper should be revised. More than once. In an ideal world, that is; in the real world there are problems of scale and deadlines.

Periodically I receive a request from a student to revise a paper in return for extra credit. In the past, when teaching surveys of American history with up to a hundred students at a time, I felt obliged to reject these requests. I would love for students to revise their papers, but extra credit is not something I can extend to just one student in good conscience, and there isn’t enough time in the semester to let every student do this unless it is built into the course. On the one hand, I feel bad about rejecting some of these requests since I am acutely aware of the challenges facing the current generation of college students; on the other hand, though, the requests are framed in terms of getting a higher grade, not in terms of education.

This disparity comes in part from the nature of these assignments. I suspect that nobody has looked at a survey-level essay on the changing conceptions of race in America from 1865 to 1925 as an opportunity to write a brilliant and incisive critique of race in America. Even if the author has a fiery passion for the topic, the prompt and supporting materials don’t lend themselves to it. The disparity also speaks volumes about how courses like this one are treated. They are a grade, not an opportunity to learn about American history or learn practical skills such as writing or rhetoric.

Returning to the nature of the assignments, one-off submissions that come back marked and assigned a grade lend themselves to thinking about the assignment in terms of the grade instead of in terms of process. I understand the counterargument that history classes are for teaching history and not for teaching writing, particularly in these large survey courses. And yet, history is fundamentally discursive.

This fashioning of history, along with how we remember history, is going to be a point of emphasis this fall when I teach a survey of archaic and classical Greek history. I am going to do this not only because of the recent and not-so-recent appropriations of antiquity for political agendas, but also because I hope that pushing people to think about these issues in a Greek context will make it possible to think about them in our contemporary context.

I am also planning some opportunities for my students to revise their work, made possible in large part by a smaller class size. As of right now the idea is to give students the option to revise at least one of their assignments for a higher grade, as well as making that type of assignment recur later in the semester in order to maximize the benefit for the student. The plan is to have revisions take place in two phases: first, the student meets with me to discuss the assignment; then they make revisions based on both the written comments and our conversation. My hope is that, in addition to setting assignments that push the students to write a decent amount, adding this (optional) revision stage will meet the students halfway in their thinking about assignments qua grades. That is, it maximizes the students’ opportunity to earn a higher grade while underscoring that writing (and thinking) is a process that doesn’t happen simply by vomiting words onto a page.

Confusion – Stefan Zweig

Editorial note: there will be spoilers in the penultimate paragraph of this post, as it is impossible to express my concerns with this novella otherwise. With the understanding that some people disapprove of such reveals even in a ninety-year-old book, I have kept these until the very end.

This was the first real shock that, at the age of nineteen, I experienced—without a word spoken in anger, it overthrew the whole grandiose house of cards I had built during the last three months, a house constructed out of masculinity, student debauchery and bragging.

Zweig’s Confusion—not a direct translation of the original title—is a novella first published in 1927 that I am of two minds about: one mind deeply appreciates some of its psychological observations and graceful structure, the other is deeply troubled by its politics. In form, Confusion is an eminent professor reflecting on his intellectual life on the occasion of his Festschrift, a publication that memorializes and celebrates his career. Far from the parade of successes that the accompanying biography records, Roland, the professor, recalls a time when he was far more interested in women than in his studies and how he ended up attending a rural university away from the temptations of Berlin. Thus he says:

Everything it says is true—only what genuinely matters is missing. It merely describes me, it says nothing real about me.

It is at this rural university that Roland is mesmerized by the passion of an aging professor of English who awakens his intellectual curiosity.

Soon the professor helps set Roland up in the building where he lives with his wife, and Roland offers to help the professor by taking dictation for his magnum opus: a history of English drama in the age of the Globe Theater. The two begin to work on the project diligently, but Roland finds the professor difficult to work with; some days the professor is hale and strong, other days distant and cruel, and on still others he is absent altogether. It is during one of these intervals that Roland ends up involved with the professor’s wife.

The heart of Confusion is the relationship between Roland and the couple who live below him, that is, the professor and his wife. The former is Roland’s intellectual father, while the latter takes on the roles of mother, lover, and reminder of his past insecurities.

Zweig’s greatest strengths unfold in the turns in Roland’s relationships. He shows how a student might have limitless potential and how a teacher can (in some cases) change a person’s trajectory, but, even more importantly, Zweig builds into the structure the idea that an intellectual career does not unfold in terms of linear successes. Confusion in this regard is an excellent, subtle coming of age story.

And yet, I had deep reservations about Confusion that far outweigh any I have had about Zweig’s other work. The dramatic climax comes when Roland is tearing himself up over his transgression with the professor’s wife, only to discover that the professor professes to have no control over what she does, just as she has no control over him. Far from a modern sense of an open marriage, the professor reveals in so many words that he is gay. This revelation fills in the gaps as to the snide comments people had been making about Roland’s relationship with the professor, but my problem wasn’t this, or the suggestion that he was working in obscurity at a rural school because of his sexuality. My issue came in how the professor describes himself to Roland: he talks about both the joys and the challenges of constantly being surrounded by young, attractive, and vibrant men, and says that it was for this reason that he sometimes went absent. As a plot device it worked well enough, but it was both a regressive representation of homosexuality and troubling in terms of how it linked intellectual and sexual relationships. Moreover, I found it distasteful because of how it would have played had it been a male professor and a female student, which is already a troubling issue in campus gender politics.

I don’t want to diminish Zweig’s accomplishments in Confusion. From the outset I often found myself nodding appreciatively at his observations, but, as the trajectory of the plot became clear, I soured on the entire story.

ΔΔΔ

I also recently finished N.K. Jemisin’s The Broken Kingdoms, the second book in her Inheritance Trilogy and am currently reading Glen Weldon’s The Caped Crusade.

Finis

Content note: what follows is a sincere reflection of my feeling dispirited at my current situation and how I am grappling with ways to move forward. This has been building now for months and I have been hesitant to write about it openly. Everything adds up to a sense of despair that bleeds into this post, but I also recognize that many of my issues are coming from a place of privilege.

More than a week in the making, this post has proven–and continues to prove–almost impossible to write, which, in turn, means that most of what I had originally intended to write has been jettisoned, perhaps to be picked up from the cutting-room floor sometime down the road. However, the starting point remains precisely where it would have been a week ago, so perhaps I ought to begin there.

A bit more than a week ago I cleared the last remaining academic hurdle for my doctorate, defending my dissertation first thing Monday morning. This means that I am no longer ABD (all but dissertation) and now just ABB (all but bureaucracy). The dissertation defense should be–and was–something to be celebrated, and I am more than a little relieved to have finished this process. Another post will go into reflections on the dissertation process, because I believe that such introspection is not only good for me but might be valuable to others going through the same thing. And yet, without the immediate demands of the dissertation, the specter of the future has cast a pall over my sense of achievement.

I entered and progressed through graduate school clear-eyed about the brutal employment statistics in higher education. I can see in my mind the trend lines for full-time employment, the rise of contingent faculty, and the costs of higher education, and in some ways this shaped my experience in graduate school; for instance, I came to the University of Missouri precisely because my department offered funding for the MA. I also maintained that I was willing to work outside higher ed, should I not get a job teaching. At the same time, I thought “why not me?” and so set about doing the sorts of things one does in graduate school in order to be competitive on the academic job market. I am not here to boast of my accomplishments, and I made mistakes along the way, but I also think that, inasmuch as I was able, I put together a competitive resume with a body of work that continues to grow.

Then I started applying for jobs. Suffice to say that it has not gone well.

I am under a month from graduation, once again facing an uncertain future and feeling stuck in neutral. On the one hand, I am still applying for teaching positions at colleges because this is still something I want to do with my life; on the other, though, it is a lot easier to be cavalier about resiliency on the job market when you’re not worried about how you’re going to eat next month.

I could lash out, casting blame for my current predicament. I could throw in the towel, abandon the dream of teaching at the college level. I could dig deep for resolve to keep on with the types of activities that would be attractive to a future academic employer.

I am closest to the last option, with a hearty dose of current responsibilities thrown in. At a time when I see other recent PhDs getting at least something of a respite from the grueling schedule that got them through, I gave myself just the rest of the day after my defense. The next day, I went to an interview to teach one course next semester. The day after that I had a guest lecture, and the two after that were my usual teaching days. Between these obligations, I have been marking student papers (I received 80-ish) so I can get them back in a timely fashion, revising my dissertation for submission, and continuing to apply for jobs. I have barely had a chance to read fiction, which has been my main concession to relaxation in the past few years.

This is terrible self-care on my part. I should rest. I need to rest if I am going to do the quality of work that might lead to future success. I know this, and yet I can’t help but feel that I can’t afford to take the time off.

My dissertation defense is in the past, but uncertainty is simultaneously putting a damper on my mood and contributing to the feeling that I am being pulled in multiple directions, which itself is making it difficult to move in any one of them.

The Muse of Lecture

Programming Note: I have been particularly busy of late, so my reading has bogged down and substantive, post-worthy thoughts are coming in fits and starts. There are some things in the works, but posts are going to remain irregular for the foreseeable future.

I have been thinking a lot about lectures recently because I have been tasked with a bunch of guest lectures, some scheduled, some emergency. Everyone has their own lecture style, sometimes more than one depending on the type of class. Some lecturers are impressionistic, conveying good information but referring students to sources where the specifics are to be had. Some read from overloaded slides or march the audience through topic after topic, while others deliver detailed historiographical essays that they spin out as master storytellers.

Everyone needs to find their lecturing style and, ideally, their muse. My lecturing style is still immature and improving, but I thought I should mention the person I consider my muse of the lecture.

Picture this. You arrive at class somewhat early and get out your notebook to make sure that you can write down the lists of names and terms being written up on the blackboard. Sometimes the professor is there writing down the terms, other times it is a TA, but, without fail, there is the list. Some terms, he says, are purely to help with spelling. Other than an occasional map, this will be the only aid, and the extensive bank of terms doubles as the study guide for the exams. It is important that you arrive early and start writing because the lecture begins as soon as the class starts and you can’t risk missing anything that is said. When the period is over, your hand is cramped and it is entirely possible that you will need extra sheets of paper for your notebook before the semester is over, but you will have everything you need.

The scene should be familiar to most Brandeis history majors, at least if they took a course from Professor William Kapelle. Everyone has their Kapelle stories, and he certainly had plenty of stories for you. I remember, for instance, a mini-diatribe about the international seafood market and one about the fine print of credit card offers. But then class began, sometimes with an apology if we had to have a lecture about agricultural changes in the Middle Ages, sometimes with no prologue. At one point while I was at Brandeis he began to write his lectures in exam bluebooks, declaring that they were the perfect length for a 50-minute class. In either case, he had a topic of the day and spun it out for a captivated audience of furiously writing students. There were snarky asides, stories, and jokes, but the lectures were informative and detailed. I still have my lecture notes from his classes.

I owe a debt to Professor Kapelle, not least for his willingness to write the reference letters that got me into graduate school, but the more I prepare to teach classes, the more I realize that his model is the one I start from in terms of how I want to lecture.

The Hearth and the Television

One of my favorite weeks when teaching US History since 1865 is when we get to discuss the 1950s and the American family. One of the exercises I have the students do is to analyze The Simpsons from the perspective that the eponymous family is a representation of the 1950s nuclear family. I ask the students leading questions in order to reach this point: dad (works), mom (stays home), two and a half kids, etc., etc. One of the final issues we come to is what the show considers to be the central room of the house. There is often a bit of hesitation on this point until I ask how the credit sequence ends, to which there is an immediate chorus of “in front of the TV!”

This semester I gave a lecture on the topic of the ancient Greek family. Along with the delineation (and gendering) of space, one of the traditional talking points on this issue is that the household is defined by its hearth. This is borne out in myth with the representations of Hestia and in the ideologically charged declarations in literature about the sacredness of the hearth. And yet the evidence for burning in the archeological record varies, and there is rarely unambiguous evidence for a stationary or permanent hearth. Similarly, lease agreements from Olynthus indicate that buildings were not let as complete units; rather, individual rooms could be leased out for domestic use. I don’t find this revelation to be particularly surprising, but it is notable that some of the rooms allocated for domestic use show no evidence of a hearth. Thus the hearth that makes the home may be symbolic rather than actual.

I offer the television as the object that has this same ideological potency in the modern American household. One extreme example is illustrative. In the pilot of the AMC show Mad Men, Don Draper taunts his mistress for having purchased a television despite her insistence that she didn’t need one, with the result that she throws the offending device out the window of her apartment in the Village. Draper is mollified by the exchange, but his return home at the end of the episode offers (as it is meant to) a striking contrast. Not only does he return to a house where there is a wife and kids, but the kids are watching TV and Draper settles in with them—because a television is something that you have with your family, not with your mistress.

As an addendum, I still think that even in our decentralized media environment there is something to the television holding symbolic weight as a place for family, whether that is an actual place in a household or something that can be alluded to in fiction. The range of portable devices on which one can watch shows signifies something else, but the television as a place and an object continues to carry this weight. In turn, the violation of this communal aesthetic, such as the image of a single person repeatedly watching shows, heightens the sense of obsession, perversity, or trauma.