
One Nation Under God

In their struggle against the New Deal, the business lobbies of the Depression era had allied themselves with conservative religious and cultural leaders and, in so doing, set in motion a new dynamic in American politics.

One of the things I like about teaching American history, and particularly twentieth-century US history, is that it is fairly easy for students to see its relevance to contemporary society, which is a reliable way to boost student engagement. One activity I like to do with students is to establish a broad premise, talk with the students about what preconceived ideas are floating around in the zeitgeist, and then work with them to understand where these ideas came from.

For instance, I do this with students when it comes to American religion in the twentieth century. I begin by asking them whether the United States is, broadly speaking, a religious country in general and a Christian country in particular. Some students will bring up the establishment clause in the Constitution, but eventually students say yes. I then ask how we know this, and, among a variety of answers, some student will inevitably point to “In God We Trust” printed on currency. I then walk the students through some of the midcentury religious revivals and particularly the emergence of organized religion into the political sphere in the 1950s, out of which the public declarations of faith in the pledge of allegiance and on US currency developed. My point with this activity isn’t to challenge anyone’s faith or even to explicitly reject the idea that most Americans in any given year considered themselves Christian, but rather to encourage students to see how, when, and why these symbols came into being and therefore to think critically about what they mean.

I mention this example because I recently had a chance to read prominent #twitterstorian Kevin Kruse’s book One Nation Under God. The elevator pitch for this book is that Kruse goes looking for how the phrase “one nation under God” made its way into the pledge of allegiance in the 1950s. I was aware of the religious revivals of the 1950s and had always interpreted the addition as the realization of Cold War branding of the United States as distinct from “godless” communism, though, in retrospect, that was a lazy assumption.

Kruse traces the origin of these revivals and the first steps to bring religion from the realm of the personal into public life further back, into the 1930s, when, he says, corporate leaders looked to religion to rehabilitate their brands from the stigma of the Depression. In turn, and from a combination of personal piety and cynical self-interest, they helped sponsor events that sparked the 1950s revivals. The wave of religion encouraged and manipulated by President Eisenhower changed the nature of public religion in America and created an alliance between capitalism and Christianity that dovetailed with American Cold War propaganda. In addition to the changes made to the pledge of allegiance and the face of currency, it was in this same period that presidents began hosting the National Prayer Breakfast, which has since become an annual event.

Where Americans once blanched at bringing church and state too close together because of the risk of corrupting the church, Kruse documents how, in some of the early controversies over children reciting non-denominational prayers and the pledge of allegiance in schools, the ACLU was hesitant to take up the cases on behalf of the parents.

Even though it took me longer to read than I would have liked (a combination of a busy schedule and a lot of detail meant that this was a slow read for me), I really liked One Nation Under God. I knew most of the broad outlines of this story, but the virtue of this book is that Kruse presents a mountain of evidence rather than relying, as I had, on general impressions. And within that evidence there are unexpected developments.

Two of my takeaways came from his discussion of religious faith in schools, disputes that were taken to the Supreme Court.

One was the way in which the religion that made its way into public life was light on doctrine as a way to circumvent theological disputes and generate broad support. Nowhere was this more true than in the attempts to establish a non-denominational prayer to be recited daily in schools in New York. Critics thought its “vague theism” was so diluted as to be meaningless, but it strikes me that this pervasively felt, doctrinally ambivalent Christianity remains a legacy in American public life.

The other was an insight into the composition of the court in the 1950s and early 1960s, when it handed down rulings on whether students should recite a prayer (no, it is not inherently patriotic) and the pledge of allegiance with the added language of “one nation under God” (yes, it is a declaration of patriotism, not a prayer). Kruse documents how some of the staunchest defenders of these decisions were themselves deeply religious and active in their churches, but believed that prescribed school prayer was an unconstitutional establishment of religion.

As an outsider to both the field of American history and mainstream American Christianity, I am sure that there are facets of this book and its ramifications that I missed, but the broad strokes of this evolution in American political discourse were supremely enlightening, both for where these symbols came from and for thinking about how the relationship between business, religion, and government has developed in the decades since.

ΔΔΔ

I finished reading Drago Jančar’s I Saw Her Last Night, a fascinating Slovenian novel about the disappearance of a woman in the last years of World War II, told through the memories of five people who knew her. I’m between books at the moment, but leaning toward reading William Gibson’s Neuromancer next.

My Own Devices

My usual way of being could probably be summed up as chronically un-hip. I usually read books, listen to music, and see movies well after their moment has passed. When culture swings back around to where I am, such as with A Song of Ice and Fire (which I started reading in about 2000, when I was in early high school), the hipness doesn’t quite stick. I generally have pretty good taste, in my obviously biased opinion, so this un-hipness doesn’t bother me. It just is.

This is all preamble to talking about a book that, in reading it less than a month after its publication, might possibly be the hippest thing I have ever done in my life. That book is My Own Devices, a memoir by the Minneapolis hip-hop artist Dessa.

The essays in this collection consist of stories from and about Dessa’s early career as a touring artist that put friends, family, and challenges front and center. Each essay could stand on its own (and several were previously published), but the through line is her side of an extended, intermittent romantic relationship. Heartbreak became an addiction that defeated “time, distance, and whiskey”—what Dessa calls “over-the-counter remedies” that included moving to New York so that she wouldn’t be in the same town as her ex. The collection reaches its climax in the essay “Call off your Ghost,” which recounts her self-crafted experiments with fMRI scanning and neurofeedback conditioning to break this addiction.

Dessa writes beautifully, which is one of the reasons I like her music so much, and in fact there is a passage early on about her ex’s sage advice to rap more like she writes. Pulling back the curtain on these parts of her life puts the songs into greater context, particularly for the early releases that aren’t quite as fully developed as the more recent albums. But that alone would make this collection of interest only to fans of her music, when it is so much more. What I found particularly effective here is the self-portrait of a bright young woman who is simultaneously curious about the world, wrapped up in her neuroses, and ambitious to the point of grating against her lack of accomplishment.

I can’t do My Own Devices justice here. It is a thoughtful meditation on family, friends, and art, with a little less science than I was anticipating from the subtitle. (Science shows up in a couple of essays, generally as an adjunct to family or heartbreak.) Dessa is refreshingly blunt, acknowledging her imperfections even while telling her story in a sympathetic light. In short, I loved My Own Devices, going so far as to complain online that I had started it at a time when I knew I would have to put it down, and I am adding it to the list of Dessa’s work that I recommend to just about everyone I meet.

ΔΔΔ

I am now reading Kevin Kruse’s One Nation Under God, which argues that the public performance of religious piety in American life was invented in the 1930s by an alliance of corporate executives and religious leaders who opposed the New Deal, and that it came to fruition during the post-war religious revival of the 1950s.

The Man Who Spoke Snakish

It really is ridiculous how persistently everything in my life has gone awry. It reminds me of a bird that builds itself a nest high in a tree, but at the same time as it sits down to hatch, the tree falls down. The bird flies to another tree, tries again, lays new eggs, broods on them, but the same day that the chicks hatch, a storm comes up and that tree, too, is cloven in two.

The end is at hand, and there’s no point in holding back on the good stuff. So what are you going to offer your guests?

Where to begin? Leemet, the narrator and protagonist, is the last man who knows snakish, an ancient language that marks an age-old bond between humans and snakes and gives people control over most animals. Deer offer themselves to be eaten and wolves are tamed for milk and as steeds in time of war. Bears are more of a problem, though usually because they are the lotharios of the forest rather than for their ferocity. The speakers of snakish live in the forest, in harmony with nature.

In previous generations they lost a war against the iron men who came from over the sea. Now the old ways are dying. People give up the forest to live in the village, show their butts to the sun while harvesting grain, and eat bread, which causes their tongues to become too clumsy to speak snakish. Leemet himself was born in the village, where his parents had moved before returning to the forest to claim his family inheritance. They are the exception, and only a few traditionalists, including the last remaining Primates, remain. Among those are Tambet and his family. Tambet never forgave Leemet for having gone to the village and clings with ever greater desperation to what he sees as the old ways, but his daughter Hiie becomes one of Leemet’s playmates whenever she can escape her father’s wrath. Life in the forest is good for Leemet, but the days when speakers of snakish had venomous fangs, let alone the ability to summon the Frog of the North to repulse the iron men, are gone.

The Man Who Spoke Snakish spins the story of this vanishing world from after an inflection point has been passed. Leemet grows up in a world that is effectively dead. The result is a narrative that is at once a delightful coming-of-age story and a poignant examination of the nostalgia for lost tradition. The latter particularly emerges through a number of characters who organize their lives around increasingly bizarre traditions. They claim that these traditions are ancient, whether brought from a far-off land or simply how people used to live in Estonia, but what they are doing now is utterly unrecognizable from, and usually unrelated to, whatever seed they might have sprung from—something Leemet learns when he finally meets his grandfather…who lost his legs after a battle with the iron men and is now collecting bones from men he kills in order to construct a pair of wings.

I came to The Man Who Spoke Snakish purely because I wanted to read a book translated from a language I hadn’t read before. I had never heard of Andrus Kivirähk, let alone read anything by him, when I purchased this and a Slovenian novel after a bit of research into “best novel” lists on the internet. I was not disappointed.

In a word, this book is spectacular. Much like a Miyazaki film, its whimsical prose belies the fact that Kivirähk also captures something fundamental about the invention and destruction of tradition. The fact that the story is told as a folktale among a lower stratum of society straining beneath the rule of the church and the knights is handled so deftly that it is almost invisible. Frequently these choices mute the impact of individual deaths, as though to show that it isn’t the loss of the individual, but of the collective, that is the real tragedy. The Man Who Spoke Snakish has its flaws, including that most of the characters are fun but flat, but I found myself spirited away and loving every page.

ΔΔΔ

I recently finished Lorrie Moore’s collection Bark, which was well-crafted, but left me once again trying to figure out what it is about short stories that usually makes them fall flat for me. I’m now reading Dessa’s fabulous new book My Own Devices.

Narratives Matter

An excerpt of a new book appeared in Salon this week, provocatively titled “Why Most Narrative History is Wrong.” The book is similarly provocative, claiming in its subtitle to reveal “the neuroscience of our addiction to stories.” Naturally this caused a series of knee-jerk reactions that spawned long Twitter threads. I had a similarly impulsive response to the chapter, but also wanted to respond to it in good faith before returning to a point the author and I actually agree on: that narratives—the stories we tell ourselves—are fundamental to human societies. Indeed, my distaste for this piece emerges from the consequences of this point.


The View From Flyover Country: Dispatches from the Forgotten America

One cannot solve a problem until one acknowledges a problem exists.

People hate complaining because they do not like to listen. When you listen to someone complaining, you are forced to acknowledge them as a human being instead of a category. You are forced to witness how social systems are borne out in personal experience, to recognize that hardship hurts, that solutions are not as simple as they seem.

Sarah Kendzior is an expert on totalitarian regimes, particularly in Central Asia, and a journalist based in St. Louis whom I’ve followed on Twitter for some time. The View from Flyover Country is a collection of essays penned between 2012 and 2014 on issues that range from media to race to higher education. I read the entire collection in about three sittings last weekend, only setting it down when some of the essays hit a little too close to home.

The fact that The View From Flyover Country collects essays originally published for Al Jazeera leads to a certain amount of repetition, as one would expect from a series of articles published on their own, but it also offers scathing critiques of the present economic and social order in easily approachable chunks that cause her call to action to swell like a flood. Kendzior laces her criticism of the status quo with a deep humanism, making the case that the economic systems that have already shattered at least one generation, and are hard at work on a second, deprive many Americans of not just economic opportunity but basic dignity.

In the post-employment economy, is self-respect something we can afford? Or is it another devalued commodity we are expected to give away?

The foundations of the system as Kendzior identifies it are rising inequality paired with increasingly expensive barriers to entry into lucrative careers, which create a pay-to-play environment. Simultaneously, she articulates that we are living in a post-employment economy in many sectors, where corporations aim to stay profitable by reducing wages and offloading costs onto the workers. These conditions, combined with the toxic potential of the new media landscape, create totalitarian echoes.

Kendzior penned these essays well before the 2016 presidential election, but that campaign season and the events that have unfolded since have done nothing to invalidate her words. If anything, the curtain was stripped back to reveal systemic and ideological weaknesses in the American system. What people had previously brushed off with a wave toward a black president, the long strides made by women, or a general sense of American achievement—some of which is warranted—has been shown to be gilding atop gross and growing inequality.

There are no easy solutions and Kendzior doesn’t pretend that there are. But to the extent that the first step to making things better is to acknowledge that a problem exists, The View From Flyover Country should be mandatory reading for everyone in the United States.

ΔΔΔ

I was under the weather this week, which managed to consume most of the energy I had left for reading, but I did start The Man Who Spoke Snakish, a fable-like novel by the Estonian author Andrus Kivirähk. It is too soon to judge the book, but I enjoyed the first few pages.

More than a numbers game

There is a labor crisis in higher education.

The myth of the well-compensated, insulated, and out-of-touch professor has a powerful grip on the American imagination, but in fact applies only to a few people, even among those lucky enough to have a tenured position. (The real money, comparatively speaking, is in administration, unless you happen to be a coach.) Most professors, including those on the tenure track, are not well-paid, particularly relative to their level of education. Setting that aside as a separate, albeit related, issue, the larger crisis is that courses are increasingly being taught by adjunct professors with too little pay, no benefits, and no job security.

This is not new. The old line was that you should inquire how much of the teaching at a school is done by graduate instructors, and adjuncts are the latest iteration of the same forces that cause schools to fill credit hours with cheap labor.

In the sense that many, though not all, schools have a bipolar mission of teaching on the one side and world-leading research from their (full-time) faculty on the other, this split makes sense. As much as research influences teaching and vice versa, both take time to do well. In the humanities, too, research generally doesn’t make money, but it remains a benchmark for the university on various external rankings, which, in turn, is part of the pitch to bring in students. The solution is generally to bring in cheap labor to fulfill the teaching mandate, thereby creating a surplus that can be paid to the full-time faculty in the form of salary and research support, including travel and reduced teaching loads. Simple.

Only not so much. With state divestment from higher education, the financial burden for operating a university is frequently being passed on to the students, now branded as consumers, in the form of tuition, fees, and, eventually, solicitations for donations as alumni while they are still paying off loans for the initial investment. At the same time, significant teaching loads are passed to underpaid and overworked contingent faculty. This is not to say that contingent faculty are bad teachers—many are excellent—but that while the cost to the student goes up, the combination of financial precarity and insufficient resources impedes the ability of many of their teachers to help them reach their potential. Something like 75% of all faculty teaching in colleges are now in non-tenure-track positions, working under a range of titles and for a median pay of $2,700 per course.

These economic issues are fundamentally important to the future of higher education, a top-heavy system that feels many days like it is teetering precipitously. It is a matter of when, not if, something is going to give.

But that is not what prompted this post.

In response to a recent report on the issues surrounding contingent labor and another that 79% of anthropology PhDs do not gain employment in tenure-track positions, I saw the inevitable suggestion that the solution to this problem is to reduce the production of PhDs. The idea is that this is a crisis created by supply far outstripping demand, which is true enough, but it doesn’t acknowledge the underlying structures that are shaping demand.

The optimistic, if morbid, line even when I started graduate school in 2009 was that it was just a matter of waiting for the rapidly aging generations of professors to give up their posts one way or another. Not that the job market would be easy, but that there would be a wave of jobs that would make it easier. Before long it became apparent that the Great Recession of 2008, which struck right as I was graduating from college, marked an inflection point for higher education. Many of those older faculty members were clinging to their jobs not out of malice, selfishness, or obliviousness, but because they believed that their positions would not be replaced when they left. They were right. Their courses are taught by contingent faculty and the tenure lines largely boarded up and forgotten. This is the new normal.

These systemic changes are not unique to higher education, I should add. I’ve recently been reading Sarah Kendzior’s The View From Flyover Country, where she talks at length about the seismic changes to the American economy after 2008 as companies looked for ways to remain profitable for stockholders. Universities are a little bit different because many schools are among the institutions most affected by government divestment, but there are many broad similarities.

Nevertheless, I am not in favor of a widespread slashing of graduate programs.

First, reducing the number of PhDs is not going to solve the labor crisis. There is already a long line of qualified candidates. In 2012, two schools, Harvard University and the University of Colorado, received backlash after stating in job ads that candidates more than a few years past graduation need not apply. Moreover, cutting positions in graduate programs does nothing to address the structural factors underlying the decline of tenured positions. In fact, cuts to graduate programs could conceivably accelerate the cuts to full-time positions because graduate programs are one of the justifications for keeping tenured faculty.

Second, the remaining graduate programs would invariably exist in a handful of elite schools, which already produce most of the graduates who win the tenure-track job lottery. This list of elite schools is not immutable, but it tends to favor those that already have large endowments. As is true elsewhere in American society, fluctuations in financial fortune tend to be much larger for schools without these inheritances.

In theory, limiting graduate education to wealthy schools would create a more ethical environment in terms of pay for graduate students, as well as provide them with adequate research support, but it would also develop scholars and teachers in an environment radically different from the one where most professors work—not to mention the ones their students will be coming from. As with my comments about adjuncts above, this is not meant to denigrate people who go through elite institutions, many of whom are deeply concerned with issues of precarity and austerity and who do not come from privileged backgrounds. At the same time, reducing spots reduces the opportunity for people who are not already introduced to academic life, either during their undergraduate education or through individual mentorship, usually by someone with connections to those schools. Similarly, for as much scholarship as comes out of people working in top-tier programs, they cannot cover everything. As in any number of fields, visibility and representation matter. A retreat toward the proverbial ivory tower reinforces the perception of a barrier between the intellectual elite and everyone else.

There are deep ethical issues with how graduate programs in the humanities approach training, regardless of what the future of the professoriate looks like. There needs to be greater acknowledgement of and preparation for so-called alt-ac jobs, and a support system in place to help people find employment with livable wages. That is, there needs to be a reconsideration of the purpose of graduate school, with teaching in college being just one potential outcome.

(To be fair, this is easier said than done and I see programs coming to grips with this reality and beginning to implement changes, but too little and too slowly, and without enough action to counteract the emotional trauma of the current system.)

But there is also a larger point. People pursue advanced degrees for all sorts of reasons, including interest. This is a good thing. I may sound impossibly, naively idealistic, but I want to live in a society that supports and values education not out of a desire for credentialism but because these opportunities are where creative innovation is born. Eliminating graduate programs beyond those in well-funded schools makes sense if you look at the problems facing higher education as a simple supply-and-demand numbers game, but in fact threatens to realize some of the worst stereotypes about academia.

A Brave New World

There was something called liberalism. Parliament, if you know what that was, passed a law against [sleep teaching]. The records survive. Speeches about liberty of the subject. Liberty to be inefficient and miserable. Freedom to be a round peg in a square hole.

You’ve got to choose between happiness and what people used to call high art. We’ve sacrificed high art. We have the feelies and the scent organ instead.

Civilization has absolutely no need of nobility or heroism. These things are symptoms of political inefficiency. In a properly organized society like ours, nobody has any opportunities for being noble or heroic.

I had to read A Brave New World over the summer before my senior year of high school, the first book for AP English. I hated it, and it was from that experience that I developed my theory that I had a natural aversion to books I had to read. (My love of The Great Gatsby is the exception that proved the rule.) While some of the books read for high school still hold no appeal for me, this is one I’ve been meaning to re-read for some time now. As with Fahrenheit 451, 2018 seemed like an appropriate year to work through some of these classic dystopian stories.

The brave new world in this book is a perfectly stable global utopia achieved through artificial reproduction, genetic manipulation to create a clear caste hierarchy that descends from “alpha double plus” through “epsilon”, and conditioning to ensure that each person not only accepts their place in society, but embraces it as ideal. Free love is mandatory as a way to prevent jealousy and possessiveness, and everyone is regularly treated with powerful emotional stimulation and, more importantly, with doses of soma, a drug distributed by the state. Doped up on pleasure, people abandon interest in anything else.

There are dribs and drabs of how this utopia that worships Henry Ford came into existence, a compromise after a series of destructive wars in the distant past. Despite genetic engineering, the world is not uniform. Places deemed too inhospitable are left as “Savage Reservations,” and islands like Iceland and the Falklands, far from the metropole, are the preferred landing place for people with mildly heretical ideas.

A Brave New World follows two arcs, tied together by the mildly unorthodox alpha, Bernard Marx. In the first arc, Bernard sets a date with the “pneumatic” Lenina Crowne. Lenina is herself under scrutiny for becoming too attached to her current partner, and so she agrees to date Bernard, who is uncommonly short and aloof, particularly for an alpha. The arc concludes with the pair going on a vacation to the Savage Reservation in New Mexico. The second arc follows their return from New Mexico, taking with them a dark secret from another vacation taken decades earlier: a woman who had been left behind and the child she bore, not entirely by choice, against all strictures of society.

The narrative tension of A Brave New World largely centers on the fate of John, “the Savage,” and his choice between submitting to the constraints of a society that would provide his every pleasure and the pain of freedom. (In his foreword to the volume I read, Huxley wrote that if he were to write the book over again, he would include a third option.) I appreciate Huxley’s social commentary more now than I did in high school. This new world is one of abject consumerism where it is verboten to repair an item when you could just replace it and maximum pleasure is the highest calling. Possessiveness breeds jealousy, pain breeds strife, and independent thought leads to both. Thus the central authority maintains its power by tamping down those instincts.

And yet, I found the characters rather flat and the plot so thin that the book reduces to a deterministic parable about freedom and happiness.

The larger question I had going into this book, though, was how it stacked up against Fahrenheit 451 and 1984. On the one side, A Brave New World shares with Bradbury’s dystopia an emphasis on pleasure and freedom from heretical thought, but the latter suggests communal enforcement. On the other, it shares totalitarianism with 1984, albeit one of a consumerist make.

1984 receives too little appreciation because it was assumed that it could never happen here, where society is governed by liberal political institutions. (Note: this judgement may be undergoing revision in light of recent events.) Where the state in 1984 exploits difference, the single world state in A Brave New World erases difference in every meaningful way other than caste, then conditions each caste to appreciate its position in society—and the reader only sees the world from the perspective of people in the top two castes. This is a world that doesn’t have to address the consequences of unapologetic waste and that has no enemies outside certain tendencies in human nature. In short, A Brave New World is a dystopia for a happier time.

ΔΔΔ

The semester is in full swing, but I’m still carving out small slivers of time to read. I finished Quesadillas by Juan Pablo Villalobos, a slim, irreverent novel about a poor family in small-town Mexico with middle-class delusions, and started reading Sarah Kendzior’s collection The View from Flyover Country.

A Dead End Lane

There is a “dead end” sign where I don’t remember there being one before. The road just ended. In practice, that is. Officially, the dead end is where the maintained road turns into a long-since overgrown class-4 road still drawn on old maps.

Returning to the rural hilltop where I grew up always makes me think. The sodden smell of decaying leaves is comforting, even when accompanied by the dull whine and sharp bite of hundreds of insects. But for all the familiarity of the dirt roads turned into tunnels by fifty shades of green, there are subtle changes. There are more signs, for one thing. The roads have names, and the turnaround for the school bus is labelled with a warning to anyone who would think to park there. There are also more houses. Big houses, built visibly close to the road, in place of the small, frequently ramshackle dwellings sitting in clearings carved from the forest.

Below the hills, other parts of town are the same way. The elementary school is still there, its doors open with just under fifty students, but so is the old general store building that has simply decayed since it was shuttered close to twenty years ago. The radar traps are new, flashing a warning to drivers who ignore the speed limit through a village that seems more than a little irritated at being ignored, but unable to do much about it.

People ask me about where I come from whenever I return from these trips. Vermont is a curiosity to them, an edenic wilderness that defies the modern world or a bastion of progressive politics epitomized by a frazzled-looking white man with a thick Brooklyn accent. (When Sanders first moved to Vermont in the 1960s it was to Stannard, a nearby town where several of my high school friends lived.) It is, after all, the birthplace of Phish and home to Bread and Puppet.

Vermont has certainly earned these reputations in recent decades, with a left-leaning congressional delegation and the early recognition of homosexual partnerships. My general read is that politics in Vermont has a strong libertarian streak, and that the intimate nature of politics in such a small state did more to get the delegation repeatedly re-elected than its voting record, the state historically having been a bastion of Republican politics. (Between the Civil War and 1988, Vermont’s electoral votes went to a Democrat once, in 1964.) Its reputation, moreover, ignores the virulent backlash against the civil union law that went into effect in 2000, the so-called “Take Back Vermont” movement—not to mention an ugly history of bigotry that includes a small but virulent anti-Catholic strain of the KKK in the 1920s. More recently, when students proposed that Vermont adopt a Latin motto there was outcry from people who believed that “Latin” meant “Latin American.” Their mistake speaks volumes both about the makeup of the population and about some of the limits of the education system, despite generally positive rankings.

These are young forests. The foundations of farmhouses and lines of stone walls are common sights when walking in the woods, serving as a reminder that the state was largely deforested in the 1800s. In some ways things haven’t changed much. From the right vantage point, the granite quarries still stick out as scars against the wooded hills and agriculture remains a significant part of the economy, even as forests have reclaimed the fields.

In truth, these things work hand in hand. Vermont’s isolation and economic challenges, particularly in the corner where I grew up, lead to poverty, but they also make it an attractive destination for artists and back-to-the-earth types. The result is a population in flux, with the percentage of residents born in-state below the national average.

I haven’t lived in Vermont for more than a few months in a year since starting college in 2004, and haven’t lived there at all in a decade. I can’t remember the last time I talked to an elementary school classmate, but I receive periodic updates. Some are doing well, but I more frequently hear about the ones who have struggled with drugs and the law. One died earlier this year. (I do better with people who weren’t in my specific class, as well as people from high school.) Time passes; places and people change as variations on a theme. I would like to move back to Vermont, should the opportunity present itself, but that seems like a remote possibility right now. At the same time, growing up in a rural town that had its largest population in the 1840 census informs what I do as a historian and teacher.

Writing this from my couch in Columbia, Missouri, I fear that I have lost my thread. I wrote the opening sentences of this post on my phone from that wooded hilltop where I had no cell reception. All I had were a few lines, a couple of observations about the dead-end dirt road I grew up on, then and now, and a sense of omen about a sign that I couldn’t quite put my finger on. I still don’t know the conclusion, except that a launch pad is a dead end of another form.

The Plot Against America – Philip Roth

For most of my life Philip Roth’s novels have existed in an environment just beyond my radar. I knew about them in a general sense and was aware that he was held in high esteem as a literary author, but that is as far as it went. Then he died. After several podcasts I listen to did retrospectives of his career, I decided I should change that.

The Plot Against America, Roth’s 2004 novel, is a grim alternate history that explores the issue of antisemitism in America.

The story takes place in the narrator’s (a young Philip Roth) youth in Newark, when Charles Lindbergh makes a surprise appearance at a deadlocked 1940 Republican National Convention and sweeps his way to the nomination. Lindbergh’s campaign frames the choice as between Roosevelt’s warmongering and America First, as he hops from city to city in his personal plane giving speeches on the airfield. Roosevelt, by contrast, is old-fashioned and traditional. Failing to appreciate the threat posed by Lindbergh, Roosevelt loses the election and retires from public life to his estate in New York.

For Roth’s Jewish family, the election is a disaster. Around every corner are people with anti-semitic opinions now empowered by the president, and America-Firsters who regard Roosevelt’s globalist supporters as traitors. With the US committed to non-intervention, Philip’s cousin Alvin runs away from home to join the Canadian army to fight Hitler. Roth’s father begins listening exclusively to the left-wing demagogic radio personality Walter Winchell, who loudly denounces Lindbergh as a fascist. Every action taken by the government is tinged with bigotry, he believes, the first step toward a pogrom.

The “Just Folks” program sends Jewish youths from urban areas to farms in the heartland. Philip’s older brother Sandy ends up in Kentucky for a summer working on a tobacco farm and returns a convert to the mission of the OAA—the Organization of American Absorption. Then Alvin returns, having lost a leg in combat. Further exacerbating tensions in the family is that Philip’s aunt Evelyn goes to work for Rabbi Lionel Bengelsdorf, the head of the OAA office in New Jersey.

The Plot Against America is presented as a retrospective of a dark episode in American history that both reveals a psychic scar in the country’s collective conscience and ends as abruptly as it began. Roth’s youth during the events described and the nature of the conspiracy leave it unclear what happened to bring Lindbergh to office, let alone what happened while he was there that leads to a bloody climax.

The national and historical developments create the backdrop for what is, ultimately, a family drama. The Lindbergh administration works to break up Jewish enclaves in cities like Newark, and the Roth family is split between those who hold to their convictions, such as his father, those who want to ignore politics, and the collaborators, whether out of naked opportunism or youthful naivete. The characters are vividly drawn, frequently in the graphic detail and sharp colors of youthful memory. There are good gentiles in The Plot Against America, much as there are bad Jews. In both cases Roth captures something fundamental to and fundamentally fragile in the soul of America.

Although it was published in 2004, The Plot Against America was an eerie read for 2018, right down to a Scandinavian summit at which an American president with a fervent base is openly condemned for fawning behavior toward a foreign leader, leading commentators to ask what that leader has on the President. Similarly, American prejudices are papered over by a tradition of constitutionalism, but only barely, and there is a preference for collective amnesia rather than for resolution.

The Plot Against America was hard to read, but rather than being a book that has lost its edge since its publication, it is one that has only become sharper. That is probably too lofty a standard to set for when I get to Roth’s other books, but I can now say with certainty that I am going to be reading more.

ΔΔΔ

Next up, I just started reading A Brave New World. I read it in high school but remember nothing except a general sense of distaste. Like with Fahrenheit 451, I want to give it a fair shake.

First day fragments

My fall semester begins in earnest today, with the first session for both of my classes. I don’t have a single back-to-school post idea, but rather a bunch of loosely connected ones, so I decided to go with a fragmentary format.

“I didn’t get everything done” is a standard lament for academics come late August, bemoaning some combination of the cult of productivity, human limitations, and the difficulties of researching during the school year. I am no exception. I set an ambitious schedule for reading scholarship beyond my immediate research, but only managed to read a handful of books and articles, and a couple of books on teaching.

There are a couple of explanations for this failure. One is that the summer quickly became very busy, with multiple family trips that had less down-time than anticipated, meaning that there was neither opportunity for reading nor for a deep recharge of my batteries. Another was that I taught an intensive summer World History course in June, so much of my spare reading went toward preparing for class. A third was that seemingly every spare moment around these time commitments was sucked up by revising my dissertation into a book. My goal was to have it under review by the start of class, but I missed that deadline, too. At least I am in a position to meet my revised goal of August 31 for that one…

ΔΔΔ

There has been a movement in recent years to normalize failure, particularly in academia, leading people to share their failures on Twitter over the last week. I mentioned there that I respect the movement, and I appreciate the baseball analogy: if you’re a batter and only “fail” (make an out) at the plate six out of every ten times, you belong in the hall of fame. (There are obviously other statistics from baseball that could make that more or less extreme. If you’re a pitcher and batters swing and miss just 20% of the time, you’re incredible, but if that is the percentage of the time you throw strikes, then you probably quit playing in little league.) I respect the impulse to normalize failure because it is inevitably going to happen, regardless of how generous and kind the academy becomes. Everyone is going to experience article/grant/abstract/job/proposal rejections for a host of reasons. Sometimes those reasons are good (the project needs more work), sometimes they are petty, and a lot of the time it is a simple numbers game that has almost nothing to do with what was proposed.

My shadow CV includes all of these things: four article rejections, two more revise-and-resubmits that were later accepted, at least seven paper abstracts rejected that I can think of offhand, and too many funding applications for fellowships and travel grants to count. And I am only a little more than a year removed from graduating with my PhD.

At the same time, I found the push to normalize, share, and celebrate failure on social media hard to handle. The main reason is that while failure is normal in the academy, and rejections can be handled deftly with an eye toward improving the project for the next time around, it is also a sign of privilege to be able to reflect on a Shadow CV. It comes from someone still “in the game”, as it were, and with every round of shares I heard “this is what you *should* have been applying for.” As in, your failures themselves are inadequate because the “stars” fail bigger and better.

Then pair this with the part I left out of my Shadow CV: all the jobs I’ve applied to without making the long list. The Shadow CV is meant to normalize failure so that people can better overcome the natural fear of it and thereby reduce anxiety, but when mixed with too few academic jobs to go around and the sheer amount of time that applying for them takes, it just exacerbated mine.

ΔΔΔ

I’m looking forward to teaching both of my classes this semester. In one, I am teaching my own syllabus for the second time; in the other, I am the sole instructor for the first time. I had the chance to teach on my own a little bit during graduate school, but this is my second year of continuously teaching my own courses and reading up on pedagogy, so I am now in a position to synthesize some principles for my classroom.

First Principle: Learning, not grades. I do not care about grades beyond making sure that I have created a reasonable and achievable grade scale for the class. My goal as a teacher is to help students develop practical skills such as writing and the ability to understand the world through critical analysis and synthesizing information. Toward that end, I believe that many common assessment tools that are built for scale are next to useless in actually assessing learning. I design my classes around assignments that require students to develop arguments through writing and that build on each other so that students can show improvement in tasks that are not easy.

Second Principle: Empathy. Students are adults who have more demands on them than even I did when entering school fifteen years ago. I aspire to treat them like adults with responsibilities, just one of which is my class. College is “the real world,” where students are on their own for the first time, and I want to be a mentor/coach/guide. This means having empathy, and encouraging them to take ownership of their education by talking with me when they have a conflict or need help.

Third Principle: Engagement. “Meaningful learning experiences” is a hot topic, though my mother assures me that this has been the key phrase for many decades now. Every class is going to be selective in the material it covers, so I see my job as giving students the tools to learn more and piquing their curiosity to want to do so. This means developing activities and assignments that require engagement, through games, debates, and projects where students take ownership of the material. This has not come easily to me as someone who found history books thrilling in high school, but it is something that I am committed to improving in my own teaching.

There are others, but these are my first three.

ΔΔΔ

Without further ado, let the semester begin!