The Man Who Spoke Snakish

It really is ridiculous how persistently everything in my life has gone awry. It reminds me of a bird that builds itself a nest high in a tree, but at the same time as it sits down to hatch, the tree falls down. The bird flies to another tree, tries again, lays new eggs, broods on them, but the same day that the chicks hatch, a storm comes up and that tree, too, is cloven in two.

The end is at hand, and there’s no point in holding back on the good stuff. So what are you going to offer your guests?

Where to begin? Leemet, the narrator and protagonist, is the last man who knows snakish, an ancient language that marks the old bond between humans and snakes and gives people control over most animals. Deer offer themselves to be eaten and wolves are tamed for milk and as steeds in time of war. Bears are more of a problem, though usually because they are the lotharios of the forest rather than because of their ferocity. The speakers of snakish live in the forest, in harmony with nature.

In previous generations they lost a war against the iron men who came from over the sea. Now the old ways are dying. People give up the forest to live in the village, show their butts to the sun while harvesting grain, and eat bread, which makes their tongues too clumsy to speak snakish. Leemet himself was born in the village before his parents moved back to the forest to claim the family inheritance. They are the exception, and only a few traditionalists, including the last remaining Primates, remain. Among them are Tambet and his family. Tambet never forgave Leemet for having gone to the village and clings with ever greater desperation to what he sees as the old ways, but his daughter Hiie becomes one of Leemet’s playmates whenever she can escape her father’s wrath. Life in the forest is good for Leemet, but the days when speakers of snakish had venomous fangs, let alone the ability to summon the Frog of the North to repulse the iron men, are gone.

The Man Who Spoke Snakish spins the story of this vanishing world from after an inflection point has been passed. Leemet grows up in a world that is effectively dead. The result is a narrative that is at once a delightful coming-of-age story and a poignant examination of nostalgia for lost tradition. The latter particularly emerges through a number of characters who organize their lives around increasingly bizarre traditions. They claim that these traditions are ancient, whether brought from a far-off land or simply how people used to live in Estonia, but what they are doing now is utterly unrecognizable from, and usually unrelated to, whatever seed it might have sprung from—something Leemet learns when he finally meets his grandfather…who lost his legs after a battle with the iron men and is now collecting bones from the men he kills in order to construct a pair of wings.

I came to The Man Who Spoke Snakish purely because I wanted to read a book translated from a language I hadn’t read before. I had never heard of Andrus Kivirähk, let alone read anything by him, when I purchased this and a Slovenian novel after a bit of research into “best novel” lists online. I was not disappointed.

In a word, this book is spectacular. Much like a Miyazaki film, its whimsical prose belies the fact that Kivirähk also captures something fundamental about the invention and destruction of tradition. The fact that the story is told as a folktale among a lower stratum of society straining beneath the rule of the church and the knights is handled so deftly that it is almost invisible. Frequently these choices mute the impact of individual deaths, as though to show that it isn’t the loss of the individual, but of the collective, that is the real tragedy. The Man Who Spoke Snakish has its flaws, chief among them that most of the characters are fun but flat, but I found myself spirited away and loving every page.

ΔΔΔ

I recently finished Lorrie Moore’s collection Bark, which was well crafted but left me once again trying to figure out what it is about short stories that usually makes them fall flat for me. I’m now reading Dessa’s fabulous new book My Own Devices.

Narratives Matter

An excerpt of a new book appeared in Salon this week, provocatively titled “Why Most Narrative History is Wrong.” The book is similarly provocative, claiming in its subtitle to reveal “the neuroscience of our addiction to stories.” Naturally this caused a series of knee-jerk reactions that spawned long Twitter threads. I had a similarly impulsive response to the chapter, but I also wanted to respond to it in good faith before returning to a point on which the author and I actually agree: that narratives—the stories we tell ourselves—are fundamental to human societies, because my distaste for this piece emerges from the consequences of this point.


The View From Flyover Country: Dispatches from the Forgotten America

One cannot solve a problem until one acknowledges a problem exists.

People hate complaining because they do not like to listen. When you listen to someone complaining, you are forced to acknowledge them as a human being instead of a category. You are forced to witness how social systems are borne out in personal experience, to recognize that hardship hurts, that solutions are not as simple as they seem.

Sarah Kendzior is an expert on totalitarian regimes, particularly in Central Asia, and a journalist based in St. Louis whom I’ve followed on Twitter for some time. The View from Flyover Country is a collection of essays penned between 2012 and 2014 on issues that range from media to race to higher education. I read the entire collection in about three sittings last weekend, only setting it down when some of the essays hit a little too close to home.

Because The View From Flyover Country collects essays originally published for Al Jazeera, it has the repetition one would expect from pieces written to stand on their own, but it also offers scathing critiques of the present economic and social order in easily approachable chunks that cause her call to action to swell like a flood. Kendzior laces her criticism of the status quo with a deep humanism, making the case that the economic systems that have already shattered at least one generation, and are hard at work on a second, deprive many Americans not just of economic opportunity, but of basic dignity.

In the post-employment economy, is self-respect something we can afford? Or is it another devalued commodity we are expected to give away?

The foundations of the system as Kendzior identifies it are rising inequality paired with increasingly expensive barriers to entry into lucrative careers, which together create a pay-to-play environment. Simultaneously, she argues that we are living in a post-employment economy in many sectors, where corporations aim to stay profitable by reducing wages and offloading costs onto their workers. These conditions, combined with the toxic potential of the new media landscape, create totalitarian echoes.

Kendzior penned these essays well before the 2016 presidential election, but that campaign season and the events that have unfolded since have done nothing to invalidate her words. If anything, the curtain has been pulled back to reveal systemic and ideological weaknesses in the American system. Where people had previously brushed these off with a wave toward a black president, the long strides made by women, or a general sense of American achievement (some of which is warranted), that progress has been shown to also be gilding atop gross and growing inequality.

There are no easy solutions and Kendzior doesn’t pretend that there are. But to the extent that the first step to making things better is to acknowledge that a problem exists, The View From Flyover Country should be mandatory reading for everyone in the United States.

ΔΔΔ

I was under the weather this week, which consumed most of the energy I had left for reading, but I did start The Man Who Spoke Snakish, a fabulistic novel by the Estonian author Andrus Kivirähk. It is too soon to judge the book, but I enjoyed the first few pages.

More than a numbers game

There is a labor crisis in higher education.

The myth of the well-compensated, insulated, and out-of-touch professor has a powerful grip on the American imagination, but in fact applies only to a few people, even among those lucky enough to have a tenured position. (The real money, comparatively speaking, is in administration, unless you happen to be a coach.) Most professors, including those on the tenure track, are not well paid, particularly relative to their level of education. Setting that aside as a separate, albeit related, issue, the larger crisis is that courses are increasingly being taught by adjunct professors with too little pay, no benefits, and no job security.

This is not new. The old line was that you should inquire how much of the teaching at a school is done by graduate instructors, and adjuncts are the latest iteration of the same forces that cause schools to fill credit hours with cheap labor.

In the sense that many, though not all, schools have a bipolar mission of teaching on the one side and world-leading research from their (full-time) faculty on the other, this split makes sense. As much as research influences teaching and vice versa, both take time to do well. In the humanities, too, research generally doesn’t make money, but it remains a benchmark for the university on various external rankings, which, in turn, are part of the pitch to bring in students. The solution is generally to bring in cheap labor to fulfill the teaching mandate, thereby creating a surplus that can be paid to the full-time faculty in the form of salary and research support, including travel and reduced teaching loads. Simple.

Only not so much. With state divestment from higher education, the financial burden for operating a university is frequently being passed on to the students, now branded as consumers, in the form of tuition, fees, and, eventually, solicitations for donations as alumni while they are still paying off loans for the initial investment. At the same time, significant teaching loads are passed to underpaid and overworked contingent faculty. This is not to say that contingent faculty are bad teachers (many are excellent), but that while the cost to the student goes up, the combination of financial precarity and insufficient resources impedes the ability of many of their teachers to help them reach their potential. Something like 75% of all faculty teaching in colleges are now in non-tenure-track positions, working under a range of titles and for a median pay of $2,700 per course.

These economic issues are fundamentally important to the future of higher education, a top-heavy system that feels many days like it is teetering precipitously. It is a matter of when, not if, something is going to give.

But that is not what prompted this post.

In response to a recent report on the issues surrounding contingent labor and a report that 79% of anthropology PhDs do not gain employment in tenure-track positions, I saw the inevitable argument that the solution to this problem is to reduce the production of PhDs. The idea is that this is a crisis created by supply far outstripping demand, which is true enough but doesn’t acknowledge the underlying structures that are shaping demand.

The optimistic, if morbid, line even when I started graduate school in 2009 was that it was just a matter of waiting for the rapidly aging generations of professors to give up their posts one way or another. Not that the job market would be easy, but that there would be a wave of jobs that would make it easier. Before long it became apparent that the Great Recession of 2008, which struck right as I was graduating from college, marked an inflection point for higher education. Many of those older faculty members were clinging to their jobs not out of malice, selfishness, or obliviousness, but because they believed that their positions would not be replaced when they left. They were right. Their courses are now taught by contingent faculty, and the tenure lines have largely been boarded up and forgotten. This is the new normal.

These systemic changes are not unique to higher education, I should add. I’ve recently been reading Sarah Kendzior’s The View From Flyover Country, in which she talks at length about the seismic changes to the American economy after 2008 as companies looked for ways to remain profitable for stockholders. Universities are a little bit different because many schools are among the institutions most affected by government divestment, but there are many broad similarities.

Nevertheless, I am not in favor of a widespread slashing of graduate programs.

First, reducing the number of PhDs is not going to solve the labor crisis. There is already a long line of qualified candidates. In 2012, two schools, Harvard University and the University of Colorado, received backlash after stating in job ads that candidates more than a few years past graduation need not apply. Moreover, cutting positions in graduate programs does nothing to address the structural factors underlying the decline of tenured positions. In fact, cuts to graduate programs could conceivably accelerate the cuts to full-time positions because graduate programs are one of the justifications for keeping tenured faculty.

Second, the remaining graduate programs would invariably exist in a handful of elite schools, which already produce most of the graduates who win the tenure-track job lottery. This list of elite schools is not immutable, but it tends to favor those that already have large endowments. As is true elsewhere in American society, fluctuations in financial fortune tend to be much larger for schools without these inheritances.

In theory, limiting graduate education to wealthy schools would create a more ethical environment in terms of pay for graduate students, as well as provide them adequate research support, but it also develops scholars and teachers in an environment radically different from the one where most professors work, not to mention the ones their students will be coming from. As with my comments about adjuncts above, this is not meant to denigrate people who go through elite institutions, many of whom are deeply concerned with issues of precarity and austerity and who do not come from privileged backgrounds. At the same time, reducing spots reduces the opportunity for people who have not already been introduced to academic life, either during their undergraduate education or through individual mentorship, usually by someone with connections to those schools. Similarly, for as much scholarship as comes out of people working in top-tier programs, they cannot cover everything. As in any number of fields, visibility and representation matter. A retreat toward the proverbial ivory tower reinforces the perception of a barrier between the intellectual elite and everyone else.

There are deep ethical issues with how graduate programs in the humanities approach training, regardless of what the future of the professoriate looks like. There needs to be greater acknowledgement of and preparation for so-called alt-ac jobs, and a support system in place to help people find employment with livable wages. That is, there needs to be a reconsideration of the purpose of graduate school, with teaching in college being just one potential outcome.

(To be fair, this is easier said than done and I see programs coming to grips with this reality and beginning to implement changes, but too little and too slowly, and without enough action to counteract the emotional trauma of the current system.)

But there is also a larger point. People pursue advanced degrees for all sorts of reasons, including interest. This is a good thing. I may sound impossibly, naively idealistic, but I want to live in a society that supports and values education not out of a desire for credentialism but because these opportunities are where creative innovation is born. Eliminating graduate programs beyond those in well-funded schools makes sense if you look at the problems facing higher education as a simple supply-and-demand numbers game, but in fact threatens to realize some of the worst stereotypes about academia.

Brave New World

There was something called liberalism. Parliament, if you know what that was, passed a law against [sleep teaching]. The records survive. Speeches about liberty of the subject. Liberty to be inefficient and miserable. Freedom to be a round peg in a square hole.

You’ve got to choose between happiness and what people used to call high art. We’ve sacrificed high art. We have the feelies and the scent organ instead.

Civilization has absolutely no need of nobility or heroism. These things are symptoms of political inefficiency. In a properly organized society like ours, nobody has any opportunities for being noble or heroic.

I had to read Brave New World over the summer before my senior year of high school, the first book for AP English. I hated it, and it was from that experience that I developed my theory that I had a natural aversion to books I had to read. (My love of The Great Gatsby is the exception that proved the rule.) While some of the books read for high school still hold no appeal for me, this is one I’ve been meaning to re-read for some time now. As with Fahrenheit 451, 2018 seemed like an appropriate year to work through some of these classic dystopian stories.

The brave new world in this book is a perfectly stable global utopia achieved through artificial reproduction, genetic manipulation to create a clear caste hierarchy that descends from “alpha double plus” through “epsilon,” and conditioning to ensure that each person not only accepts their place in society, but embraces it as ideal. Free love is mandatory as a way to prevent jealousy and possessiveness, and everyone is regularly treated with powerful emotional stimulation and, more importantly, with doses of soma, a drug distributed by the state. Doped up on pleasure, people abandon interest in anything else.

There are drips and drabs of how this utopia that worships Henry Ford came into existence, a compromise after a series of destructive wars in the distant past. Despite the genetic engineering, the world is not uniform. Places deemed too inhospitable are left as “Savage Reservations,” and islands like Iceland and the Falklands, far from the metropole, are the preferred landing places for people with mildly heretical ideas.

Brave New World follows two arcs, tied together by the mildly unorthodox alpha, Bernard Marx. In the first arc, Bernard sets a date with the “pneumatic” Lenina Crowne. Lenina is herself under scrutiny for becoming too attached to her current partner, and so she agrees to date Bernard, who is uncommonly short and aloof, particularly for an alpha. The arc concludes with the pair going on a vacation to the Savage Reservation in New Mexico. The second arc follows their return from New Mexico, bringing with them a dark secret from another vacation taken decades earlier: a woman who had been left behind and the child she bore, not entirely by choice, against all the strictures of society.

The narrative tension of Brave New World largely centers on the fate of John, “the Savage,” and his choice between submitting to the constraints of a society that would provide his every pleasure and the pain of freedom. (In his foreword to the volume I read, Huxley wrote that if he were to write the book over again, he would include a third option.) I appreciate Huxley’s social commentary more now than I did in high school. This new world is one of abject consumerism where it is verboten to repair an item when you could just replace it and where maximum pleasure is the highest calling. Possessiveness breeds jealousy, pain breeds strife, and independent thought leads to both. Thus the central authority maintains its power by tamping down those instincts.

And yet, I found the characters rather flat and the plot so thin that the book is reduced to a deterministic parable about freedom and happiness.

The larger question I had going into this book, though, was how it stacked up against Fahrenheit 451 and 1984. On the one hand, Brave New World shares with Bradbury’s dystopia an emphasis on pleasure and freedom from heretical thought, though the latter suggests communal enforcement. On the other, it shares totalitarianism with 1984, albeit one of a consumerist make.

1984 receives too little appreciation because it was assumed that it could never happen here, where society is governed by liberal political institutions. (Note: this judgement may be undergoing revision in light of recent events.) Where the state in 1984 exploits difference, the single world state in Brave New World erases difference in any meaningful way other than caste and then conditions each caste to appreciate its position in society—and we only ever see this world from the perspective of people in the top two classes. This is a world that doesn’t have to address the consequences of unapologetic waste and that has no enemies outside certain tendencies in human nature. In short, Brave New World is a dystopia for a happier time.

ΔΔΔ

The semester is in full swing, but I’m still carving small slivers of time to read. I finished Quesadillas by Juan Pablo Villalobos, a slim, irreverent novel about a poor family in small town Mexico with middle class delusions, and started reading Sarah Kendzior’s collection The View from Flyover Country.

A Dead End Lane

There is a “dead end” sign where I don’t remember there being one before. The road just ended. In practice, that is. Officially, the dead end is where the maintained road turns into a long-since overgrown class-4 road still drawn on old maps.

Returning to the rural hilltop where I grew up always makes me think. The sodden smell of decaying leaves is comforting, even when accompanied by the dull whine and sharp bite of hundreds of insects. But for all the familiarity of the dirt roads turned into tunnels by fifty shades of green, there are subtle changes. There are more signs, for one thing. The roads have names, and the turnaround for the school bus is labelled with a warning to anyone who would think to park there. There are also more houses. Big houses, brought visibly close to the road, in place of the small, frequently ramshackle habitations sitting in clearings carved from the forest.

Below the hills, other parts of town are the same way. The elementary school is still there, its doors open with just under fifty students, but so is the old general store building that has simply decayed since it was shuttered close to twenty years ago. The radar traps are new, flashing a warning to drivers who ignore the speed limit through a village that seems more than a little irritated at being ignored, but unable to do much about it.

People ask me about where I come from whenever I return from these trips. Vermont is a curiosity to them, an edenic wilderness that defies the modern world or a bastion of progressive politics epitomized by a frazzled-looking white man with a thick Brooklyn accent. (When Sanders first moved to Vermont in the 1960s it was to Stannard, a nearby town where several of my high school friends lived.) It is, after all, the birthplace of Phish and home to Bread and Puppet.

Vermont has certainly earned this reputation in recent decades, with a left-leaning congressional delegation and the early recognition of homosexual partnerships. My general read is that Vermont politics has a strong libertarian streak and that the intimate nature of politics in such a small state, more so than their voting record, has helped get the delegation repeatedly re-elected; historically the state was a bastion of Republican politics. (Between the Civil War and 1988, Vermont’s electoral votes went to a Democrat once, in 1964.) Its reputation, moreover, ignores the fierce backlash against the civil union law that went into effect in 2000, the so-called “Take Back Vermont” movement—not to mention an ugly history of bigotry that includes a small but virulent anti-Catholic strain of the KKK in the 1920s. More recently, when students proposed that Vermont adopt a Latin motto, there was an outcry from people who believed that “Latin” meant “Latin American.” Their mistake speaks volumes both about the makeup of the population and about some of the limits of the education system, despite generally positive rankings.

These are young forests. The foundations of farmhouses and lines of stone walls are common sights when walking in the woods, serving as a reminder that the state was largely deforested in the 1800s. In some ways things haven’t changed much. From the right vantage point, the granite quarries still stick out as scars against the wooded hills and agriculture remains a significant part of the economy, even as forests have reclaimed the fields.

In truth, these things work hand in hand. Vermont’s isolation and economic challenges, particularly in the corner where I grew up, lead to poverty, but also make it an attractive destination for artists and back-to-the-earth types. The result is a population that is in flux, with the percentage of the population born in-state falling below the national average.

I haven’t lived in Vermont for more than a few months in a year since starting college in 2004 and haven’t lived there at all in a decade. I can’t remember the last time I talked to an elementary school classmate, but receive periodic updates. Some are doing well, but I more frequently hear about the ones who have struggled with drugs and the law. One died earlier this year. (I do better with people who weren’t in my specific class, as well as people from high school.) Time passes, places and people change as variations on a theme. I would like to move back to Vermont, should the opportunity present itself, but that seems like a remote possibility right now. At the same time, growing up in a rural town that had its largest population in the 1840 census informs what I do as a historian and teacher.

Writing this from my couch in Columbia, Missouri, I fear that I have lost my thread. I wrote the opening sentences of this post on my phone from that wooded hilltop where I had no cell reception. All I had were a few lines, a couple of observations about the dead-end dirt road I grew up on, then and now, and a sense of omen about a sign that I couldn’t quite put my finger on. I still don’t know the conclusion, except that a launch pad is a dead end of another form.

The Plot Against America – Philip Roth

For most of my life Philip Roth’s novels have existed in an environment just beyond my radar. I knew about them in a general sense and was aware that he was held in high esteem as a literary author, but that is as far as it went. Then he died. After several podcasts I listen to did retrospectives of his career, I decided I should change that.

The Plot Against America, Roth’s 2004 novel, is a grim alternate history that explores the issue of antisemitism in America.

The story takes place in the narrator’s (a young Philip Roth) youth in Newark, when Charles Lindbergh makes a surprise appearance at a deadlocked 1940 Republican National Convention and sweeps his way to the nomination. Lindbergh’s campaign frames the choice as between Roosevelt’s warmongering and America First, as he hops from city to city in his personal plane giving speeches on the airfield. Roosevelt, by contrast, is old-fashioned and traditional. Failing to appreciate the threat posed by Lindbergh, Roosevelt loses the election and retires from public life to his estate in New York.

For Roth’s Jewish family, the election is a disaster. Around every corner are people with anti-semitic opinions now empowered by the president, and America-Firsters who regard Roosevelt’s globalist supporters as traitors. With the US committed to non-intervention, Philip’s cousin Alvin runs away from home to join the Canadian army to fight Hitler. Roth’s father begins listening exclusively to the left-wing demagogic radio personality Walter Winchell, who loudly denounces Lindbergh as a fascist. Every action taken by the government is tinged with bigotry, he believes, the first step toward a pogrom.

The “Just Folks” program sends Jewish youths from urban areas to farms in the heartland. Philip’s older brother Sandy ends up in Kentucky for a summer working on a tobacco farm and returns a convert to the mission of the OAA—the Organization of American Absorption. Then Alvin returns, having lost a leg in combat. Further exacerbating tensions in the family is that Philip’s aunt Evelyn goes to work for Rabbi Lionel Bengelsdorf, the head of the OAA office in New Jersey.

The Plot Against America is presented as a retrospective of a dark episode in American history that both reveals a psychic scar in the country’s collective conscience and ends as abruptly as it began. Roth’s youth during the events described and the nature of the conspiracy leave it unclear what happened to bring Lindbergh to office, let alone what happened while he was there that leads to a bloody climax.

The national and historical developments create the backdrop for what is, ultimately, a family drama. The Lindbergh administration works to break up Jewish enclaves in cities like Newark, and the Roth family is split between those who hold to their convictions, such as his father, those who want to ignore politics, and the collaborators, whether out of naked opportunism or youthful naivete. The characters are vividly drawn, frequently in the graphic detail and sharp colors of youthful memory. There are good gentiles in The Plot Against America, much as there are bad Jews. In both cases Roth captures something fundamental to and fundamentally fragile in the soul of America.

Although it was published in 2004, The Plot Against America was an eerie read in 2018, right down to a Scandinavian summit where an American president with a fervent base is openly condemned for fawning behavior toward a foreign leader, leading commentators to ask what that leader has on the president. Similarly, American prejudices are papered over by a tradition of constitutionalism, but only barely, and there is a preference for collective amnesia rather than for resolution.

The Plot Against America was hard to read, but rather than being a book that has lost its edge since its publication, it is one that has only become sharper. That is probably too lofty a standard to set for when I get to Roth’s other books, but I can now say with certainty that I am going to be reading more.

ΔΔΔ

Next up, I just started reading Brave New World. I read it in high school but remember nothing except a general sense of distaste. As with Fahrenheit 451, I want to give it a fair shake.

First day fragments

My fall semester begins in earnest today, with the first session for both of my classes. I don’t have a single back-to-school post idea, but rather a bunch of loosely connected ones, so I decided to go with a fragmentary format.

“I didn’t get everything done” is a standard lament for academics come late August, bemoaning some combination of the cult of productivity, human limitations, and the difficulties of researching during the school year. I am no exception. I set an ambitious schedule for reading scholarship beyond my immediate research, but only managed to read a handful of books and articles, and a couple of books on teaching.

There are a couple of explanations for this failure. One is that the summer quickly became very busy, with multiple family trips that had less down-time than anticipated, meaning that there was neither opportunity for reading nor for a deep recharge of my batteries. Another is that I taught an intensive summer World History course in June, so much of my spare reading went toward preparing for class. A third is that seemingly every spare moment around these time commitments was sucked up by revising my dissertation into a book. My goal for that was to have it under review by the start of class, but I missed that deadline, too. At least I am in a position to meet my revised goal of August 31 for that one…

ΔΔΔ

There has been a movement in recent years to normalize failure, particularly in academia, which led to people sharing their failures on Twitter over the last week. I mentioned there that I respect the movement, and I appreciate the baseball analogy: if you’re a batter and only “fail” (make an out) at the plate six out of every ten times, you belong in the Hall of Fame. (There are obviously other statistics from baseball that could make that more or less extreme. If you’re a pitcher and batters swing and miss just 20% of the time, you’re incredible, but if that is the percentage of the time you throw strikes, then you probably quit playing in little league.) I respect the impulse to normalize failure because it is inevitably going to happen, regardless of how generous and kind the academy becomes. Everyone is going to experience article/grant/abstract/job/proposal rejections for a host of reasons. Sometimes those reasons are good (the project needs more work), sometimes they are petty, and a lot of the time it is a simple numbers game that has almost nothing to do with what was proposed.

My shadow CV includes all of these: four article rejections, two more revise-and-resubmits that were later accepted, at least seven paper abstracts rejected that I can think of offhand, and too many funding applications for fellowships and travel grants to count. And I am only a little more than a year removed from graduating with my PhD.

At the same time, I found the push to normalize, share, and celebrate failure on social media hard to handle. The main reason is that while failure is normal in the academy, and rejections can be handled deftly with an eye toward improving the project for the next time around, it is also a sign of privilege to be able to reflect on this Shadow CV. It is coming from someone still “in the game”, as it were, and I heard with every round of shares “this is what you *should* have been applying for.” As in, your failures themselves are inadequate because the “stars” fail bigger and better.

Then pair this with the part I left out of my Shadow CV: all the jobs I’ve applied to without making the long list. The Shadow CV is meant to normalize failure so that people can better overcome the natural fear of it and thereby reduce anxiety, but when mixed with too few academic jobs to go around and the sheer amount of time that applying for them takes, it just exacerbated mine.

ΔΔΔ

I’m looking forward to teaching both of my classes this semester. One I am teaching from my own syllabus for the second time; the other I am teaching as the sole instructor for the first time. I had the chance to teach on my own a little bit during graduate school, but this is my second year of continuously teaching my own courses and reading up on pedagogy, so I am now trying to synthesize some principles for my classroom.

First Principle: Learning, not grades. I do not care about grades beyond making sure that I have created a reasonable and achievable grade scale for the class. My goal as a teacher is to help students develop practical skills such as writing and the ability to understand the world through critical analysis and synthesizing information. Toward that end, I believe that many common assessment tools that are built for scale are next to useless in actually assessing learning. I design my classes around assignments that require students to develop arguments through writing and that build on each other so that students can show improvement in tasks that are not easy.

Second Principle: Empathy. Students are adults who have a larger number of demands on them than even I did when entering school fifteen years ago. I aspire to treat them like adults with responsibilities, just one of which is my class. College is “the real world” where students are on their own for the first time, and I want to be a mentor/coach/guide. This means having empathy, and encouraging them to take ownership of their education by talking with me when they have a conflict or need help.

Third Principle: Engagement. “Meaningful learning experiences” is a hot topic, though my mother assures me that this has been the key phrase for many decades now. Every class is going to be selective in the material it covers, so I see my job as giving students the tools to learn more and piquing their curiosity so that they want to do so. This means developing activities and assignments that require engagement, through games, debates, and projects where students take ownership of the material. This has not been the easiest task for me as someone who found history books thrilling in high school, but it is something that I am committed to improving in my own teaching.

There are others, but these are my first three.

ΔΔΔ

Without further ado, let the semester begin!

1491 – Charles Mann

The companion to 1493, Mann’s other book named after a year in the late fifteenth century, 1491 is a history of the Western Hemisphere before the arrival of Europeans, reporting on the best consensus of recent scholarship. Although he dryly states at one point that his thesis is merely that this topic is worthy of more than seven pages, I think his argument is a good deal more sophisticated: namely, that despite the popular myth that the Americas consisted of vast stretches of unspoiled nature, these continents were in effect vast gardens that had been shaped by millions of native inhabitants.

As was also true of 1493, Mann should be lauded for his lucid explanation of long-standing academic schisms. One of the problems with a book of this sort, as Mann notes, is that there are times when there is no consensus, in part because the sources are, shall we say, speculative. For instance, the chapter “Pleistocene Wars” is dedicated to wars between scholars over what happened during the Pleistocene, rather than wars that took place then. This is the chapter Mann devotes to the peopling of the Americas, the so-called Clovis culture, and the possibility of multiple waves of migration. In this example, Mann delves into the controversies over dating the scattered bits of evidence, but in others he acknowledges more sinister problems with the evidence, such as how European colonists eliminated the knowledge bases of the cultures they encountered.

You will note that I have not mentioned a single specific native group. Mann goes through many, though certainly not all, in some detail, but the themes are the same again and again. Native Americans (the collective term I still reflexively use, though Mann has an appendix dedicated to the problems with it) were technologically, mathematically, and agriculturally sophisticated in ways that are not often appreciated by people accustomed to European land-use patterns and intellectual culture, or who are deceived by giving priority to the empirical evidence of native culture that dates to generations after European contact.

The hemisphere described by Mann was teeming with human life in 1491, so densely populated that the colonists found themselves unable to stay. Within a few decades most of those people were killed by European diseases, which allowed laughably small numbers of men to conquer enormous swathes of territory with the help of native allies, particularly in South America, and allowed previously controlled species like the bison and the passenger pigeon to undergo explosive population growth—ironically shooting past the carrying capacity only to become associated with the natural bounty of the Americas. Mann also offers a welcome correction to the noble savage myth that Native Americans were endowed with a preternatural connection to the land, arguing instead that their ability to steward the environment developed from past failures and a willingness to develop sustainable practices.

In sum, I enjoyed 1491 a hair more than 1493, but they work in tandem to ask and answer some big questions about the history of the world.

ΔΔΔ

I had never given any thought to reading Philip Roth’s books until hearing people talk about his work after he passed away this summer and thinking that they sounded up my alley. I’m just now starting that process, with his alternate history The Plot Against America.

Fahrenheit 451 – Ray Bradbury

Cram them full of noncombustible data, chock them so full of ‘facts’ they feel stuffed, but absolutely ‘brilliant’ with information. Then they’ll feel they’re thinking, they’ll get a sense of motion without moving. And they’ll be happy, because facts of that sort don’t change. Don’t give them any slippery stuff like philosophy or sociology to tie things up with. That way lies melancholy.

Once, years ago, I picked up this book, possibly to complete the triptych with 1984 and Brave New World. I found it painfully dull at the time and never finished, until now. (I only have vague memories of being bored by Brave New World, too, and should give it a fair shake outside of English class.)

Fahrenheit 451 is fundamentally the story of Guy Montag. Guy’s profession is “fireman”: his job is to burn contraband books in order to prevent the spread of illicit knowledge. Houses these days are fireproof, but books still burn, so the firemen simply turn on their kerosene-spewing hoses. “It was a pleasure to burn,” Guy thinks in the opening line.

But Guy has a crisis of faith that is prompted by two events. First, Guy meets his neighbor Clarisse on the way home from work. Clarisse, he thinks, is a little bit strange, and so is her family. She walks places, for instance, and looks at the stars and the moon, and her family sits on their porch and talks to one another, rather than surrounding themselves with the usual immersive video screens. Clarisse asks questions that make him think. Questions like “are you happy?”

The second strikes to the heart of things, when Guy discovers one night that his wife Mildred has gone through her usual routine of putting on her seashells (headphones), but has also consumed an entire bottle of sleeping pills, forcing him to call for medical aid to revive her. Instead of doctors, he gets technicians, who revive Mildred but also callously dismiss it as a plumbing problem. When she wakes, Mildred has no memory of what happened and returns to her stories.

These two things cause Guy to reevaluate his life and start to ask questions about the books he is sworn to burn. His crisis is kicked into overdrive when a woman decides that she is going to burn with her books. Despite the best efforts of Captain Beatty to rein in his man and Mildred’s horror at the changes in her husband, Guy becomes a pariah, an unlikely devotee of the written word, and slips into a conspiracy to revive book culture.

While Fahrenheit 451 didn’t stand out as one of my favorite books, there was a lot I liked about the world Bradbury dreamed up for it. This is a world where people are surrounded by screens, but instead of the screens watching you or being watched, they become an immersive experience that makes the viewer feel like part of the action. At the same time, Mildred seems to represent a facet of the existential emptiness that this “engagement” creates, particularly when juxtaposed with Clarisse’s habit of looking at the stars and talking with people in person. (I also appreciated that while Mildred and Clarisse represent a binary, almost allegorical choice between civilization and nature, Clarisse was never an object of sexual interest.) There was also a fascinating moment near the end of the book when Bradbury (perhaps unintentionally) opened the door to a return to oral culture. Memorization of individual texts was offered as a way to preserve knowledge legally, with the idea that each person has a text that they could then pass down to another generation until such time as books were legal again. But any student of oral tradition could tell you that there is a tension between the amazing longevity of oral knowledge and the fact that it is not a static text the way a book is. So my question is: what do these texts look like after multiple generations?

Perhaps I’m just being contrary, but I did have a beef, not with the book, but with the marketing. The key conceit in Fahrenheit 451 is that people need to be sedated, calmed by unimpeachable facts and seduced by immersive stories. There is a war about to happen, so perhaps there is a government mandate on these policies, but it comes across as self-policing since it is a book about the people who burn books and the people who snitch on those who read books. Any totalitarian apparatus is largely invisible. Moreover, we are told that the problem with books is that they make people melancholic, confused and troubled by the contradictory ideas. Is this censorship? Maybe, but I think there is a difference between cutting a single book or parts of a book for expressing ideas deemed inappropriate, and burning all books for having ideas, while filling minds with advertisements, immersive soap operas, and anodyne facts that are the facsimile of thinking.

In sum, I liked Fahrenheit 451 and understand what makes it a classic, but it spoke to me less as a broad critique of society and more as a critique of its own time, one that still resonates.

ΔΔΔ

Things are starting to pick up, with the semester starting next week and job ads beginning to come out, but I am determined to keep reading. Right now, I am in the middle of Charles Mann’s 1491, the companion to 1493.