Learning to Run Again

This morning I woke up before my alarm. I grabbed my phone to turn that alarm off and checked a few things before getting out of bed. Then I puttered around the house, reading a novel and stretching by turns for a little more than an hour, just long enough to steep and drink a big mug of tea.

Then I laced up my running shoes and set out.

My current bout of running came on about a month and a half ago. I have never been as serious or successful a runner as my father and brothers who for a number of years now have run marathons together, but this is not my first time running. In high school, I would go for runs with my father and ran a few local 5k races. Early in graduate school I tried running again. It was during this period that I reached my longest distances, running about five miles at least once a week and topping out at about eight miles before running into a leg injury. I tried a “run the year” challenge a few years ago and contributed 173 miles to my team’s total, including a few miles when I couldn’t sleep early in the morning while on a job interview. Then injuries. I tried again after the pandemic closed the gym where I exercised. My last attempt, shortly after moving last summer (and, in retrospect, after holding my foot on the accelerator of a moving truck for many hours), ended abruptly with sharp pain in my lower calf less than a quarter mile into a run.

I am a slow runner, particularly these days. I am also not running very far—just a little under two miles today. But this is okay. My focus right now is on form. On my gait, and trying to keep it in line with how I imagine I run barefoot, since I have suffered far more injuries while running in shoes than I ever did playing ultimate barefoot, which I did into my 30s. Correlation need not be causation, but so far, so good. I am running slow and careful, and celebrating each run for ending uninjured rather than for reaching a particular distance or speed. Those will come, but only if I can stay healthy.

I like the idea of running more than I actually like running. Rather, I would like to be someone who likes running, who achieves that runner's high, who runs an annual marathon. But I spend my runs thinking about how everything hurts and, recently, fretting about whether this footfall will be the one where something gives out and I have to start over. I can also only compete against myself while running, and pushing myself this way is exactly what I'm trying not to do.

By contrast, I used to play basketball for hours every week. My slowness didn't matter as much on a confined playing surface where I could change speeds and understand the space. And since I didn't like to lose, even in a silly pick-up game, I could just lose myself in the game and not think about what hurt.

And yet, running is what I have right now, so running is what I’m doing alongside a daily yoga routine.

My return to running also prompted me to finally pull Christopher McDougall's Born to Run off my to-read shelf. McDougall describes himself as a frequently injured runner, so I thought it might unlock the secret to running pain-free. In a way, it might have.

The centerpiece of Born to Run is a 2006 race in Copper Canyon in the Sierra Madre Mountains between a motley crew of American ultramarathon runners, including Scott Jurek, one of the best in the world at the time, and some of the best Rarámuri (Tarahumara) runners, arranged by a mysterious figure called Caballo Blanco (Micah True).

(The race went on to become an annual event, though its founder died in 2012.)

It is an incredible story. Rarámuri runners had made their appearance in ultramarathon circles at the Leadville 100, a high-altitude ultramarathon in Colorado, in 1993 and 1994. A guide and race director named Rick Fisher rolled up to the race with a team of Rarámuri for whom he was the self-appointed team manager. The Rarámuri runners won both years, setting a new course record in the second race, before deciding that putting up with Fisher's actions wasn't worth their participation.

(An article from 1996 in Ultrarunning about a race in Copper Canyon in which True also participated acknowledges Fisher's "antics," but suggests that they didn't end his relationship with the tribe.)

This story, however, is just the hook. Born to Run is an extended argument for a minimalist running style that exploded in popularity following its publication. McDougall's thesis is that modern running shoes, and the industry predicated on selling those shoes, cause us to run in ways that lead to injury. The argument is somewhat anecdotal, relying on personal experience and stories of incredible endurance from athletes before the advent of running shoes.

The Rarámuri, whose name means "The Running People," are exhibit A. The Rarámuri are a tribe that lives in isolated villages deep in the Sierra Madre Occidental, in the Mexican state of Chihuahua. The terrain makes long-distance travel a challenge, so the Rarámuri run. But they also run for ceremony and sport in a ball game called rarajipara, where teams work to kick a ball an agreed-upon distance, chasing it down after each kick. All the while, runners wear just traditional sandals called huaraches.

My own experience with running makes me sympathetic to McDougall’s argument, and I am seriously considering getting a pair of zero-drop shoes and transitioning in this direction for my footwear. However, the more I read about running injuries, the more it seems that the answers might be more idiosyncratic. That is, there is a lot of conflicting evidence. While some studies suggest physiological advantages to barefoot running, others point out that not all barefoot runners run with the same gait. A number of studies suggest that barefoot running has shifted the types of injuries (aided perhaps by people transitioning too quickly) rather than reducing them. I think that barefoot running could be good for me, but all of this makes me think that I shouldn’t ditch the running shoes for every run just yet.

While I was reading Born to Run, a friend suggested that I read Haruki Murakami’s What I Talk About When I Talk About Running, which connects my current focus on running with my ongoing obsession with writing.

In addition to being a novelist, Murakami is a marathoner and triathlete who describes how his goal is to run one marathon a year. This memoir is a collection of essays on the theme of running and training, and, unlike Born to Run, is not meant to be an argument for a particular type of training.

I think that one more condition for being a gentleman would be keeping quiet about what you do to stay healthy.

Nevertheless, I found What I Talk About When I Talk About Running to be particularly inspiring. Murakami is a more successful runner than I ever expect to be, even though I'm only three years older now than he was when he started running. And yet, I found something admirable about his approach. Running, like writing, is just something Murakami does, and he doesn't think about a whole lot when he is on the road. His goal in running is to run to the end of the course. That's it. He gets frustrated when he can't run as fast as he used to, but he is not running to beat the other people, and he uses the experience to turn inward.

And you start to recognize (or be resigned to the fact) that since your faults and deficiencies are well nigh infinite, you’d best figure out your good points and learn to get by with what you have.

But it should perhaps not come as a surprise that I highlighted more passages about writing than I did about running, though Murakami makes a case that there is broad overlap between a running temperament and a writing one. Both activities require long periods of isolation, and in both, success is not synonymous with "winning." Doing them is more important than being the best at them.

I don’t think we should judge the value of our lives by how efficient they are.

A useful reminder.

ΔΔΔ

I have had a hard time writing about books recently. Before these two books, I got bogged down in Olga Tokarczuk's The Books of Jacob, which I am still trying to process, and then read Ondjaki's The Transparent City, which is a very sad story about an impoverished community in Luanda, Angola. I would like to write about these, but I'm not sure that I have anything coherent to say, and June has turned out to be much busier than I had hoped—last week I was at AP Rating in Kansas City, then I wrote a conference paper that I delivered yesterday, and now I'm staring down a book deadline and other writing obligations. By the time I have time, I might be too far removed to come back to those books. I am now reading Christine Smallwood's The Life of the Mind, a novel about adjunct labor and miscarriage that highlights the lack of control in both situations.

The End of Burnout

Many authors tell people who already feel worn out and ineffectual that they can change their situation if they just try hard enough. What’s more, by making it individuals’ responsibility to deal with their own burnout, the advice leaves untouched the inhumane ethical and economic system that causes burnout in the first place. Our thinking is stuck because we don’t recognize how deeply burnout is embedded in our cultural values. Or else we’re afraid to admit it. Insofar as the system that works people to the point of burnout is profitable, the people who profit from it have little incentive to alter it. In an individualistic culture where work is a moral duty, it’s up to you to ensure you’re in good working order. And many workers who boast of their hustle embrace that duty, no matter the damage it does. In a perverse way, many of us love burnout culture. Deep down, we want to burn out.

I resemble this statement, and I don’t like it.

By the definitions established in Jonathan Malesic's recent book The End of Burnout, I have never burned out—at least not completely. I have never reached a point of absolute despair that rendered me incapable of going on, which, along with utter exhaustion and reduced performance, marks burnout. The other two, however…

I wouldn't say that I worked hard in high school, at least on the whole. There were projects that I worked at, and if something interested me I would work hard, but not so much overall. Midway through my undergraduate career something snapped. Seemingly overnight I became a dedicated, if not efficient, student. I divided everything in my world into "productive" activities and unproductive ones and aspired to spend my waking time being as productive as possible. School work obviously counted as productive, but so too did exercise and investing time in my relationships. Spending time not doing things was deemed unproductive.

At first this was innocuous enough. I was young, and productive time included fun things, right? My numerous and varied interests led me to do all sorts of things, and I was determined to do them all. By the time the second semester of senior year rolled around, this was almost a mania: I was working, running a club, taking a full course load, working on two research projects, and auditing extra classes that just looked interesting to me, as well as exercising and generally spending time on the aforementioned relationships.

At a time when the stereotypical college student develops a case of senioritis, going through the motions while looking forward to what was next, I somehow managed to define sleep as "not productive."

Seriously.

I cringe thinking about it now, but I went through most of a semester averaging about three hours of sleep a night. I don't think I ever pulled an all-nighter, but most nights I only got one or two hours: going to bed around midnight, getting up at 1:30 so I could grab coffee and food before the late-night place closed, working until the gym opened, exercising, showering, going to class, and then either doing homework or heading to my shift at work. I would get eight hours or so on Fridays after work and whatever recreational activities I had planned. Several people that I know of had conversations about when I was going to collapse, though not within earshot. It was bad. Trust me when I say that you shouldn't do this.

According to the journal I kept at the time, in an April entry titled "I guess I did need to sleep," I slept for 13 hours straight.

I have never done anything that self-destructive since, but there have been numerous times when I have edged in that direction.

  • The year after college I ended up working up to 90 hours a week, often for weeks at a time without a day off until I just couldn’t physically keep it up, at one point sleeping for more than 12 hours and forcing myself to take days off, even if the nature of the job made that difficult.
  • I worked almost 30 hours a week on top of my school responsibilities (a “full” course load and grading for a class) while completing my MA.
  • I nearly snapped while completing the work for one of the toughest seminars I took in grad school the week that I was also taking my comprehensive exams.
  • Another semester, while cobbling together jobs as an adjunct, I took on so much work (six classes, one of which was nearly twice as much work as I thought when I accepted it) that I had to stop writing entirely just to stay on top of the teaching.
  • The semester after that I developed (probably anxiety-induced) GERD and broke out in hives.
  • I frequently have to remind myself that taking one day off a week is okay, let alone two. At least I usually sleep 7–8 hours a night these days.

Lest it sound like I'm bragging, these are not badges of honor. They are symptoms of the perverse relationship with work that Malesic describes, wedded with ambition and an anxiety that oscillates between imposter syndrome and a deep-seated fear that I'll once again become someone who does nothing if I let up even a little. The worst part: my behavior took place within systems that celebrate discipline, but it was almost entirely self-inflicted.

However, I have never burned out like Jonathan Malesic.

Malesic had achieved his dream of becoming a tenured professor of religion and living the life filled with inspirational conversations with young people that he imagined his own college professors had lived. But that life wasn’t as great as he imagined. His students were apathetic, the papers uninspired and, at times, plagiarized. There were meetings and committees, and his wife lived in a different state. In short, the job didn’t live up to his expectations, which, in turn, caused his life to fall apart. His job performance lagged. He snapped at students. He drank too much and found himself incapable of getting out of bed. And so, eventually, he quit.

The End of Burnout is an exploration of the forces that caused his disillusion with his job and possible solutions to escape it. Put simply, Malesic’s thesis is that two features of the modern workplace cause “burnout.”

  1. People derive personal meaning and worth from their jobs.
  2. There is a gulf between the expectations and reality of those jobs.

That is, there is a broad expectation in the United States that your job determines your worth to society. This is obviously not true, but it is signaled in any number of ways, from making health insurance a benefit of employment, to looking down on "low status" jobs like food service, to the constant expectation that you ought to be seeking promotion or treating yourself like an entrepreneur. But if your worth is wrapped up in your job, then you might enter it with a certain set of expectations that are out of sync with the conditions—doctors who want to heal people and end up typing at a computer all day, or a professor who got into teaching because of Dead Poets Society and ends up teaching bored, hungover students in general education classes. On top of it all, the responsibility for "solving" the issue is then passed on to the worker: you're just not hustling hard enough. Have you tried self-care?

The End of Burnout is a thought-provoking book. Malesic examines the deep historical roots of phenomena that might today be called burnout, discusses the pathology of an ambiguous phenomenon that is likely overused, often pointing to acute exhaustion rather than true burnout, and explores how social pressures (e.g. the moral discourse that equates work with worth) exacerbate the phenomenon before turning to alternate models of work and human dignity.

I picked up The End of Burnout for a few reasons.

Most obvious, perhaps, is my toxic relationship with work, as outlined above, to the point where I thought that I had burned out on multiple occasions. Based on the descriptions Malesic provides, I was usually acutely exhausted rather than truly burned out, with the result that, at least so far, I have always been able to bounce back with a few weeks or months of rest.

(The one exception might be the restaurant work straight out of college, but even that did not stop me from working in another franchise in the same chain for two more years while attending school.)

Cumulative exhaustion can lead to burnout, but I came away unconvinced that I have even really been walking down that path. I have been frustrated, of course, and can tell that I am creeping toward exhaustion when I start excessively doom-scrolling on Twitter, but I did not relate to the sheer disillusionment Malesic described. When I have considered other employment options over the past few years, it has always been because of a dearth of jobs.

The main difference, at least to this point, is that I have never viewed this job through rose-colored glasses. Writing about history is something I see as a vocation, but I have approached the teaching and associated work as a job, albeit one that aligns with those other aspects of my life and thus is more enjoyable than some of the others I have had.

At the same time, I have noticed a shift in my relationship to hustle culture now that I am in my mid-30s. I still work hard and have certain ambitions, but increasingly they revolve around finding ways to spend my time reading and writing about things I find interesting and important—and having employment with enough security, money, and free time to do that.

Likewise, the idea of treating oneself as an entrepreneur, which Malesic identifies as an element connecting worth to employment, has always left a sour taste in my mouth. When people tell me that I could (or should) open a bakery, I usually shrug and make some polite noises. I have managed a restaurant in my life and have very little interest in doing so again. I bake because I like the process and enjoy cooking for people I like, not because I want to turn it into a business with all of the marketing, bookkeeping, and regulations that would entail.

(I have also considered trying to turn my writing into a subscription business, but I find that incompatible with the writing I do here. If I made a change, it would involve some sort of additional writing with a regular and established schedule—say, a monthly academic book review for a general readership with a small subscription fee designed to cover the cost of the book and hosting. A thought for another day.)

However, I also picked up The End of Burnout because I am worried about the effect that this culture has on my students. Nearly every semester I have one or more students who report losing motivation to do their work. This past semester one student explained it as a matter of existential dread about what he was going to do with his degree, but it could just as easily be anxiety or concern over climate change or the contemporary political culture or school shootings.

I have long suspected what Malesic argues, that burnout is systemic. In a college context, this is why I get frustrated every time a conversation about mental health on campus takes place without addressing those systemic factors. Focusing on the best practices and workload for an individual class is (relatively) easy, but it is much harder to account for how the courses the professor is teaching or the students are taking interact with each other. I am absolutely complicit in this problem. One of my goals for next academic year is to reexamine my courses, because the reality is that the most perfect slate of learning assessments is meaningless if the students end up burned out. I can't fix these issues on my own, but Malesic's book brought into greater focus why I need to be part of the solution, for my own sake and my students'. I don't ever want to let one of my students make the mistakes I did when I was their age (which probably explains why the most common piece of advice I give is "get some sleep"), and I can't help them if I am also in crisis.

The latter part of The End of Burnout turns to possible solutions. Perhaps unsurprisingly given his background as a professor of religion, this discussion frequently focuses on groups with a Christian bent. He spends a chapter, for instance, talking about how various Benedictine communities apply the Rule of St. Benedict to tame the "demon" of work. Some groups strictly follow the Rule, limiting work to three hours so that they can dedicate the rest of their lives to what really matters: prayer. Other groups, like several in Minnesota, were less rigid, but nevertheless used similar principles to divorce work from worth and to allow one's service to the larger community to change with time.

The other chapter in this section was more varied, and included useful discussion from disability activists, but it also featured a prominent profile of CitySquare, a faith-based Dallas non-profit with uniquely humane policies around work expectations and support for its staff. These examples sat awkwardly with my agnostic world view, as someone who believes that we should be able to create a better society without religion, and particularly without Christianity. However, Malesic's underlying point is not that we ought to all follow the Rule of St. Benedict. Rather, he makes a case that each profile, in its own way, can help imagine a culture where the value of a person is not derived from their paycheck (or grade).

To overcome burnout, we have to get rid of the [destructive ideal of working to the point of martyrdom] and create a new shared vision of how work fits into a life well lived. That vision will replace the work ethic’s old, discredited promise. It will make dignity universal, not contingent on paid labor. It will put compassion for self and others ahead of productivity. And it will affirm that we find our highest purpose in leisure, not work.

Malesic's vision here is decidedly utopian and hardly new, and his warnings about the consequences of the automating workplace are a modern echo of 19th-century choruses. But the ideals he presents are worth aspiring to nonetheless. As long as we work within a depersonalizing, extractive system that treats people as interchangeable expenses against the company's bottom line, that system will not only continue to grind people down and spit them out, but also contribute to nasty practices elsewhere in society, like treating food service workers with contempt. Severing the connection between personal worth and paid work won't solve every problem, but it is a good place to start.

Lost & Found

Memoir is a genre that I mostly avoid. If one were to ask why I only read one or two memoirs a year, I would wave generally at the idea that the intimate details of someone's life are not really of interest to me, but the reality is that I almost always enjoy the handful that I do read. The truth is that I enjoy process stories, so memoirs like Anthony Bourdain's Kitchen Confidential, David Chang's Eat a Peach, and Dessa's My Own Devices are very much my thing, and I can appreciate good writing about heritage or society, as in Daniel Mendelsohn's The Lost and Ta-Nehisi Coates' Between the World and Me.

Maybe I just don’t like the idea of the genre. Clearly, the aversion isn’t borne out in practice.

When I first came across Kathryn Schulz's Lost & Found through Keith Law's podcast, I hesitated. I knew Schulz could write—she has a Pulitzer for her feature on the risk of a catastrophic earthquake in the Pacific Northwest—but the subject of her memoir, losing her father and falling in love, seemed to follow all of the stereotypes of the genre that leave me cold when I look at lists of memoirs that critics deem "the best."

Then Schulz started talking about her father. Before the podcast had finished, I had acquired Lost & Found as an ebook from the library.

Lost & Found is divided into three thematic sections that unfold in loose chronological order: Lost, Found, and, as one might expect from the title, &.

We lose things because we are flawed, because we are human, because we have things to lose.

Lost is, predictably, a story of loss. But it is also a story imbued with the deep love of family. This opening section is about her father, a Jewish immigrant who moved from Łódź to Tel Aviv and then to the United States. Schulz describes her father as an erudite, intelligent man with an insatiable curiosity about the world and the people who live in it. He was also someone who habitually lost his wallet and other simple objects. Her love for him radiates from the page as she weaves his story with the heartbreak of losing him and meditations on the existential imperative of loss.

Of all the things that can make finding something difficult—false positives, false negatives, moving targets, incorrect search areas, lack of resources, the vagaries of chance, the general immensity of the world—one of the thorniest is this: sometimes, we don’t really know what we’re looking for.

Found, a natural complement to Lost, is a story about falling in love, written as though it is a meet-cute.

The first meeting took place in the Hudson River Valley where Schulz was living, alone, when friends introduced them. C lived hours to the south and stopped by when she passed through on a separate trip. The first date stretched into hours. The second lasted even longer. Schulz says that she had already decided to marry C, despite their differences.

Found is both saccharine and overflowing with joy. Schulz fills these pages with the exhilaration of falling in love—the long dates, the thrill of discovering an unexpected shared love (country music, in this case), the dawning realization that you don’t want to spend your life with anyone else. And, of course, learning how to communicate in a relationship with another person who is, by the nature of existence, different from yourself.

All of this is also made all the more profound given that it happens concurrently with the loss of her father.

The astonishment is all in the being here.

& is the story of being, about joining lives and the choices that get made along the way. It starts with a discussion of the meteor that created the Chesapeake Bay and the Delmarva Peninsula where C grew up, in a life that could hardly be more different from Schulz's own upbringing in suburban Ohio. It reaches its climax at a wedding in the same region.

Some published reviews of Lost & Found remark that Schulz's proclivity for wonderment borders on the tedious, but it worked for me. Schulz is not impervious to the crushing weight of contemporary events—she describes the concern that C's extended family might object to their marriage at the wedding, for instance—but she fills page after page of Lost & Found with reminders to seek joy in being because loss is a certainty.

ΔΔΔ

I am expecting that I will write more of these posts (along with a number of other posts) now that the semester is coming to an end. Most recently, I finished Yrsa Sigurdardóttir’s The Silence of the Sea, an Icelandic thriller. I am now reading Babylon’s Ashes—the sixth book in the series—which means that I am likely going to finish the last three books this summer.

Smashing Statues

I have a draft blog post from a few years ago where I tried to grapple with my thoughts about monumental statues. The thrust of the post explored how statues are neither mere art nor monuments imbued with an immutable meaning. Rather, they are objects of memory and part of a dynamic process by which history, culture, and commemoration are woven. Their meaning emerges from decisions about what ought to be commemorated and how, so what gets evoked will not only change over time, but will also vary from person to person. They are always contested.

The post worked toward a discussion of the Emancipation Memorial, created by Thomas Ball and erected in Washington DC in 1876, with a copy (formerly) in Boston.

I hate this monument.

Emancipation is a wonderful thing to commemorate, of course, but this is also a monument that shows Abraham Lincoln with one hand holding his proclamation, which rests on symbols of federal authority like the fasces, and the other held beneficently over the back of a barely-clothed black man. Abe towers above, looking on placidly.

Emancipation Memorial, Lincoln Park Washington DC (Wikimedia Commons)

I never finished that post, obviously. I was writing a collection of loosely-connected thoughts and I ran out of steam.

Erin Thompson’s Smashing Statues: The Rise and Fall of America’s Public Monuments (Norton 2022) makes the argument I was trying to articulate in that post, only, you know, better.

Smashing Statues consists of two parts—four chapters on monuments going up, and four chapters on them coming down—all of which build from a simple premise:

American monuments were built to show us our place within national hierarchies of power. Regardless of our race, they tell us to sacrifice ourselves to the interests of those more powerful than us.

Thompson organizes each chapter around the story of one monument or type of monument as a way of exploring the disconnect between how they went up and the authority and reverence with which they are sometimes received.

For instance, Chapter 3 ("Shafts") unpacks the history of Civil War monuments, showing, among other things, that the most common type depicts a common soldier at parade rest. That is, a monument that celebrates not the soldiers who died or the sacrifices of the living, but the obedience of the soldiers who fought for the cause.

My favorite chapter in the book, and one I'm considering assigning in class in the fall, is "A Shrine for the South," which details the creation of Stone Mountain in Georgia in the 1920s. This site was intended to be the shrine described in the title, with a ghostly army of Confederate heroes riding along the mountain face. Thompson starts the chapter with the story of how the sculptor Gutzon Borglum took an ax to the model head of Robert E. Lee, declaring that the project was a scam by the KKK to siphon off funds—before revealing that he was a Klansman upset that he was being cut out of the profits.

To my mind, this chapter put on display all of the fissures involved in these monuments. Borglum joined the Klan, but he lived in Connecticut and was so enamored of Lincoln that he named his son after the dead president. The project relied on the Lost Cause mythology, but it also grew continuously because this was the surest way to secure additional funding. And, of course, Thompson concludes with a discussion of the project Borglum moved to from Stone Mountain: Mount Rushmore.

Smashing Statues is a quick read. Thompson is an art history professor with a special interest in the destruction of cultural heritage, and this book is based on her numerous articles on the topic of monuments since the summer of 2020. But it is no worse for the sense that it is a series of interconnected essays. The core message comes through like a clarion call, and not a moment too soon.

Traditional monuments put heroes on pedestals to tell us our troubles are over. They are our nation's selfies—perfectly posed and cropped to show only our best angles. They cover up complications and give a too-rosy view of the past and the future. We need debates, not pieties. We need to question our past in order to remake our future. If monuments try to keep us where we are by holding up examples of impossibly perfect people, well, maybe we don't need them at all.

Addendum: in the few minutes since this post went up, a digital friend brought to my attention how these same tactics have been weaponized in Latin America. Smashing Statues is fundamentally oriented toward the politics of commemoration in the United States, where most of these statues uphold the traditional political order. But in the sense that statues are not value-neutral, the instinct to tear down can be weaponized against monuments looking to establish a pluralistic vision of the future, in much the same way that "heritage" can be used as a rallying cry to preserve those that enshrine the existing political order. Attacks that symbolically lynch the commemorated subject not only assert a political order, but also serve to intimidate the communities that would dare erect the monuments in the first place. This is because, as Thompson argues throughout Smashing Statues, monuments serve as an arena that reflects political debates in society writ large.

ΔΔΔ

I am hoping to write more in this space now that my semester is drawing to a close. This will likely include some book posts like this one where I give a few thoughts, if not a full review. Since the last one of these went up, I have also finished Michael Twitty’s The Cooking Gene, Tasha Suri’s The Jasmine Throne, and Kathryn Schulz’s Lost & Found, and I am now reading Yrsa Sigurdardóttir’s The Silence of the Sea.

Bring the War Home: The White Power Movement and Paramilitary America

Black and white image of the cover of Kathleen Belew’s Bring the War Home.

On January 6, 2021, a crowd of people stormed the US Capitol Building in order to stop the certification of the electoral votes that made Joe Biden president. This was the result of actions meant to undermine faith in the election and of polarization heightened by the present media ecosystem, but it was also the culmination of decades of growing extremism among white nationalist and anti-government militia movements. That growth is the subject of Kathleen Belew's Bring the War Home: The White Power Movement and Paramilitary America.

While there has been a pronounced strain of separatism in the United States for as long as there has been a United States, Belew identifies the origins of the modern iteration in the resolution of the Vietnam War in the 1970s. White power was at the heart of the militia movement from its inception, but she argues that the perceived betrayal in Vietnam prompted a very specific metastasis beyond bog-standard racism. It prompted people like Louis Beam to form militia groups with the stated intent of continuing the war. Naturally, they found common cause with groups like the Knights of the Ku Klux Klan that David Duke founded in 1975.

In these early years, the militia movement claimed to be fighting against insidious forces and on behalf of the United States. They were soldiers taking the war into their own hands. However, Belew traces how this resentment and frustration transformed over the course of the 1980s until their orientation had turned 180 degrees. By the start of the 1990s, militia groups operating around the country, and not merely at places like Ruby Ridge, saw themselves as soldiers in a war on behalf of white people against the United States, which they referred to as the Zionist Occupation Government. She concludes with a chapter on Timothy McVeigh and his terrorist attack in Oklahoma City on April 19, 1995, though that incident clearly did not put an end to the movements Belew documents.

At this point, I feel like I need to offer a caveat. I finished Bring the War Home a month ago, and while I take copious notes on the books I read for "work," I take only haphazard notes on books that I read for "fun." This book technically falls in the latter category, even though parts of it will undoubtedly make their way into my US history classes. I meant to write this post within a day or two of finishing the book, but it turns out that writing here is a lower priority than, say, my classes or work on academic publications. All of this is to say that the following analysis is going to be more a reflection on what I saw as a couple of key themes and less an actual review.

The first thing that stood out to me in Bring the War Home was how Belew traces multiple loosely-connected organizations joined by a common sense of purpose and, sometimes, marriage. The various groups saw themselves as part of the same conflict, and Belew shows how they used the early internet to support one another, but the absence of a hierarchy meant that quashing one did nothing to slow the spread of the movement. In fact, efforts by the federal government to address the militia movement in places like Ruby Ridge only galvanized other cells and sympathizers. This part of the book sometimes meant trying to keep track of a web of names, but it effectively highlighted the challenge of addressing the militia movement.

Second, perhaps the most striking chapter in Bring the War Home was "Race War and White Women." In this chapter, Belew shows how white women were of central importance to the militia movement. That is, militia members claimed to be defending the virtue of vulnerable white women who, in turn, were expected to bear white children. These vulnerable white women were both an abstract ideal, rather like the love interests in D.W. Griffith's Birth of a Nation, and people who played a concrete role in spreading militia ideas. In the case of the Fort Smith sedition trial in 1988, which ended with the jury rendering a not guilty verdict, two of the white women on the jury subsequently entered into public relationships with defendants.

(One of the key witnesses in that trial went on to murder three people at Jewish centers in Overland Park, Kansas in 2014.)

Bring the War Home is a terrifying book in many ways. It brings into focus a strain of extremism in the United States that has been steadily growing in prominence over the past few decades. This movement coalesced around racism, anti-semitism, and Christian identitarianism, took advantage of new forms of media, and, as Belew put it on the first anniversary of January 6, ruthlessly seizes any opportunity. And yet, while these militia movements have themselves shed blood in their war against ZOG and fully intend to do so again, I can't help but feel that their presence reveals a bigger and more insidious danger. The militia movement emerged from a specific knot of beliefs, but its growth and evolution stems in no small part from how many people not directly affiliated with any tentacle of the movement express sympathy for its positions. That is, the militia movement won't win its war through force of arms, but through a steady campaign of radicalization that plays on preexisting prejudices. The fact that their ideas can be found elevated to nearly every level of government demonstrates that it is working.

ΔΔΔ

Crunch time on getting my book together meant giving almost all of my spare time to that, but I have still been reading a little bit every day because it helps me feel normal. Since my last one of these posts I finished Trevor Strunk's Story Mode, a literary analysis of video games that had some interesting things to say about the evolution of the medium; Sofia Samatar's A Stranger in Olondria, which has rich descriptions of place and a clever story structure but which I ultimately found disappointing in its characters and plotting; James S.A. Corey's Nemesis Games (Expanse, book 5); and S.A. Chakraborty's Empire of Gold. I intend to write about the latter two series at some point. Currently, I am reading Tasha Suri's The Jasmine Throne.

On Revision

Most drafts contain wonderful things, and most drafts don’t show off those wonders effectively. Some drafts are dull. Some are poorly organized. Some aren’t sure who they’re written for. Some seem unclear about the distinction between dutiful summary and original insight. Some hope that writing pyrotechnics might dazzle or sheer bulk equate to authority.

An open secret: it’s OK to be scared by the responsibilities of writing and revising, at least sometimes. Many ideas fizzle, either because the writer can’t concentrate on them long enough to blow a spark into a flame, or because the idea itself doesn’t have the strength to become more than a hunch. So let’s work with the anxiety.

I started getting serious about writing in the course of writing my dissertation. This is not to say I paid no attention to the craft of writing before that point. I have been an avid reader most of my life, which has given me a decent ear for good prose, and I always aimed to produce good work, but I also generally distinguished between the history on the one side and the writing on the other. I spent hours in coffee shops polishing my MA thesis—I even got a compliment on the writing from one of my committee members for my trouble—but I was nevertheless committed to the idea that I was not a good writer. 

Sometime during the process of producing my dissertation, an unwieldy monstrosity that received no plaudits for style, I came to appreciate a closer connection between the historical research and the process of articulating the arguments. I started to read books on academic writing and started to integrate writing into how I teach history.

And yet, I never picked up a book by William Germano, one of the doyens in the field of academic writing whose From Dissertation to Book is a standard text for grad students looking to publish their first book. After reading his latest book, On Revision, I might have to return to that text even if I am nearly finished with the eponymous process.

On Revision is, in one sense, an entirely redundant book on writing. Any book in the genre worth its price will repeatedly point out to the reader that writing a bad first draft means that you now have a piece of text to improve. And yet, this can be a difficult lesson to learn. For this reason, Germano’s book represents an attempt at shifting the entire mindset: revision not as a necessary part of a larger process, but revision as the only part of writing that really matters. 

Germano establishes what he means by revision early on:

Correction is not revising. There’s no bigger misunderstanding about how writing gets to be better. Correcting is small, local, instant….It’s easy to confuse fixing errors with revising ideas and reconfiguring the shape of the text.

In the sense that I also aim to teach writing to my students, this was a welcome disambiguation. I often idly correct grammar and punctuation while grading papers because I do think these are important things for students to become aware of (and because I have this recurring fear that someone will review a book I write by just listing the myriad typos), but I also point out that not all of my comments are created equal. Mechanical corrections are fine, but I am much more interested in how students revise their ideas and arguments. The question I keep coming back to is how to convey this necessary process to my students within the strictures of an academic calendar. On Revision can't help me with the structural parts of my courses, but it has given me food for thought in terms of how I articulate revision to my students.

On Revision opens with a short introduction and a chapter (“Good to Better”) that makes a case for revision generally and offers nine principles to get started. From there, Germano investigates four essential rules for revision that put those principles into action.

Germano’s first rule is simply to “know what you’ve got.” This might sound tediously banal, but in order to revise a piece of writing, you need to know what you are writing toward. This means carefully reading what you have written and taking stock of what it is you are trying to do with the piece.

In one of my classes this semester, I ran an activity where the students reviewed something I have been working on for a while now. I like the argument, but it has a fatal flaw as it is currently constructed: I don’t know what it is. This was a piece that started as a draft blog post before becoming a possible conference paper, and then an article that might work for a video game journal or a classical reception journal, before finally becoming a public-facing article. This circuitous route is in part because I don’t know what I have other than perhaps a point that missed its period of relevance. As I explained to my students, this means that I have a lot of revision ahead of me.

The second rule is to look for and highlight your argument—or, as I tell my students, to make clear what you are trying to prove. I couldn't help but laugh when Germano declared, "A lot of academics…stop at simply indicating aboutness. 'My book is about economic inequality.' That's not an argument."

I laughed because this is very similar to a mini-lesson on thesis statements that I gave to each of my classes this year after my first round of papers came back with a very five-paragraph type of non-thesis that restates the prompt with three sub-topics loosely related to the topic.

Stating the topic of an essay is easy. Articulating your argument compellingly and concisely is hard, if for no other reason than that it requires you to take ownership of what you are saying. Trust me, it took me forever to find a way to explain the argument of my dissertation (now book) project without rambling incoherently. Even now I only do so with any amount of success about 75% of the time and have only done it perfectly two or three times. I hope one of those is in the manuscript itself.

Germano's third rule is about revising with an eye toward the architecture of a piece. That is, thinking about the order of the information and the internal coherence of the argument. Thinking in these terms, I have discovered that I have a particular affection for ring structure within my work, often opening with some anecdote that illustrates the argument I am trying to make and that I can call back to in the conclusion.

Finally, Germano calls on his readers to attend to their audience. If you are asking readers to give you their time (and often money!), then they are going to expect your attention in return.

In each rule, Germano offers illustrative examples and, usually, helpful exercises to perform on your writing. My favorite, from the rule about architecture, echoes a piece of advice I have been giving my students for years. He calls it "The Writing W," based on the constellation Cassiopeia, or "The Wain." The constellation has five stars that look loosely like a W. Following this path, the writer has something to do at each stop. First, write your opening move, then write the conclusion. Then you fill in the gaps between the two with everything you might need to support the argument and lead to the conclusion. Then you write the conclusion again, adjusting based on the evidence. Finally, re-write the opening paragraph.

I don't teach comp, so my exercise is less formulaic, but it follows a similar principle: the introduction should be the last thing you write. It can also be the first, and I am certainly the sort of writer who likes working through an idea from beginning to end except on exceptionally long pieces, but I preach to my students that the process of writing a paper will often change your ideas about your topic, so you should be prepared to adjust what you wrote accordingly.

On Revision is a hard book to write about succinctly. It is filled with principles, techniques, and encouragement, and while I am hard-pressed to come up with anything that I didn't already know or do, its virtue is in how it articulates this essential process. After one read-through, my copy is filled with post-it notes drawing my attention back to individual passages or ideas, and that alone speaks to its value. But, beyond that, Germano's authorial voice is that of a compassionate mentor who wants to see your work become the best it can be. I might hate reading my own writing, but he is here to say:

It’s OK not to reread one’s work when it’s done done, but revision is the crucially important process by which you get your work to that point.

ΔΔΔ

I am way behind on my intended posts right now, but I have continued reading apace. Ruth Ozeki's The Book of Form and Emptiness is as beautiful and traumatic as her A Tale for the Time Being, which is one of my all-time favorite novels, though maybe just a little bit behind it in my personal estimation. I also recently finished Ayse Kulin's The Last Train to Istanbul, which is based on real accounts of Turkish diplomats trying to save Jewish Turks (and non-Turks) from the Holocaust. I didn't think it worked perfectly as a novel, but I want to know more about the history. I also read Brief Lives, the seventh installment in Neil Gaiman's The Sandman, and am now reading the second volume in The Expanse series, Caliban's War.

Two Takes on Social Media

The algorithm that serves as Facebook’s beating heart is too powerful and too lucrative. And the platform is built upon a fundamental, possibly irreconcilable dichotomy: its purported mission to advance society by connecting people while also profiting off them. It is Facebook’s dilemma and its ugly truth.

I joined Facebook in 2004, in my freshman year of college, deleted that account in 2012, and then rejoined the Facebook orbit with an Instagram account a few years later. (I dislike Facebook, but Instagram preserves the parts I liked without most of the noise, and it lies behind my growing interest in photography.) Along the way I picked up and discarded a variety of other social media accounts, most notably Twitter.

In short, my entire adult life has coincided with the era of social media.

2021 has been the year when social media finally made its way into my reading, starting with Fake Accounts earlier this year. Recently I added to this theme two more books published this year, Tahmima Anam's The Startup Wife and Sheera Frenkel and Cecilia Kang's An Ugly Truth.

I read the fiction first.

The Startup Wife is a send-up of start-up culture. Asha Ray is a brilliant coder working on a PhD on neural networks that seems to be going nowhere when she reconnects with Cyrus, the boy she had a crush on in high school. For his part, Cyrus is different. He spends his time wandering, reading, and absorbing ideas while living with a friend, Jules, who has a trust fund. Yet people gravitate to Cyrus, who creates unique rituals for them. Asha likewise finds herself in Cyrus's orbit, as well as his bed.

Soon, Asha drops her PhD to begin coding a new project: an algorithm that will harness Cyrus's preternatural gift for ritual. With Cyrus's mind, Jules's money, and Asha's code, the three found WAI (pronounced "why"), which stands for "We Are Infinite," and get inducted into a startup incubator, Utopia, that is preparing for the end of the world. As WAI begins to catch on, Asha faces the personal and professional challenges that come with managing a start-up—everything from how to monetize the platform without selling out to being forced to share her husband with everyone on it.

Tahmima Anam writes from the experience of her husband's start-up company, lending believability to the steps taken to seek capital, even when the specific details of the meetings are absurd. Likewise, this background infuses the story with the frustrations of a woman who has had the distinct displeasure of hearing how women get talked about in the startup world and of being overlooked in board meetings.

The post-IPO wife is the butt of many of our jokes. We’d been tetchy when that first lawyer brought it up (Your odds aren’t good!), but now that Cyrus knows more of these people, we realize Barry wasn’t singling us out, because divorce after great success is actually a trend. Not a dirty little secret but like a totally sanctioned and okay thing that men do once they hit the big time.

The personal side of The Startup Wife—Asha’s marriage and her frustrations with startup culture—provide both the comedy and the emotional resonance of the book. The WAI algorithm, by contrast, provides the depth. The premise of the site is simple:

We have devised a way of getting people to form connections with others on the basis of what gives their life meaning, instead of what they like or don’t like.

The founders of WAI are all generally well-intentioned, but what does it mean to do no evil? Obviously this precludes physical harm and predatory behavior, but does it extend to keeping the platform free? What about keeping profiles active after the owner dies? How much editorial control should Asha and the team exert over the community?

Ultimately, The Startup Wife is better at raising questions than answering them, but it nevertheless offers a romp through this world that is troubling and funny in equal parts. An Ugly Truth, by contrast, is just troubling.

Frenkel and Kang lay out thousands of hours of reporting in this new exposé of Facebook that tracks the last decade of its existence. The story opens with Facebook cresting a wave in 2012—ironically, about the time I deleted my account. Sheryl Sandberg had joined the board and was successfully monetizing Facebook's algorithm. Facebook still touted its utopian vision for society, but amid the obsession with growth lay the seeds of something darker—questions particularly about speech, given that Facebook's algorithm capitalized on engagement and amplified anything that received an emotional response.

Facebook technically barred hate speech, but the company’s definition of what constituted it was ever evolving. What it took action on differed within nations, in compliance with local laws. There were universal definitions for banned content on child pornography and on violent content. But hate speech was specific not just to countries but to cultures.

By the 2016 election, Facebook had hit a crossroads. Zuckerberg and his inner circle resolved to be scrupulously impartial in order to counteract accusations that they were partisan when, in truth, growth and engagement were the guiding stars. Partisanship was good for business, but it also led to discontent in the ranks, among some staff who saw the site as stoking divisions and others who were ostensibly hired for security but then sidelined. Around the same time, rumblings started in Congress about regulation.

Zuckerberg responded to criticism by reaffirming his faith in Facebook's ability to regulate itself with algorithms and by circling the wagons. Instagram and WhatsApp were integrated into Facebook to make them harder to spin off, and Facebook proper doubled down on privacy and private groups. According to the people Frenkel and Kang interviewed, the latter was a particular problem not only because it led to the rampant growth of conspiracy theory groups, but also because Facebook's transparency was the very feature that had allowed the site to help root out child pornographers.

Research had shown that people who joined many groups were more likely to spend more time on Facebook, and Zuckerberg had hailed groups as the type of private, living room chat he thought his users wanted to see more of. But he was growing disturbed by the number of people joining groups dedicated to conspiracy theories or fringe political movements, rather than the hiking clubs and parenting communities he had envisioned.

Facebook has nearly three billion monthly users and enormous amounts of influence. In An Ugly Truth, Frenkel and Kang argue that Facebook's naive optimism that the truth will win out over misinformation belies how social responsibility is incompatible with the mandates of growth and profit. In other words, An Ugly Truth is the answer to the questions raised in The Startup Wife.

ΔΔΔ

I recently finished reading Nicholas P. Money's The Rise of Yeast (I hoped to glean information about beer and bread, but Money was more interested in the structure of yeast and in biofuel, perhaps because he is a biochemist), as well as Leviathan Wakes, the first of The Expanse books. As a fan of the TV series, I am stewing over why I didn't react as negatively going from TV to book as I usually do going from book to series. I am now reading Kazuo Ishiguro's Never Let Me Go.

The Anatomy of Fascism

The cover of Robert O. Paxton's The Anatomy of Fascism

In the introduction to The Anatomy of Fascism, Robert O. Paxton notes that most scholarship on fascism remains narrowly focused on individual fascist movements. But where these studies offer excellent insight into Mussolini's Italy or Hitler's Germany, they don't offer a better understanding of fascism as a particularly 20th-century political phenomenon. This book, he says, is an attempt to bring those insights together in one comprehensive examination of fascism — the movements headed by Mussolini and Hitler, yes, since those were the two most successful examples, but also those in Hungary, Spain, and, yes, the United States.

So what is fascism? Paxton organizes the book roughly to follow the life-cycle of a fascist movement, from how it begins and takes root to how it exercises power and collapses, but he defers a succinct definition until the final chapter.

It is not the particular themes of Nazism or Italian Fascism that define the nature of the fascist phenomenon, but their function. Fascisms seek out in each national culture those themes that are best capable of mobilizing a mass movement of regeneration, unification, and purity, directed against liberal individualism and constitutionalism and against Leftist class struggle.

“Fascism” has its roots in the Italian “fascio” (bundle or sheaf) and can be traced to the Latin “fasces,” an axe bound within a bundle of rods carried by Roman lictors (guards who accompanied magistrates) that represented both the violence and the restrained violence of the Roman republic. In fact, Paxton notes, the republican symbolism was so important that the first movements to use “fascio” for their militant bands were leftist ones seeking to restrain the oppression of the aristocracy and the church. However, in 1919, a new movement in Milan led (at least in part) by a journalist and former soldier named Benito Mussolini adopted the name “Fasci di Combattimento” and declared war on the socialists, whom they blamed for the country’s problems. Thus was born the first named fascist movement in the modern sense.

Paxton frequently reminds his readers that each fascist movement conforms to its native conditions, but there are nevertheless repeated characteristics and preconditions. In each case, fascist organizations were right-wing movements born at times when the country was (or was thought to be) in decline. These movements, like the two most famous in Germany and Italy, took advantage of the apparent crisis to stoke popular outrage with appeals to nationalism and former glory, thereby further destabilizing the country and presenting themselves as the only path to stability and prosperity.

Where they succeeded, it was because mainstream conservative elites bestowed political legitimacy on them in the name of thwarting their socialist and leftist opponents during times of economic crisis. Thus, Mussolini’s fabled march on Rome might have been a fatal mistake except that King Victor Emmanuel III refused to empower the Prime Minister to stop him. (Victor Emmanuel would ultimately also depose Mussolini toward the end of World War 2.) The German example is somewhat more commonly known: Hitler won just enough political support to have leverage in his negotiations with the Weimar elite, ultimately getting appointed Chancellor with Franz von Papen, a prominent Weimar politician, as vice-Chancellor—only for the combination of President Paul von Hindenburg’s death and the crisis of the Reichstag Fire to remove the restraints on Hitler’s authority.

Although fascist states often get a reputation for being efficient systems — Mussolini made the trains run on time; Thomas the Tank Engine is a fascist utopia; etc. — Paxton shows that this is a mirage. In fact, fascist states amounted to an amalgam of power struggles: between the leader, whose personal charisma was essential for the party’s rise to power, and the rest of the party; between the party and the civil service (a tension largely defused by giving the civil service autonomy to continue its work); and between the party’s goals and those of its non-fascist allies.

Other than the varied origins of the fascist movements, the most interesting part of The Anatomy of Fascism to me was its end-point. Paxton identifies two possible outcomes for a fascist movement: radicalization or dissolution into generic authoritarianism. The extreme promises made during the rise to power preclude “comfortable enjoyment of power.” In one scenario, the fascist movement runs out of steam, but members of the party keep hold of the levers of power as run-of-the-mill authoritarians—the difference being that a fascist movement specifically appeals to the emotions of a broad segment of the population in order to fuel its rise to power. At the other extreme, the movement becomes ever more radical in pursuit of its promises until the situation dramatically changes, as with the Holocaust and World War 2.

Reading The Anatomy of Fascism in the United States in 2021, the obvious question is what it might say about modern political developments and, in particular, the presidency of Donald Trump. Paxton is absolutely clear that the United States has had fascist movements in the past, and not just America First and the other Nazi sympathizers of the 1930s. However, he confidently states that, as of 2004, the United States had resisted making them mainstream:

Much more dangerous are movements that employ authentically American themes in ways that resemble fascism functionally…Of course the United States would have to suffer catastrophic setbacks and polarization for these fringe groups to find powerful allies and enter the mainstream. I half expected to see emerge after 1968 a movement of national reunification, regeneration, and purification directed against hirsute antiwar protesters, black radicals, and “degenerate” artists…Fortunately I was wrong (so far).

I am still mulling over a lot of these questions in light of what Paxton wrote, but I have four broad thoughts at this point:

1. I was not wholly convinced by Paxton’s treatment of Fascist and pseudo-Fascist movements in the United States. He gestures to a long tradition of nativist agitation, including the 1850s Know-Nothing Party and iterations of the KKK as evidence for its presence, but concludes that these groups never truly went mainstream. Setting aside that the KKK went through several discrete iterations, Paxton doesn’t account for the fact that these ideas did go mainstream, even without direct fascist agitation. Perhaps the widespread support of these ideas in the form of Jim Crow legislation and immigration controls disarmed them as fascist talking points, but that’s worse.

2. The idea that the United States could succumb to a fascist dictatorship has been the premise of novels since at least 1935, when Sinclair Lewis published It Can’t Happen Here. More recently, Philip Roth wrote The Plot Against America, which David Simon turned into an HBO series that I wrote about favorably here. Though my thinking about The Plot Against America isn’t as positive now as it was in that write-up, I do think Lewis and Roth are correct about one thing in particular. My fear is that the American two-party system makes the country, if anything, more vulnerable to fascism than a decentralized European parliamentary system. In the latter, bringing fascists into the mainstream required building alliances; the former offers one of the two parties not merely as an ally, but as a vehicle.

3. When talking about fascism and American politics there is a problem with labels. Calling an opponent a fascist is a way to discredit them and shut down debate, and rarely has anything to do with historical analysis. Paxton several times invokes Orwell’s dictum that American fascism is not going to look like Hitler because it is going to wear authentically American clothes. This gets at the root of the issue. Knowingly or not, Trump’s campaigns ran plays from the fascist playbook: the rallies, the obsession with national decline, the appeals to family values, the framing of the world entirely in terms of allies and enemies. Historical reductivism is not a useful exercise, and a lot of those traits have deep roots in American society without the presence of self-identified fascists, though we certainly have those, too. The Republican Party also reoriented itself to accommodate Trump, who became its charismatic leader, but too narrow a focus on Trump misses the evolution of a party that has sought to sow mistrust in government since the 1970s. Was Reagan a fascist, then? Most people would say no. Was Trump a fascist? That’s a question without a productive answer.

4. For as much as I believe there is coordination in talking points between Republican party leaders and at least some of the right-wing media in the United States, it is striking the extent to which the driving force of nationalist rhetoric in this country comes from media personalities rather than from the party. Trump was a little bit different before his ban from social media, but even in that case there was a feedback loop between the two. While Paxton might point out that party unity in the fascist movements was mostly a creation of propaganda, those movements were nevertheless able to control that message. In the United States context, much of the nationalist fervor has been stoked by…television executives funded by billionaires? …talking heads? …agitators whose primary business is selling supplements? This is not to say that Republican politicians don’t make these statements, but, other than Trump, they seem better able to capitalize on the effects of the rhetoric than to fan the flames themselves. Offloading the rhetoric onto a third party also makes it easier to manipulate the system behind closed doors through voter restrictions and stacking the judiciary.

In sum, The Anatomy of Fascism is a good book to think with. Paxton might not be able to offer answers to every question, but this book provides exactly what he promises: a wealth of historical context that transcends a narrow focus on Germany and Italy in the 1930s.

ΔΔΔ

I recently reread Kitchen Confidential in advance of seeing the new documentary about Anthony Bourdain. I love this book, even if it isn’t quite as magical as on my first read. I also finished Sally Rooney’s Conversations with Friends, which I picked up because I have read how her books are beloved by critics. Told from the point of view of Frances, a bisexual college student who is close friends with her ex Bobbi and strikes up an affair with Nick, the husband of the writer Melissa who profiles Frances and Bobbi for their poetry performances, the novel traces the web of relationships between these four people. It is an intimate and revealing portrait, written in a way that makes me understand why Rooney appeals to critics, but I thought it was a little too assured that its close examination of banal details could lead to profound observations about human relationships.

Empire of Pain

A picture of Patrick Radden Keefe’s Empire of Pain.

Empire of Pain is a story of many grey areas and a bright line in the shape of a little pill. At its heart sits a single family that profited from the pain of millions of Americans.

Anyone familiar with the art world or higher education has heard of the Sacklers. The Sackler Library at Oxford, the Freer Gallery of Art and Arthur M. Sackler Gallery at the Smithsonian Institution, the Sackler Wing at the Metropolitan Museum. But, in recent years, the Sackler name has come to be associated with something much more negative: their company Purdue Pharma, its product OxyContin, and the opioid epidemic it helped jumpstart.

Patrick Radden Keefe’s latest book, Empire of Pain, an extension of a New Yorker article on the same topic, documents both sides of the Sackler legacy, examining how this family, the children of Jewish immigrants, made an enormous fortune intended to burnish their good name, but then helped create one of the worst public health crises in US history.

Empire of Pain is divided into three parts.

The first part focuses on the first generation of the Sackler dynasty. Arthur, Raymond, and Mortimer were the sons of Jewish immigrants in Brooklyn. All three attended Erasmus Hall High School and became doctors in an era when medical schools imposed severe quotas to exclude Jewish applicants. The oldest, Arthur, had already begun working in marketing while in high school and paid his way through medical school with a job as a copywriter at the advertising firm William Douglas McAdams, a double act that would come to define his career. After graduating, Arthur pursued a residency at Creedmoor Psychiatric Center where, joined by his brothers, he helped pioneer pharmaceutical approaches to treating mental illness.

However, Arthur also kept up his second career as a medical ad-man, first working at and then coming to own William Douglas McAdams. As if that were not enough, Sackler became a silent partner in L.W. Frohlich, McAdams’ competitor agency founded by his childhood friend, as well as joining his brothers and Frohlich in founding IMS, a medical information company, and the Medical Tribune, a direct-to-physician newsletter that, unsurprisingly, featured numerous advertisements for products repped by McAdams and Frohlich.

In Keefe’s telling, Arthur Sackler was a powerful personality, a tireless font of energy, and a man of numerous and varied tastes that led him to take art classes at Cooper Union. But he also thrived in the grey areas. He made his fortune playing a shell game with advertising, always disguising how involved he was in any given company, to the point that he transferred a large portion of his stake in one to his then ex-wife Else but continued to freely use “her” funds as he pleased. It was in this context that he purchased for his brothers an old pharmaceutical firm, Purdue Frederick, the maker of a small number of staple products like earwax removers and laxatives.

Charitable giving was always part of the plan. The brothers and Frohlich initially agreed that their heirs would receive some money, but once all four died their companies would pass into a charitable trust that would burnish their names. In practice, the charitable giving came in the same shades of grey. Keefe points out that Arthur Sackler liked having his name on things (so much so that he encouraged his third wife to take his name years before they married), but he always drove a hard bargain. For instance, he persuaded the Met to store his collection of Asian art on his behalf and often managed to defer the actual donations so as to extend the tax benefits of his gifts. In one case, he negotiated to purchase a gallery’s collection at its original 1920s price and donate it back to the museum as a way of infusing a little more money into the institution—only to turn around and claim the present value of the gift as a tax write-off, in a maneuver that might as well be out of Winners Take All. Keefe suggests that Arthur Sackler made money on the transaction.

Arthur and his brothers rode these grey areas into the upper crust of American society, but as early as the 1960s there were questions about their methods. In 1962, Arthur Sackler testified before a congressional committee chaired by Estes Kefauver that was then looking into the pharmaceutical industry, with particular questions about the ethics of advertising drugs and the process by which companies got their drugs approved. Arthur escaped unscathed, but these two questions remained unresolved.

The second part of Empire of Pain turns to the development of OxyContin in the 1990s (years after Arthur had passed away). The proprietary technology of OxyContin is the time-release coating that allows a powerful dose of opioids to be slowly released into the body. Purdue Pharma, now headed by Raymond’s son Richard, claimed that the slow release of the medication diminished the risks of addiction and thus that this was the perfect drug to address all sorts of chronic pain issues. With this marketing in hand, Purdue dispatched armies of sales reps across the country with a simple mandate: sell as much OxyContin as possible. After all, the clock was ticking until generic competitors would undermine profits. These were the same sales methods that Arthur had pioneered decades earlier, now turned toward a drug made by the family’s company.

Where the first two parts of the story are filled with domineering people who rode problematic practices to wealth, part three turns dark. Keefe uses court documents to show that Purdue (and the Sacklers’ other company, IMS) was aware of doctors over-prescribing pain medication and of all the ways that the drug could be abused. And yet, Keefe shows, the family to this day denies responsibility—for its false advertising, for its sales tactics, and for its role in inventing problems to be solved with an addictive substance. Instead, Richard and other company representatives blamed overdoses on the victims, claiming that criminals were the problem, not the company. The family used an army of lawyers to quash lawsuits, all while refusing to heed internal calls to diversify the company’s portfolio and voting themselves billions of dollars in payouts, leaving the company itself effectively broke.

Empire of Pain is an infuriating book. The standard defense of Arthur Sackler is that he passed away before the invention of OxyContin, and thus that the responsibility lies with Raymond, Mortimer, and particularly Richard, who was then in charge of the company. This is the same claim made by the younger generation, who insist that they be judged by their movies or actions without consideration of the family firm. Keefe’s argument, though, is that this was a family firm. Arthur’s methods of interacting with the FDA and of marketing drugs bled into Purdue Pharma, and the money then came out of Purdue Pharma and into the wallets of the younger Sacklers. There are some differences between the generations, sure, but Keefe suggests that leaning on them is wishful thinking—Arthur was in the analgesic business before his brothers were.

But the question of blame is only one facet of why I found this story infuriating. This is, in fact, the third book in the last two decades to make this connection, on top of a mountain of court filings. Rather, it is the sum total that makes it so frustrating: the grift, the marketing, the failures of oversight, the pain it wrought, and the lengths the family went to (to say nothing of the millions of dollars they spent) to deny responsibility. The Sackler family is correct that they are not the only ones profiting from the sale of opioids and that the opioid epidemic goes far beyond Purdue Pharma, but it is also hard to deny Keefe’s conclusion that the drugs and methods they pioneered have had profoundly toxic consequences.

ΔΔΔ

My reading continues practically without interruption. I have also finished Andrea Stewart’s excellent debut novel, The Bone Shard Daughter, which I plan to write about, and Yishai Sarid’s The Memory Monster, which I might not. The latter is a parable about an Israeli tour guide to Holocaust sites in Poland who becomes consumed by the memories of the Holocaust. The novel has a number of barbs, including children on tours saying that they need to model themselves after the Nazis and do this to the Arabs, and the narrator’s frustration with how the Holocaust has become symbolic, to the point where people associate it with Poland rather than Germany and thus forget the humans at the camps in all of their complexity, but I found the story itself a little shallow.

I am now reading Megha Majumdar’s debut novel, A Burning.

A World Without Email

Have you not received emails flow chart, from PhD Comics.
Original Comic

Email is a brilliant tool. It takes virtually no effort or time to send an email that conveys a bit of information to one or more recipients almost anywhere in the world. They can then respond at their own pace, creating a thread that records how the conversation unfolded.

But email is also awful, a never-ending stream of small bits of information that can cause important tasks to get lost in the deluge.

I receive a relatively small amount of email compared to a lot of people, but I realized a few months ago that one of the great hidden costs of adjunct teaching at several different schools is that it dramatically increases the amount of necessary email management. For the past year or so, I have managed three or four professional accounts on top of the personal one that I use for work unrelated to my academic employment. Each message only requires reviewing it, determining whether it demands a response, and then deleting it, but that process repeats across multiple accounts several times a day.

Then there are the email conventions. Email should allow for intermittent correspondence, but it has become practically an extension of instant messaging, and the group-think of lengthy email threads encourages people to write longer and longer responses that often defer the responsibility for actually making decisions. When the chair of a committee I am serving on needed to finalize a proposal, she skipped the email threads and asked several people who had responded to a pre-circulated draft to just sit down in a Zoom meeting and iron out our submission. In an hour, the three of us finished what could have dragged on indefinitely over email.

These are exactly the problems that Cal Newport tackles in his A World Without Email. His basic argument, which is an extended version of his “Is Email Making Professors Stupid?” from 2019 in the Chronicle of Higher Education, is that email and other “hive-mind” technologies like Slack are sapping the productivity of knowledge workers in nearly every sector.

The argument goes as follows: these hive-mind technologies were designed on the premise that more, easier communication is always better. You can better stay in touch with clients and customers; managers can better keep tabs on what is happening; workers can quickly get answers to questions. The technologies succeeded. They revolutionized the workplace and offices became increasingly streamlined. And then something happened. Email started to interfere with the smooth functioning of an office. Workers started spending less time doing what Newport terms “deep” work and more time handling managerial tasks like responding to emails and writing lengthy memos. Email allowed more immediate responses to clients, so clients began demanding more access, transparency, and immediacy. Workers, now able to check with a manager before making any decision, did so, further bogging down processes, and anxiety increased.

According to Newport, a computer science professor at Georgetown University, the problem is that these hive-mind technologies are actually too efficient. It is too easy to fire off an email, passing off responsibility for a decision or keeping everyone in the know. But that ease carries an asymmetric cost: it usually costs little for the sender to send an email, but a lot for the recipient to wade through dozens of low-effort emails.

(Cases where there is a wide power differential and the sender is unsure of how their missive will be received are, of course, an exception.)

Average time spent composing one e-mail. Professor: 1.3 seconds; grad student: 1.3 days.
Original Comic

The flood of emails or other messages is likewise as distracting as the never-ending stream of updates from social media, taking our eminently distractible minds away from whatever it is we are working on.

Newport’s solution for these woes is not quite a world without email — that is a utopian impossibility — but to get as close to that as possible by putting in place systems that allow for asynchronous collaboration and communication without requiring an immediate response. Email will continue to exist and serves some important functions, but it should be dramatically cut back in both volume and length.

A lot of Newport’s ideas come from and are tailored to the startup world, but they have a lot of crossover applicability to higher education (which is still my field).

For instance, Newport gives examples of employers who shortened the workweek contingent on employees dedicating their entire time on the clock to actually working, or who structured schedules so that some or all employees are not responsible for email until after lunch. The key, he argues, is setting expectations and holding to them. If a project manager is the single contact person for an entire project, then there simply is no way to contact the rest of the team by email. Better yet would be a centralized project board where anyone who needed an update on what was happening could simply look. If the system uses short daily (or weekly) in-person meetings to give updates, then the query can wait until that meeting. Any such system, Newport argues, would require empowering workers to make decisions within their purview, but would create better outcomes long-term.

I don’t do most of my work in a collaborative workspace like the ones Newport describes here, but many of these same principles apply. Take my daily writing time. I can tolerate minor distractions (animals, the bustle of a café, music), but nothing narrative, no discussions, and certainly no digital updates. For those blocks of time, usually an hour but sometimes longer, I turn off my social media, close my email, and tune out the world. Anything that arrives while I’m writing can wait.

Other suggestions in A World Without Email are more directly applicable.

One example: the “scrum” status meeting. These meetings happen several times per week and are held standing up to encourage brevity. At each meeting, the team members answer three questions: (1) What have you done since the last scrum? (2) Do you have any obstacles? (3) What will you do before the next scrum? If a team member needs a longer meeting, it can be scheduled at this time. Newport describes the scrum as an ideal way to manage an ongoing project in a company, but I could see using a modified version (maybe twice a week instead of daily) with students working on theses and independent projects. These projects are usually supervised through long, regular one-on-one meetings, but the result is siloing the educational process and adding significant time commitments to a weekly schedule. By contrast, a scrum might show students that they are not working on these things in isolation, the regular contact builds low-stakes accountability, and making these standing meetings cuts down on scheduling emails.

Newport also argues for automating and outsourcing as many processes as possible in order to save time that could be better spent doing deep work — or no work. Sometimes this requires money, as when he describes hiring a scheduler or administrative assistant to handle tasks that might not be in your wheelhouse. I appreciated this suggestion, even if it struck me as analogous to how many basic necessities in life are cheaper if you can afford to spend a larger total amount up front by buying in bulk.

More relevant to my position was the suggestion to automate as many tasks as possible.

At the end of the most recent semester I floated an idea to use flexible due dates for major assignments in my classes, but I had been thinking about how to actually administer the policy without a flood of emails. The answer, I think, is to create automated systems. My current thought is to create a Google Form for every major assignment, with a link embedded in the assignment guide and on the course website. To receive an extension on that assignment, a student just fills out the form before the due date, answering a couple of questions: name, assignment, a multiple-choice question for how long an extension they want, and maybe a brief explanation if they selected “other.” Rather than collecting however many emails to respond to, I will have all of the information for each assignment in one place. Likewise, even if I return to grading physical papers, I will request two submissions: an online back-up that counts for completion, and then a physical copy that can be turned in the following day for grading. Each of these policies requires a small additional step at set-up, but could streamline the actual process, and I hope to find other processes to similarly automate in my day-to-day job and should I find myself leading a committee.
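
Not from Newport’s book, but to make the idea concrete: here is a minimal sketch of how those form responses could be tallied once Google Forms exports them as a CSV. The file name and column headers below are hypothetical placeholders for whatever the form actually produces.

# A minimal sketch: tally extension requests collected by a per-assignment
# Google Form, assuming the responses have been exported as a CSV. The file
# name and the column headers ("Name", "Assignment", "Extension length",
# "Other explanation") are hypothetical placeholders.
import csv
from collections import defaultdict

def summarize_extensions(csv_path):
    """Group extension requests by assignment so they can be reviewed in one pass."""
    by_assignment = defaultdict(list)
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            by_assignment[row["Assignment"]].append(
                {
                    "name": row["Name"],
                    "length": row["Extension length"],
                    "note": row.get("Other explanation", ""),
                }
            )
    return dict(by_assignment)

if __name__ == "__main__":
    for assignment, requests in summarize_extensions("extension_requests.csv").items():
        print(f"{assignment}: {len(requests)} extension request(s)")
        for r in requests:
            note = f" ({r['note']})" if r["note"] else ""
            print(f"  - {r['name']}: {r['length']}{note}")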

My only major complaint about the book is mostly a function of the intended audience. My issue was with how Newport framed productivity as an abstract but ultimate ideal, which led to consequences in the text that run crosswise to what he is actually arguing. At one point Newport talks glowingly about an obsolete office setup where secretaries handled mundane tasks like scheduling meetings, transcribing memos, and handling routine communications. His point is that removing these tasks frees the knowledge worker to do the deep work they are being paid for, but the value to that worker is given significantly more space than are the mechanics of hiring someone at a fair wage to do the job. He believes the latter matters (or says so in the text), but mentions it only in passing. Likewise, the value of deep work, Newport argues, is that you can reject the pressure to work exceedingly long hours, but the focus is on how to produce more. I understand why he wrote the book this way, but given the long-term trends showing how productivity has vastly outpaced wages, I’m not convinced that productivity ought to be the primary objective, and I found the evidence for improved workplace satisfaction to be a much more compelling case for cutting back on email use.

A World Without Email is a manifesto, but a timely one that has given me a lot to think about going into my new position since a new beginning is a great time to implement the new processes and protocols that he suggests.

ΔΔΔ

This post flitted between the kind where I think about academia and the kind where I write about books, so I might as well continue here. I just finished Andrea Stewart’s excellent debut novel, The Bone Shard Daughter, and am looking forward to starting Patrick Radden Keefe’s Empire of Pain next, an investigation into the Sackler family and the opioid crisis.