Learning to Run Again

This morning I woke up before my alarm. I grabbed my phone to turn that alarm off and checked a few things before getting out of bed. Then I puttered around the house, reading a novel and stretching by turns for a little more than an hour, just long enough to steep and drink a big mug of tea.

Then I laced up my running shoes and set out.

My current bout of running began about a month and a half ago. I have never been as serious or successful a runner as my father and brothers, who for a number of years now have run marathons together, but this is not my first time running. In high school, I would go for runs with my father and ran a few local 5k races. Early in graduate school I tried running again. It was during this period that I reached my longest distances, running about five miles at least once a week and topping out at about eight miles before running into a leg injury. I tried a “run the year” challenge a few years ago and contributed 173 miles to my team’s total, including a few miles run early one morning when I couldn’t sleep while on a job interview. Then injuries. I tried again after the pandemic closed the gym where I exercised. My last attempt, shortly after moving last summer (and, in retrospect, after holding my foot on the accelerator of a moving truck for many hours), ended abruptly with sharp pain in my lower calf less than a quarter mile into a run.

I am a slow runner, particularly these days. I am also not running very far—just a little under two miles today. But this is okay. My focus right now is on form. On my gait, and on trying to keep it in line with how I imagine I run barefoot, since I have suffered far more injuries while running in shoes than I ever did playing ultimate barefoot, which I did into my 30s. Correlation need not be causation, but so far, so good. I am running slowly and carefully, and celebrating each run for ending uninjured rather than for reaching a particular distance or speed. Those will come, but only if I can stay healthy.

I like the idea of running more than I actually like running. Rather, I would like to be someone who likes running, who achieves that runner’s high, who runs an annual marathon. But I spend my runs thinking about how everything hurts and, recently, fretting about whether this footfall will be the one when something gives out and I have to start over. I can also only compete against myself while running, and pushing myself this way is exactly what I’m trying not to do.

By contrast, I used to play basketball for hours every week. My slowness didn’t matter as much on a confined playing surface where I could change speeds and understand the space. And since I didn’t like to lose, even in a silly pick-up game, I could just lose myself in the game and not think about what hurt.

And yet, running is what I have right now, so running is what I’m doing alongside a daily yoga routine.

My return to running also prompted me to finally pull Christopher McDougall’s Born to Run off my to-read shelf. McDougall describes himself as a frequently injured runner, so I thought it might unlock the secret to running pain-free. In a way, it might have.

The centerpiece of Born to Run is a 2006 race in Copper Canyon in the Sierra Madre Mountains between a motley crew of American ultramarathon runners, including Scott Jurek, one of the best in the world at the time, and some of the best Rarámuri (Tarahumara) runners, arranged by a mysterious figure called Caballo Blanco (Micah True).

(The race went on to become an annual event, though its founder died in 2012.)

It is an incredible story. Rarámuri runners had made their appearance in ultramarathon circles at the Leadville 100, a high-altitude ultramarathon in Colorado, in 1993 and 1994. A guide and race director named Rick Fisher rolled up to the race with a team of Rarámuri for whom he was the self-appointed team manager. The Rarámuri runners won both years, setting a new course record in the second race, before deciding that putting up with Fisher’s actions wasn’t worth their participation.

(An article from 1996 in Ultrarunning about a race in Copper Canyon in which True also participated acknowledges Fisher’s “antics,” but suggests that they didn’t end his relationship with the tribe.)

This story, however, is just the hook. Born to Run is an extended argument for a minimalist running style that exploded in popularity following its publication. McDougall’s thesis is that modern running shoes, and the industry predicated on selling those shoes, cause us to run in ways that lead to injuries. The argument is somewhat anecdotal, relying on personal experience and stories of incredible endurance from athletes before the advent of running shoes.

The Rarámuri, whose name means “The Running People,” are exhibit A. The Rarámuri are a tribe that lives in isolated villages deep in the Sierra Madre Occidental, in the Mexican state of Chihuahua. The terrain makes long-distance travel a challenge, so the Rarámuri run. But they also run for ceremony and sport in a ball game called rarajipara, in which teams work to kick a ball an agreed-upon distance, chasing it down after each kick. All the while, runners wear only traditional sandals called huaraches.

My own experience with running makes me sympathetic to McDougall’s argument, and I am seriously considering getting a pair of zero-drop shoes and transitioning in this direction for my footwear. However, the more I read about running injuries, the more it seems that the answers might be more idiosyncratic. That is, there is a lot of conflicting evidence. While some studies suggest physiological advantages to barefoot running, others point out that not all barefoot runners run with the same gait. A number of studies suggest that barefoot running has shifted the types of injuries (aided perhaps by people transitioning too quickly) rather than reducing them. I think that barefoot running could be good for me, but all of this makes me think that I shouldn’t ditch the running shoes for every run just yet.

While I was reading Born to Run, a friend suggested that I read Haruki Murakami’s What I Talk About When I Talk About Running, which connects my current focus on running with my ongoing obsession with writing.

In addition to being a novelist, Murakami is a marathoner and triathlete who describes how his goal is to run one marathon a year. This memoir is a collection of essays on the theme of running and training, and, unlike Born to Run, is not meant to be an argument for a particular type of training.

I think that one more condition for being a gentleman would be keeping quiet about what you do to stay healthy.

Nevertheless, I found What I Talk About When I Talk About Running to be particularly inspiring. Murakami is a more successful runner than I ever expect to be, even though I’m only three years older now than he was when he started running. And yet, I found something admirable about his approach. Running, like writing, is just something Murakami does, and he doesn’t think about a whole lot when he is on the road. His goal in running is to run to the end of the course. That’s it. He gets frustrated when he can’t run as fast as he used to, but he is not running to beat other people, and he uses the experience to turn inward.

And you start to recognize (or be resigned to the fact) that since your faults and deficiencies are well nigh infinite, you’d best figure out your good points and learn to get by with what you have.

But it should perhaps not come as a surprise that I highlighted more passages about writing than I did about running, though Murakami makes a case that there is broad overlap between a running temperament and a writing one. Both activities require long periods of isolation, and in both, success is not synonymous with “winning.” Doing them is more important than being the best at them.

I don’t think we should judge the value of our lives by how efficient they are.

A useful reminder.

ΔΔΔ

I have had a hard time writing about books recently. Before these two books, I got bogged down in Olga Tokarczuk’s The Books of Jacob, which I am still trying to process, and then read Ondjaki’s The Transparent City, which is a very sad story about an impoverished community in Luanda, Angola. I would like to write about these, but I’m not sure that I have anything coherent to say, and June has turned out to be much busier than I had hoped—last week I was at AP Rating in Kansas City, then I wrote a conference paper that I delivered yesterday, and now I’m staring down a book deadline and other writing obligations. By the time I have time, I might be too far removed to come back to those books. I am now reading Christine Smallwood’s The Life of the Mind, a novel about adjunct labor and miscarriage that highlights the lack of control in both situations.

The End of Burnout

Many authors tell people who already feel worn out and ineffectual that they can change their situation if they just try hard enough. What’s more, by making it individuals’ responsibility to deal with their own burnout, the advice leaves untouched the inhumane ethical and economic system that causes burnout in the first place. Our thinking is stuck because we don’t recognize how deeply burnout is embedded in our cultural values. Or else we’re afraid to admit it. Insofar as the system that works people to the point of burnout is profitable, the people who profit from it have little incentive to alter it. In an individualistic culture where work is a moral duty, it’s up to you to ensure you’re in good working order. And many workers who boast of their hustle embrace that duty, no matter the damage it does. In a perverse way, many of us love burnout culture. Deep down, we want to burn out.

I resemble this statement, and I don’t like it.

By the definitions established in Jonathan Malesic’s recent book The End of Burnout, I have never burned out—at least not completely. I have never reached a point of absolute despair that rendered me incapable of going on, which, along with utter exhaustion and reduced performance, marks burnout. The other two, however…

I wouldn’t say that I worked hard in high school, at least on the whole. There were projects that I worked at, and if something interested me I would work hard, but not so much overall. Midway through my undergraduate career something snapped. Seemingly overnight I became a dedicated, if not efficient, student. I divided everything in my world into “productive” activities and unproductive ones and aspired to spend my waking time being as productive as possible. School work obviously counted as productive, but so too did exercise and investing time in my relationships. Spending time not doing things was deemed unproductive.

At first this was innocuous enough. I was young, and productive time included fun things, right? My numerous and varied interests led me to do all sorts of things and I was determined to do them all. By the time the second semester of senior year rolled around this was almost a mania: I was working, running a club, taking a full course load, working on two research projects, and auditing extra classes that just looked interesting to me, as well as exercising and generally spending time on the aforementioned relationships.

At a time when the stereotypical college student develops a case of senioritis, going through the motions while looking forward to what was next, I somehow managed to define sleep as “not productive.”

Seriously.

I cringe thinking about it now, but I went through most of a semester averaging about three hours of sleep a night. I don’t think I ever pulled an all-nighter, but most nights I only got one or two hours: I would go to bed around midnight, get up at 1:30 so I could grab coffee and food before the late-night place closed, work until the gym opened, exercise, shower, go to class, and then either do homework or go to my shift at work. I would get eight hours or so on Fridays after work and whatever recreational activities I had planned. Several people that I know of had conversations about when I was going to collapse, though not within earshot. It was bad. Trust me when I say that you shouldn’t do this.

According to the journal I kept at the time, in an April entry titled “I guess I did need to sleep,” I slept for 13 hours straight.

I have never done anything that self-destructive since, but there have been numerous times when I have edged in that direction.

  • The year after college I ended up working up to 90 hours a week, often for weeks at a time without a day off, until I just couldn’t physically keep it up, at one point sleeping for more than 12 hours and forcing myself to take days off, even though the nature of the job made that difficult.
  • I worked almost 30 hours a week on top of my school responsibilities (a “full” course load and grading for a class) while completing my MA.
  • I nearly snapped while completing the work for one of the toughest seminars I took in grad school the week that I was also taking my comprehensive exams.
  • Another semester, while cobbling together jobs as an adjunct, I took on so much work (six classes, one of which was nearly twice as much work as I thought when I accepted it) that I had to stop writing entirely just to stay on top of the teaching.
  • The semester after that I developed (probably anxiety-induced) GERD and broke out in hives.
  • I frequently have to remind myself that taking one day off a week is okay, let alone two. At least I usually sleep 7–8 hours a night these days.

Lest it sound like I’m bragging, these are not badges of honor. They are symptoms of the perverse relationship with work that Malesic describes, wedded with ambition and an anxiety that oscillates between imposter syndrome and a deep-seated fear that I’ll once again become someone who does nothing if I let up even a little. The worst part: my behavior took place within systems that celebrate discipline, but it was almost entirely self-inflicted.

However, I have never burned out like Jonathan Malesic.

Malesic had achieved his dream of becoming a tenured professor of religion and living the life filled with inspirational conversations with young people that he imagined his own college professors had lived. But that life wasn’t as great as he imagined. His students were apathetic, the papers uninspired and, at times, plagiarized. There were meetings and committees, and his wife lived in a different state. In short, the job didn’t live up to his expectations, which, in turn, caused his life to fall apart. His job performance lagged. He snapped at students. He drank too much and found himself incapable of getting out of bed. And so, eventually, he quit.

The End of Burnout is an exploration of the forces that caused his disillusion with his job and possible solutions to escape it. Put simply, Malesic’s thesis is that two features of the modern workplace cause “burnout.”

  1. People derive personal meaning and worth from their jobs.
  2. There is a gulf between the expectations and reality of those jobs.

That is, there is a broad expectation in the United States that your job determines your worth to society. This is obviously not true, but it is signaled in any number of ways, from making health insurance a benefit of employment, to looking down on “low status” jobs like food service, to the constant expectation that you ought to be seeking promotion or treating yourself like an entrepreneur. But if your worth is wrapped up in your job, then you might enter it with a certain set of expectations that are out of sync with the conditions—doctors who want to heal people and end up typing at a computer all day, or a professor who got into teaching because of Dead Poets Society and ends up teaching bored, hungover students in general education classes. On top of it all, the responsibility for “solving” the issue is then passed on to the worker: you’re just not hustling hard enough. Have you tried self-care?

The End of Burnout is a thought-provoking book. Malesic examines the deep historical roots of phenomena that might today be called burnout, discusses the pathology of an ambiguous phenomenon that is likely overused, often pointing to acute exhaustion rather than true burnout, and explores how social pressures (e.g. the moral discourse that equates work with worth) exacerbate the phenomenon before turning to alternate models of work and human dignity.

I picked up The End of Burnout for a few reasons.

Most obvious, perhaps, is my toxic relationship with work, as outlined above, to the point where I thought that I had burned out on multiple occasions. Based on the descriptions Malesic provides, I was usually acutely exhausted rather than truly burned out, with the result that, at least so far, I have always been able to bounce back with a few weeks or months of rest.

(The one exception might be the restaurant work straight out of college, but even that did not stop me from working in another franchise in the same chain for two more years while attending school.)

Cumulative exhaustion can lead to burnout, but I came away unconvinced that I have even really been walking down that path. I have been frustrated, of course, and can tell that I am creeping toward exhaustion when I start excessively doom-scrolling on Twitter, but I did not relate to the sheer disillusionment Malesic described. When I have considered other employment options over the past few years, it has always been because of a dearth of jobs.

The main difference, at least to this point, is that I have never viewed this job through rose-colored glasses. Writing about history is something I see as a vocation, but I have approached the teaching and associated work as a job, albeit one that aligns with those other aspects of my life and thus is more enjoyable than some of the others I have had.

At the same time, I have noticed a shift in my relationship to hustle culture now that I am in my mid-30s. I still work hard and have certain ambitions, but increasingly they center on finding ways to spend my time reading and writing about things I find interesting and important—and on having employment with enough security, money, and free time to do that.

Likewise, the idea of treating oneself as an entrepreneur, which Malesic identifies as an element connecting worth to employment, has always left a sour taste in my mouth. When people tell me that I could (or should) open a bakery, I usually shrug and make some polite noises. I have managed a restaurant in my life and have very little interest in doing so again. I bake because I like the process and enjoy cooking for people I like, not because I want to turn it into a business with all of the marketing, bookkeeping, and regulations that would entail.

(I have also considered trying to turn my writing into a subscription business, but I find that incompatible with the writing I do here. If I made a change, it would involve some sort of additional writing with a regular and established schedule—say, a monthly academic book review for a general readership with a small subscription fee designed to cover the cost of the book and hosting. A thought for another day.)

However, I also picked up The End of Burnout because I am worried about the effect that this culture has on my students. Nearly every semester I have one or more students who report losing motivation to do their work. This past semester one student explained it as a matter of existential dread about what he was going to do with his degree, but it could just as easily be anxiety or concern over climate change or the contemporary political culture or school shootings.

I have long suspected what Malesic argues, that burnout is systemic. In a college context, this is why I get frustrated every time a conversation about mental health on campus takes place without addressing those systemic factors. Focusing on the best practices and workload for an individual class is (relatively) easy, but it is much harder to account for how the courses the professor is teaching or the students are taking interact with each other. I am absolutely complicit in this problem. One of my goals for next academic year is to reexamine my courses because the reality is that the most perfect slate of learning assessments is meaningless if the students end up burned out. I can’t fix these issues on my own, but Malesic’s book brought into greater focus why I need to be part of the solution for my own sake and my students’. I don’t ever want to let one of my students make the mistakes I did when I was their age, which probably explains why the most common piece of advice I give is “get some sleep,” and I can’t help them if I am also in crisis.

The final part of The End of Burnout turns to possible solutions. Perhaps unsurprisingly given his background as a professor of religion, this discussion frequently focuses on groups with a Christian bent. He spends a chapter, for instance, talking about how various Benedictine communities apply the Rule of St. Benedict to tame the “demon” of work. Some groups strictly follow the Rule, limiting work to three hours so that they can dedicate the rest of their lives to what really matters: prayer. Other groups, like several in Minnesota, are less rigid, but nevertheless use similar principles to divorce work from worth and to allow one’s service to the larger community to change with time.

The other chapter in this section was more varied and included useful discussion from disability activists, but it also featured a prominent profile of CitySquare, a faith-based Dallas non-profit with uniquely humane policies around work expectations and support for its staff. These examples sat awkwardly with my agnostic worldview, as someone who believes that we should be able to create a better society without religion, and particularly without Christianity. However, Malesic’s underlying point is not that we all ought to follow the Rule of St. Benedict. Rather, he makes a case that each profile in its own way can help us imagine a culture where the value of a person is not derived from their paycheck (or grade).

To overcome burnout, we have to get rid of the [destructive ideal of working to the point of martyrdom] and create a new shared vision of how work fits into a life well lived. That vision will replace the work ethic’s old, discredited promise. It will make dignity universal, not contingent on paid labor. It will put compassion for self and others ahead of productivity. And it will affirm that we find our highest purpose in leisure, not work.

Malesic’s vision here is decidedly utopian and hardly new, and his warnings about the consequences of the automating workplace are a modern echo of 19th-century choruses. But the ideals he presents are worth aspiring to nonetheless. As long as we work within a depersonalizing, extractive system that treats people as interchangeable expenses against the company’s bottom line, that system will not only continue to grind people down and spit them out, but also contribute to nasty practices elsewhere in society, like treating food service workers with contempt. Severing the connection between personal worth and paid work won’t solve every problem, but it is a good place to start.

Bring the War Home: The White Power Movement and Paramilitary America

Black and white image of the cover of Kathleen Belew’s Bring the War Home.

On January 6, 2021, a crowd of people stormed the US Capitol Building in order to stop the certification of the electoral votes that made Joe Biden president. This was the result of actions meant to undermine faith in the election and of polarization heightened by the present media ecosystem, but it was also the culmination of decades of growing extremism among white nationalist and anti-government militia movements. That growth is the subject of Kathleen Belew’s Bring the War Home: The White Power Movement and Paramilitary America.

While there has been a pronounced strain of separatism in the United States for as long as there has been a United States, Belew locates the modern iteration in the resolution of the Vietnam War in the 1970s. White power was at the heart of the militia movement from its inception, but she argues that the perceived betrayal in Vietnam prompted a very specific metastasis beyond bog-standard racism. It prompted people like Louis Beam to form militia groups with the stated intent of continuing the war. Naturally, they found common cause with groups like the Knights of the Ku Klux Klan that David Duke founded in 1975.

In these early years, the militia movement claimed to be fighting against insidious forces and on behalf of the United States. They were soldiers taking the war into their own hands. However, Belew traces how this resentment and frustration transformed over the course of the 1980s until their orientation had turned 180 degrees. By the start of the 1990s, militia groups operating around the country–and not merely at places like Ruby Ridge–saw themselves as soldiers in a war on behalf of white people against the United States, which they referred to as the Zionist Occupation Government. She concludes with a chapter on Timothy McVeigh and his terrorist attack in Oklahoma City on April 19, 1995, though that incident clearly did not put an end to the movements Belew documents.

At this point, I feel like I need to offer a caveat. I finished Bring the War Home a month ago, and while I take copious notes on the books I read for “work,” I take only haphazard notes on books that I read for “fun.” This book technically falls in the latter category even though parts of it will undoubtedly make their way into my US history classes. I meant to write this post within a day or two of finishing the book, but it turns out that writing here is a lower priority than, say, my classes or work on academic publications. All of this is to say that the following analysis is going to be more a reflection on what I saw as a couple of key themes and less an actual review.

The first thing that stood out to me in Bring the War Home was how Belew traces multiple loosely connected organizations joined by a common sense of purpose and, sometimes, marriage. The various groups saw themselves as part of the same conflict, and Belew shows how they used the early internet to support one another, but the absence of a hierarchy meant that quashing one did nothing to slow the spread of the movement. In fact, efforts by the federal government to address the militia movement in places like Ruby Ridge only galvanized other cells and sympathizers. This part of the book sometimes meant trying to keep track of a web of names, but it effectively highlighted the challenge of addressing the militia movement.

Second, perhaps the most striking chapter in Bring the War Home was “Race War and White Women.” In this chapter, Belew shows how white women were of central importance to the militia movement. That is, they claimed to be defending the virtue of vulnerable white women who, in turn, were expected to bear white children. These vulnerable white women were both an abstract ideal, rather like the love interests in D.W. Griffith’s Birth of a Nation, and people who played a concrete role in spreading militia ideas. In the case of the Fort Smith sedition trial in 1988, which ended with the jury rendering a not-guilty verdict, two of the white women on the jury subsequently entered into public relationships with defendants.

(One of the key witnesses in that trial went on to murder three people at Jewish centers in Overland Park, Kansas in 2014.)

Bring the War Home is a terrifying book in many ways. It brings into focus a strain of extremism in the United States that has been steadily growing in prominence over the past few decades. This movement coalesced around racism, antisemitism, and Christian identitarianism, took advantage of new forms of media, and, as Belew put it on the first anniversary of January 6, ruthlessly seizes any opportunity. And yet, while these militia movements have themselves shed blood in their war against ZOG and fully intend to do so again, I can’t help but feel that their presence reveals a bigger and more insidious danger. The militia movement emerged from a specific knot of beliefs, but its growth and evolution stems in no small part from how many people not directly affiliated with any tentacle of the movement express sympathy for its positions. That is, the militia movement won’t win its war through force of arms, but through a steady campaign of radicalization that plays on preexisting prejudices. The fact that its ideas can be found elevated into nearly every level of government demonstrates that it is working.

ΔΔΔ

Crunch time on getting my book together meant giving almost all of my spare time to that, but I have still been reading a little bit every day because it helps me feel normal. Since my last one of these posts I finished Trevor Strunk’s Story Mode, a literary analysis of video games that had some interesting things to say about the evolution of the medium; Sofia Samatar’s A Stranger in Olondria, which had rich descriptions of place and a clever story structure but that I ultimately found disappointing in terms of the characters and the plot; James S.A. Corey’s Nemesis Games (Expanse, book 5); and S.A. Chakraborty’s Empire of Gold. I intend to write about the latter two series at some point. Currently, I am reading Tasha Suri’s The Jasmine Throne.

Generous Thinking

A few years ago I had a student who asked me to write a letter of recommendation for graduate school. She was a shoo-in. Two of the people writing letters for her were the professors she intended to work with, so I was just there to fulfill the requirement. She had taken several classes with me and done well, so I was flattered to be asked and happy to help. When orientation rolled around the next summer, my former student sent me an email to thank me again for the letter I wrote and to express how nervous she was about the coming semester. I thanked her and gave her my best pieces of advice about graduate school.

It will seem, I said, like your peers know everything. They strut around like peacocks, name dropping scholars and theories and schools of scholarship. But this doesn’t mean that they are smarter or more prepared for graduate school than you are. Maybe they have a deep background in that topic. Maybe they restrict their comments to their particular field of research. Maybe they know just enough to name drop Foucault trusting that you won’t know enough to challenge them.

When I came to graduate school, I was the second-youngest person in my cohort. Where many of my peers had already earned MA degrees or spent years teaching, I had spent my year after graduation managing a Quiznos restaurant and desperately trying to keep my Greek fresh. I was also the only person in my cohort who studied ancient history in a program that was overwhelmingly made up of American historians. This meant that in most conversations I was on their turf.

The best thing you can do, I told my former student, is to resist the temptation to treat graduate school as a competition. Instead, approach the books you read, the classes you take, and the conversations you have with an open mind. Grad school seminars train students to strip books down to their foundations in order to critique the scholarship on everything from the framing to the evidence. These are important skills for a scholar to have, of course, but a more important skill is to understand what the author is doing. Anyone who goes to graduate school can recall an example where a person holding forth on the myriad flaws of a particular book was doing so based on a relatively minor point at best or without having read the whole book at worst.

I have seen both. At least twice I tried to discredit a book based on minor errors—the small issues might have been indicative of larger problems, but it was a mistake not to start with the bigger picture. Another time I watched as someone went on at length about how a book was invalid because it didn’t cover a particular topic…that the author covered in the section of the book that she had not read. Either way, not a good look.

Advice like what I gave to my former student lies at the heart of Kathleen Fitzpatrick’s Generous Thinking. Her core thesis is that the culture of critique and the obsession with prestige hierarchies have created an environment where knowledge production is treated like a competition and where tearing down others is as valuable as producing anything. The very structures of the American university system (as distinct from, for instance, community colleges) encourage this behavior:

The entire academic enterprise serves to cultivate individualism, in fact. Beginning with college applications, extending through graduate school admissions, fellowship applications, the job market, publication submissions, and, seemingly, finally tenure and promotion review, those of us on campus are subject to selection. These processes present themselves as meritocratic…in actual practice, however, those metrics are never neutral, and what we are measured against is far more often than not one another—sometimes literally.

The pressures that Fitzpatrick identifies are all exacerbated in the current Age of Austerity because austerity means even more competition for fewer resources. However, as Fitzpatrick rightly points out, falling back on prestige hierarchies and competition is a self-defeating proposal that undermines the very project we are ostensibly setting out to pursue.

Her solution is to double down on “generosity as an enduring habit of mind, a conversational practice” (56). This means a host of things for Fitzpatrick, from developing a vocabulary of shared values to working in public to realigning the university toward community and public service, to simply learning how to listen.

In principle, I agree with everything Fitzpatrick wrote in Generous Thinking and seek to embody most of the practices.

In practice, I found Generous Thinking frustrating. The subtitle of this book promises “A Radical Approach to Saving the University.” Certainly there is a radicalism in the form of the book’s optimism and in some of the proposals to change university policies away from those that put scholars in competition with one another, but there were times when I also found it to be missing the forest for the trees. Fitzpatrick admits in the preface that this is a book informed by her position at a large land-grant institution, which means a secondary focus on institutions like community colleges, but I found the blind spots to be greater than she admits.

In particular, I found it shocking to frame a book as a way to save the university while giving almost no thought to how this would affect contingent faculty. That is, I endorse everything she wrote as a matter of praxis, but I wanted more acknowledgement that many people are not in a position to carry out these proposals. There is absolutely something here that contingent faculty can learn from, but I couldn’t help but feel that in her effort to work toward an academic community built on generosity Fitzpatrick had managed to largely disregard the second-class academic citizen. It isn’t that she is unaware of these problems—indeed, she mentions the jobs crisis on at least one occasion (18)—but other than (rightly, in my opinion) showing how public engagement can help catalyze stakeholders into investing in institutions, I found little meaningful consideration of either how generous thinking would change the underlying structural realities or how this would play out for overworked and underpaid contingent faculty who often already teach more classes than their full-time colleagues while also hunting for their next gig. I hope Fitzpatrick’s suggestions would make a difference, and the core ideas absolutely ought to be embraced, but I nevertheless came away with the impression that this was not so much generous as wishful thinking.

ΔΔΔ

I have rather a lot going on right now. Not only have I hit the point in the semester where I have a never-ending stream of assignments to grade, but I am also working on finishing the manuscript for my first book and keeping up with a few other research and editing projects. This means I am back to often choosing whether to spend my spare time reading or writing about the books I read. For the most part, reading wins out, though I still intend to write about what I’ve read, albeit at a delay (I finished Generous Thinking almost a month ago). I still intend to write about Yoon Ha Lee’s The Machineries of Empire series and have since finished Maaza Mengiste’s brilliant The Shadow King and Susanna Clarke’s Piranesi, both of which made it onto my soon-to-be-published 2021 list of favorite novels, as well as making my way through Neil Gaiman’s The Sandman, which I will likely write about once I have finished the series. I am now reading Isabel Wilkerson’s Caste, an incisive look at race in America that threads together the US, India, and Nazi Germany.

Eat a Peach

David Chang’s Eat a Peach cover

David Chang is probably best known for his culinary empire Momofuku, which Wikipedia tells me at this point includes dozens of restaurants. I have only eaten at one, the dessert-themed Milk Bar in Washington DC. In Eat a Peach, Chang readily admits that everything else that he does—this memoir, his cookbook, Ugly Delicious, and a dozen other endeavors—is designed to put butts in those seats. At least under normal circumstances, since, like so many food establishments, Momofuku’s business has been entirely upended by COVID-19.

Eat a Peach, written with Gabe Ulla, is thus an advertisement for Momofuku that puts Chang and his theories of deliciousness front and center. Obviously, food is everywhere—Chang is a chef and his public persona on shows like Ugly Delicious filters the world through food-colored glasses as an heir to the late Tony Bourdain.

But what particularly stood out to me about this memoir is how it is a study in binaries.

Eat a Peach is divided into two parts. Its first half is a roughly linear narrative of his upbringing in a Korean-American household, his success with golf that helped get him into Georgetown Prep and his subsequent flameout from the sport, and his brief period working in finance, before finally getting to his entry into the restaurant industry. Chang readily admits that he was not good at being a chef, which makes his decision to found Momofuku in 2004 and his chance partnership with Quino Baca—the first and only employee at the Noodle Bar when it opened—even more of a radical gamble.

Chang writes about Momofuku like it is a revolutionary movement. There was a vision behind the original Noodle Bar, yes, but there was also a willingness to overhaul the entire menu when things weren’t working. The employees worked in cadres that participated in a company-wide email list with one objective: how to make their product more delicious. As the company grew and expanded, they formed new cells that oversaw Momofuku Ssām Bar and the Milk Bar.

Woven through this narrative is a reflection on mental illness and depression (Chang is bipolar) that manifested in self-destructive tendencies such as drug use and overwork.

These themes come more thoroughly to the fore in the second half of Eat a Peach where Chang tells stories from a time after Momofuku and his public persona had become fixtures of the food world. Food and the restaurants still feature, but in more complicated ways.

For instance, in part one, Chang wraps the reader up in the energy and chaos of starting a restaurant—fights with critics and inspectors, problems of staffing, and the thrill of designing the most delicious menu—in a way that captures the difficulties but also sees the enterprise through rose-colored glasses.

By contrast, Chang takes an introspective turn in part two. His ideals remain the same, but now he interrogates where his instinctive “fuck-you” attitude came from, who it is directed toward, and its relationship to his mental health. He talks about his experience with an executive coach who helped him see both how special the thing he created was and how his behavior caused those around him, including customers and staff, to live in fear of his anger. Far from leading a food revolution to bring high-end food to the masses, Chang realized that he was leading a cult. Followers were expected to give up their personal lives and commit their entire beings to the restaurant.

Ultimately, Eat a Peach is a reflection on growth—of the Momofuku empire, yes, but also personal growth in a way that I found particularly satisfying. There were times that Chang’s story resonated a bit too much (my anxiety manifests in a tendency toward overwork as well), but what elevates this memoir for me was how Chang works to de-center himself. He talks lovingly about his wife Grace, his son, and how they learned of her pregnancy the day after his close friend Tony Bourdain died. He lavishly distributes praise for Momofuku’s success. He talks endlessly about his long-standing relationship with his therapist. But more than all of that, I appreciated how Chang talks openly about his mistakes and blindspots, whether in cavalierly dismissing the chefs of California or contributing to a kitchen culture that was hostile to women, and that he acknowledges that talk only goes so far. Proof comes in the form of actions, and it is no coincidence that the cover art is meant to evoke Camus’ Myth of Sisyphus.

ΔΔΔ

I’m still making my way through a backlog of books I want to write about, including N.K. Jemisin’s The City We Became, Kathleen Fitzpatrick’s Generous Thinking, and Yoon Ha Lee’s Machineries of Empire trilogy. I am now reading Maaza Mengiste’s The Shadow King.

Dreadnought

One of the most revolutionary ships in the history of seafaring launched on February 10, 1906.

Just over a century earlier, Horatio Nelson had seized control of the seas for the British Empire by defeating the combined fleets of Spain and France. He did this from the deck of the HMS Victory, a first-rate ship of the line carrying 104 cannons that had been launched a full four decades earlier. In effect, ships of the line were floating artillery batteries that lined up next to each other and pounded each other into submission. Displacing 3,500 tons and launching a full broadside of over half a ton of metal, the Victory was not the largest battleship at Trafalgar (the Spanish flagship Santísima Trinidad was larger by nearly a third), but it was representative of its age. Effective distances were quite close, and Nelson and his fellow British commanders attempted to magnify their firepower through superior seamanship by sailing their ships into close contact before opening fire, even at great cost to themselves—the Victory was practically disabled at Trafalgar, and Nelson fatally wounded.

Naval technology developed through the nineteenth century, with the French navy introducing a steam-powered battleship, Le Napoléon (5,100 tons), in 1850 and ironclad battleships starting with Gloire (5,600 tons) in 1859. Sail slowly fell out of use, and smoothbore cannons gave way to more powerful rifled guns and explosive shells. By the 1890s most major navies used fully steam-powered battleships of roughly 15,000 tons with mixed-caliber weaponry, including a main armament of four 10- or 12-inch guns, designed to combat threats of various sizes and speeds.

Then, in 1906, the Royal Navy launched the HMS Dreadnought, which, in a stroke, made earlier battleships obsolete. Fifteen years later, the Dreadnought, itself now obsolete, was sold for scrap as part of the downsizing of navies after World War One.

The Dreadnought was revolutionary in several respects. First, it was enormous, displacing up to 21,000 tons, with the extra weight coming in large part from its armor. Second, it was fast, powered by new steam turbines rather than the older reciprocating engines. But most notable was that the Dreadnought carried only a single caliber of main battery: ten 12-inch guns, of which up to eight could be fired at once. Each shell weighed 850 pounds, giving the Dreadnought a broadside of 6,800 pounds of high-explosive shells capable of hitting a target at a range of more than 15 kilometers. Streamlining the caliber of the armament and centralizing the firing systems also served to increase accuracy because the main batteries all fired at the same elevation and range. In short, this was a superior warship, worth two or even three battleships of the type launched even a year before.

Within ten years, the Dreadnought itself had been superseded by battleships built in its image, setting up a clash between the German and British fleets of Dreadnought battleships at Jutland in which the HMS Dreadnought did not participate. However, although the launch of the Dreadnought was a crucial development in the history of naval warfare, it was merely one turning point in a larger story of the naval arms race that led up to World War One.

Puck Magazine 1909, “No Limit” arms race, Wikimedia Commons

Robert K. Massie’s Dreadnought sets out to tell this story, but winds up telling a different, albeit connected, one. While the development of the Dreadnought appears in a pivotal chapter at the center of the book, Massie is much more interested in the personalities involved in the naval arms race between Germany and the UK. The result is a book of high politics and biography.

I was mostly familiar with Massie by way of his massive biography of Peter the Great, which I read in high school, and individual scenes showed much of the same flair. Most chapters follow one or more characters, using a mini-biography to chart a particular development, and Massie works to bring those characters to life with little details like their smoking habits and gustatory tendencies (it is little wonder so many of them suffered from gout). The images of Otto von Bismarck and King Edward VII smoking like chimneys and of Bismarck staring down a table full of people over a plate of pâté are not likely to leave me any time soon, but the need to paint a new portrait for nearly every chapter also means covering a lot of the same ground with each repeated character.

The issue, to my mind, was that the high-political approach too often put the focus on the arms race between Germany and England as it played out in the halls of Parliament and the German Reichstag and in the personal letters between the two royal families. This is not to say it is wholly uninteresting. I was only loosely familiar with the origins of the Boer War, for instance, or with just how much of an international incident it became because the German establishment saw it as a war of British aggression, which was a reasonable, if not wholly accurate, interpretation. Similarly, given the extravagant costs of building and maintaining these fleets, showing how seriously the British government took its mandate of maintaining an overwhelming advantage went a long way toward explaining the international arms race, and I was fascinated to learn that on the day of Franz Ferdinand’s assassination, British battleships were in Kiel on their way to tour Baltic ports.

However, the personality-driven approach worked particularly well when exploring the principal characters in the Royal Navy. The middle portion of Dreadnought, leading up to the ship itself, introduces the reader to the likes of Admiral John (Jacky) Fisher, whose oversight led to the construction of the Dreadnought and sweeping naval reforms, and his arch-rival Admiral Charles Beresford.

In sum, I found Dreadnought to be a highly frustrating book. In part, I went into it hoping that there would be more, well, boats. Beyond their relative absence, however, there lies a more substantive critique: Dreadnought is frustratingly uneven. Massie’s richly detailed, biographically centered narrative largely focuses on the building of a bipolar world between Germany and the UK, with other countries generally appearing in the story only insofar as they connect to one of his protagonists. That France, Italy, Austria-Hungary, and other naval powers were building up their own fleets gets mentioned, but it is of secondary concern to the “coming armageddon,” while the fact that British companies were constructing dreadnoughts for the Ottoman Empire is omitted entirely.

Now, one of the hallmarks of a poor review is to critique an author for not writing the book the reviewer wanted them to write. I would have preferred a more traditional naval history, either of the dreadnought as a style of ship that enjoyed only about fifteen years of ruling the seas or a social history of the British navy. Massie is telling a different story, however, one that is a more sophisticated spin on the idea of a family rivalry that spurred on a global war. But even as a more sophisticated spin, I found the narrow focus on these two powers limiting and incomplete. For instance, on the British side the story is told through the personalities of the Royal Navy, while on the German side the prominence of the army means that the German navy is discussed primarily through the lens of politics, leading to an imbalance even just between these two powers. To be sure, there is a lot of information packed into this lengthy tome, but I couldn’t help but feel that Massie’s style was better suited to the biography of one or more people than it was to the story of this particular arms race.

ΔΔΔ

I remain better at writing than reading of late, but am still holding out hope that I will write about some of the recent mysteries I have read, as well as Kevin Gannon’s pedagogy manifesto Radical Hope. I also recently finished Maja Novak’s bizarre satire about Slovenia’s transition to a capitalist economy, Feline Plague, and have nearly completed Cixin Liu’s Death’s End, the concluding volume of the trilogy that began with The Three-Body Problem. Liu’s trilogy has gotten better as it went along, building out a future history of humanity in the mode of Isaac Asimov’s Foundation series or Olaf Stapledon’s Last and First Men.

Range: Why Generalists Triumph in a Specialized World

David Epstein, Range

My academic research focuses on ancient Greece, but I genuinely enjoy teaching beyond my specialty because my interests are broad and eclectic. I sometimes joke to my partner (whom I met in graduate school) that the three areas I considered pursuing for graduate work in history were Ancient Greece, 18th-century naval warfare, and 20th-century US diplomatic history. Recently I’ve wandered down rabbit holes into food history and have particularly been enjoying East and South Asian history. The idea of studying just one thing for the rest of my life sounds unbearably tedious, and teaching a wide range of classes (or at least varying how I teach World History) is a convenient excuse to read more widely.

I don’t know that my eclectic reading habits or historical interests have particularly improved my scholarship, but they have certainly improved my teaching and writing, and they caused the basic tenets of David Epstein’s Range to resonate with me.

Epstein opens with the comparison of Tiger and Roger, two accomplished athletes, one of whom was laser-focused from infancy on his sport, the other of whom played everything except his sport for most of his childhood. Both excelled, but Epstein asks which success was more probable. Despite the intuitive expectation that the person who specialized his entire life (let’s call him Tiger) followed the “better” path, Epstein argues, Roger is a better model to follow. Where Tigers are very good at solving problems within a narrow field with predictable parameters, Rogers can catch up quickly and are frequently more creative when adjusting to new environments or when facing fields without clearly defined rules.

In short, Epstein makes the case that in a world where an increasing number of well-defined tasks are automated and economic and social pressures push people toward specialization, we should actually be encouraging generalization.

I picked up Range after listening to an interview with Epstein in which he mostly talked about the value of cross-training, but while there are lessons there, I was surprised by how little discussion of sports there was in the book. Rather, Range is a broad manifesto that covers everything from scientists and musicians to charity CEOs and game designers. As with many books of its ilk, Range uses concrete examples to offer advice on leadership—promoting diversity, emphasizing communication over hierarchy, empowering employees—as well as the useful life advice that taking the time to find your fit rather than locking in early produces better results all around.

In my opinion, though, both the strongest and weakest aspects of the book came down to what it said about education. Granted, as someone in the education field, I tend to see everything that way. In addition to several explicit sections on teaching itself, Epstein swipes obliquely at the supposed outcomes of the education system throughout the book, taking aim at the suggestion that graduates need to specialize early and highlighting the perils of teaching to the test. I agreed in principle with everything Epstein highlights: test performance does not equal learning, efficiency is not a universal good, and there is value in struggling to learn something. There are absolutely valuable lessons here in terms of how we teach, but I nevertheless came away extremely frustrated with the presentation of education.

For instance, Epstein uses a personal anecdote from his MA thesis at Columbia, where he says “I had committed statistical malpractice” because “I had a big database and hit a computer button to run a common statistical analysis, never having been taught to think deeply (or at all) about how that statistical analysis even worked.” He follows up by quoting a statistician who says that the rush to produce research prohibits metacognition. In short, specialization and speed interfere with the quality of the work, even as metacognition gains increased traction in education circles. Similarly, he offers another anecdote about a primary school teacher asking students leading questions when they struggled to come up with the answers. Both of these anecdotes, and another about a professor critical of colleagues who only care about the interesting facts learned from years of increasingly narrow study (albeit while talking about Plato, Aristotle, Hobbes, Marx, and Nietzsche, which shows a certain…range), offer insight into the education system, but, to my mind, not quite the insight Epstein is going for.

The focus of Epstein’s critique is on the practitioners rather than on the bad practices encouraged by the culture of credentialism and testing. When a system requires teachers to prepare students for a standardized test, or professors to publish in academic journals and funnel students into career tracks from early on in college, then the system creates the exact problem that Epstein rightly identifies. Moreover, Epstein makes the case that generalization is good for everyone, but it has the greatest utility for young people because it fosters creativity and critical thinking and allows them to find fields that fit their skills.

For as much as aspects of the presentation bothered me, Range is a compelling read. Epstein isn’t against specialization, but makes an important critique of dominant cultural trends that prioritize efficiency and specialization over taking the time to think and reflect across different fields.

ΔΔΔ

I had hoped to finish Viet Thanh Nguyen’s The Sympathizer this weekend, but that was before protests against police violence and institutional racism erupted across the United States and then predictably escalated, often as the result of police action. I spent most of the weekend following local news from across the country.

The Food Explorer

In the second half of the 1800s, at a time when most Americans were farmers, the Department of Agriculture was a tiny outfit mostly charged with discovering ways to make crops more resilient. David Fairchild, the child of an academic in Kansas, joined this small outfit at the same time that the United States was launching itself as an industrial power, with exhibitions such as the 1893 World’s Fair in Chicago. On the advice of a friend, Fairchild applied for a Smithsonian position in Naples, resulting in two fateful encounters. First, on the voyage across the Atlantic, Fairchild met Barbour Lathrop, a wealthy and over-the-top globetrotter. Second, on a trip to Corsica, Fairchild stole cuttings from the citron tree.

These two encounters, according to Daniel Stone’s book, revolutionized the American diet. Fairchild believed that the future of American agriculture was the import of new commodities and Lathrop underwrote the creation of this new program when the US government would not because he decided that Fairchild was his preferred traveling companion. Despite its opponents, the food importation program grew both in the number of explorers scouring the globe and in the bureaucracy to manage the imports, and is responsible for a number of the most recognizable products on the produce shelves, including the navel orange and Meyer lemon.

There are a number of interesting stories at work in The Food Explorer, including the growth of the American bureaucratic state, the history of food and food safety, and a unique lens on the US and the world, to say nothing of Fairchild’s biography, but I found it an immensely frustrating book. Part of my frustration came from quirks of Stone’s writing. Some readers might be interested to learn that the walnut is technically a fruit, but I found the persistent reminders that things are technically fruits, whatever their names or common wisdom might suggest, about as tiresome as people reminding you that tomatoes are fruit. However, I also have a couple of more substantive complaints.

First, The Food Explorer is a book that can’t decide what it wants to be. The main arc is Fairchild’s biography, which means that by the second half of the book he is no longer an explorer but a bureaucrat overseeing the work of other explorers, including Frank Meyer, whom I found more compelling than Fairchild himself. But this section also becomes mired in accounts of his courtship of and marriage to Marian Bell, the daughter of the inventor Alexander Graham Bell, as well as Bell’s aeronautical competition with the Wright Brothers.

Such stories give a fuller picture of Fairchild’s life, but they sit awkwardly beside the book’s framing as a story about the massive changes underway in American society or about the fascinating institutions that Fairchild helped create. In fact, the most iconic plants Fairchild had a hand in bringing to the US were either inedible (Washington DC’s flowering cherry trees) or not his finds (the Meyer lemon). Similarly, I was struck by the vast number of imported plants that were almost immediately supplanted or simply discarded. Fairchild and his program did change the way Americans eat in significant ways, but behind the glitz and glam of Fairchild’s life is a more compelling story about the growth of the commercial agriculture industry and the role of the federal government in both facilitating and inhibiting the import of new crops.

Second, this is a particularly American book. Stone frames the story against the backdrop of American industrial power, and the story is built around the privilege of American interlopers cavalierly begging, stealing, or buying whatever they want to populate their new garden of Eden. I don’t want to cast aspersions on Stone, since he periodically offers light critiques of American ignorance, such as during a potential row between US and Japanese officials after the first batch of cherry trees had to be burned. Nevertheless, his sources are swept up in the potential of the US and the backwardness of most of the rest of the world, and he is generally happy to echo their sentiments, making a few truly egregious gaffes along the way, such as identifying Egypt as both “Mesopotamia” and “the birthplace of civilization.”

As noted above, there is a compelling story here, and I can understand why so many people, and at least one podcast I listened to, raved about the book. The decision to follow Fairchild’s charmed life keeps it from getting too heavy with either discussions of institutions and business or war and death, but I closed it more frustrated than enlightened.

ΔΔΔ

A short discussion of Vassilis Vassilikos’ Z, since I am likely not going to do a full summary: the first half of the book consists of the non-stop action of a fateful night when a socialist politician, leaving a gathering in Thessaloniki, is assassinated by ruffians hired by the police, who simply stand by and watch. Much stronger, in my opinion, was the second half, which explored the inquests that followed and is highly critical of political officials who seek to sweep their complicity under the rug. My failure to write this up earlier has dimmed the individual characters in my memory, but I was repeatedly struck by the resonance with contemporary political agendas.

I have also finished Bilge Karasu’s The Garden of Departed Cats and am now reading Roberto Bolaño’s The Savage Detectives, a strange and sensual novel about a group of young poets who call themselves “the visceral realists.”

The Missing Course

David Gooblar’s The Missing Course offers a simple but radical thesis: improving college teaching requires shifting the mindset about what product the professor offers. It is easy to think that your product is your expertise in your content area, honed through years of study. Institutional structures in PhD programs and promotion standards reinforce this belief from one end, while, on the other, there is a temptation to think that the transaction students are paying for is to have knowledge transmitted to them by a world-renowned expert (you).

However, speaking as someone who took classes from some exceptional lecturers and loves the feeling of one of his own lectures landing with an audience: even the most inspiring lecturer will not connect with every student. Gooblar’s proposal follows in the vein of recent scholarship on teaching and learning that encourages teachers to eschew lectures in favor of shades of active learning, but with a critical addition: that the product is not the content, but the student.

This proposal seems obvious, but it requires foundational changes in class design and assessment, and it simply bypasses the handwringing about why students aren’t capable of picking up the subtle themes and brilliant observations about life and everything. As Gooblar points out in chapter one, you can’t make someone learn, so the challenge is finding ways to encourage learning beyond the punitive threat of a poor grade. The lecture works well, if still imperfectly, for students who are already interested in learning, but it works best as a gateway drug: a taste that prompts students to go out and get more. That is, the lecture works well for students who approach it as part of an active learning process. But too many others approach the lecture as something to passively receive, learn by rote, and regurgitate as best they are able on the test.

Each of The Missing Course’s eight chapters approaches a different aspect of this teaching, from basic course design to assignments to classroom activities, with practical, actionable suggestions to try. There are too many points to summarize here, but I was happy to find practices I use in my own classes, like low-stakes weekly quizzes and extensive opportunities to revise assignments, among his suggestions, and I still found myself jotting down new ideas.

As a history professor who believes in the importance of teaching writing across the curriculum, I was particularly excited to have suggestions from a professor of writing and rhetoric about how to encourage best practices in citations. For instance, he provides the most lucid explanation I have seen of why students struggle to cite secondary scholarship and will often only do so when citing direct quotes:

“To write a summary, the student must read the whole text (perhaps multiple times), think deeply about the most important aspects, and synthesize observations into a concise rendering of the text’s substance”

In other words: using secondary scholarship is hard and intimidating (what if you get it wrong!), whereas citing a quote is easy. I couldn’t tell you where I learned how to cite scholarship. I don’t remember being taught how at any point; it was just something I picked up by osmosis, so I very much appreciated seeing Gooblar’s suggestions for activities that can help teach these skills.

All of Gooblar’s suggestions come back to the student as the course material. Toward that end, he emphasizes the importance of respecting students as individuals even, or perhaps especially, when they are failing the course, and of facilitating the classroom as a community where not only are students and their ideas respected, but students may also help each other grow.

At this point you might be thinking, what about the content? If the students are taking a course on World History, shouldn’t they, you know, learn about World History? Of course the answer is yes, but, speaking from experience, thinking in terms of coverage is a trap. I tell my students in these classes that every class period (and usually every slide) could be a semester-long course of its own, meaning that we only ever scratch the surface. Which is going to be more beneficial to the student in the long run: making sure that we spend ten minutes in a lecture talking about the Tibetan Empire of the second half of the first millennium CE, which is admittedly fascinating, or redirecting that time to primary source analysis, discussion, debate, practice summarizing and engaging with sources, or any of a myriad of other active learning techniques? Some of these are harder in introductory courses, where it seems like the students don’t have enough background to engage at the level you want and lectures are sometimes a necessary component of the class, but incorporating active learning into the course offers significant rewards.

Toward the end of the book, Gooblar turns his attention to how to teach in the modern, tumultuous world. I jotted a brief response thread on Twitter, but wanted to spotlight it again here. College professors are often accused of trying to indoctrinate their students into radical Marxism or the like. While American college professors do tend to be more liberal than conservative, the largest number actually self-classify as moderate. Further, the recent primary results have demonstrated that the Democratic party remains a big-tent coalition, while the Republican party, which has accelerated attacks on funding for higher education in recent years, has veered further right. The political doesn’t end at the classroom door and to pretend otherwise is naive.

As a history teacher I run into these problems with regularity and, to be honest, believe that I can and should handle them better. Ancient Greek democracy was made possible by both exclusion (narrow participation that did not include women) and exploitation (Athens had many times as many enslaved people as citizens). The spread of religions was at different points a blood-soaked process, Christianity included, and European colonization amounted to exploitation and indoctrination at best and either incidental or intentional genocide and ethnic cleansing at worst. And for all that, I find history endlessly fascinating.

Gooblar suggests a similar approach to the one I’ve adopted, which is to “take seriously the equality of our students and the inequality of the world,” while placing an emphasis on process. There are some premises that I will not tolerate in my classroom, including endorsement of slavery, racism, sexism, and other forms of bigotry, but I also believe that there is room for students to argue for the virtue of, for instance, Athenian democracy and capitalism so long as their arguments are based on good use of available sources and I build time into the class period to have students practice these skills.

One of the virtues of a college classroom should be giving students space to debate issues in a responsible and respectful manner: disagreements are okay, bullying is not.

The limiting factor in college teaching is not knowledge, but attention. Becoming a good teacher requires practice and cultivation, just like developing any other skill. Fortunately for anyone interested in improving their skill, we are currently living in a golden age of publications on teaching and learning. I haven’t finished everything on the list of resources I solicited a few years back, but The Missing Course is already my go-to recommendation for a place to start.

Winners Take All: The Elite Charade of Changing the World

“Walker had broken what in his circles were important taboos: Inspire the rich to do more good, but never, ever tell them to do less harm; inspire them to give back, but never, ever tell them to take less; inspire them to join the solution, but never, ever accuse them of being part of the problem.”

Just under one year ago, international news was covering a crisis in Thailand involving a boys’ soccer team and their coach trapped in a cave by rising waters. For eighteen days the boys remained in the cave before rescue divers managed to get them out. One diver died in the operation. At the height of the coverage, Elon Musk stepped in, proposing that a SpaceX mini-sub could aid the efforts, to much praise and no small amount of mockery from the workers on the ground. Musk responded by calling one of the rescue divers a “pedo.”

On the one hand, this story of a remarkable rescue ended successfully and Musk’s sideshow did not figure into the result, but, on the other, it offers a microcosm of the phenomenon examined in Anand Giridharadas’ book Winners Take All. By most accounts, Musk wanted to do a good thing by saving the boys, but he wanted to do it from within his own niche and in a way that brought potential benefit for him in the form of publicity, influence, and potential profit down the line. When challenged to address a fundamental structural issue like the water crisis in Flint, Musk, predictably, fell silent.

Giridharadas argues that Musk and his fellow citizens of MarketWorld, that is, the global business and financial elite, want to effect positive change, but have reshaped the mechanisms for doing so to their own benefit. The result in this time of growing inequality is a pay-to-play circuit of philanthropy where undemocratic decisions are made by the wealthiest strata of society promoting a win-win, venture-capital ethos of making a profit while giving people what they “need,” usually in the form of entrepreneurship. In return for their generosity, these philanthropists sincerely believe that they deserve an outsized voice in public policy debates.

But this win-win mentality perpetuates and in fact exacerbates the problems that the new philanthropic agendas address, whether that is the lack of government funding (tax avoidance), poverty (underpaid workers), or climate change (unregulated industry). Hence the taboos of MarketWorld, standards of behavior for Thought Leaders that short-circuit any possibility of systemic change.

Winners Take All, as Giridharadas notes in his sources, is a work of reportage that profiles members of this global elite, including prominent speakers on the circuit that includes TED Talks, leaders of philanthropic organizations, disillusioned financial insiders, and one former US president whose post-White House career has pivoted to canoodling with business elites.

Giridharadas does not question the overarching dedication to social justice in its broadest, most generic sense on the part of anyone he profiles.

(Conspicuously, there are people not named in the book, like the DeVos family when it comes to education, who throw their money around in much the same way but to whom he would not ascribe such virtuous intentions. For this, and for a better understanding of how charitable giving facilitates generational wealth transfer under the US tax code, I wish I had read Jane Mayer’s Dark Money before this one. Giridharadas does note, however, that MarketWorld generally plays into Republican goals of limiting government.)

The problem as identified here is the system that exists in a positive-reinforcement echo-chamber. This system facilitates growing income inequality while acting like it doesn’t exist. This system pushes motivational talks by “Thought Leaders” and industries that insist every person should be their own business while ignoring both barriers and consequences of failure. This system updates Andrew Carnegie’s Wealth for a new century while pretending that the problems of the Gilded Age are gone…at the same time that an all-consuming focus on profit replicates many of the same crises.

The result, Giridharadas argues, is global resentment of the financial elite by millions of people left distrustful of a government that doesn’t appear to do anything, but clearly left out of the vision of a techno-utopia created by the citizens of MarketWorld. I found his final conclusion, that this system is the driving factor behind the rising tide of authoritarian nationalism, somewhat overstated. It offers a neat explanation for the somewhat overblown narrative of the white working class propelling Donald Trump to the presidency, but it glosses over racism, dark money (see Jane Mayer’s book), and the various other avenues of attack on democracy.

But neither is he wrong. The developments covered in Winners Take All clearly contribute to the breakdown of social systems designed to protect civil society, though I was ultimately unconvinced that the do-gooders covered here constitute the majority of the global MarketWorld elite. The stronger insight here is that despite the wealth of those who do want to fix the world, MarketWorld thinking prevents them from addressing the underlying problems. This realization is more worrisome than identifying malicious actors, because if the systems designed to help the poorest citizens and organize a response to climate change are under attack even from the people who ostensibly want to help, what chance do they have?

ΔΔΔ

I spent most of the weekend reading. I finished James Baldwin’s If Beale Street Could Talk, blown away by the prose, and Archer Mayor’s Three Can Keep A Secret, part of a Vermont-based mystery series that is one of my comfort reads. I have now started Jane Mayer’s Dark Money.