Pedagogy decluttered: On becoming a more minimalist teacher

I’ve decluttered the hell out of my house. Those ratty socks, unloved shirts, and broken lawn chairs are gone, gone, and gone. Though my simplifying journey is still underway, the benefits of doing more with less, of streamlining both the stuff and processes of my life, couldn’t be clearer. Naturally, then, I’ve turned my minimalist eye to teaching, creating, I hope, more air and space for what is most essential in my work with students.

The actual practices I describe here aren’t new or innovative, but I hadn’t previously framed them in minimalist terms. Considering them this way — as a sort of pedagogical application of the KonMari method — helps me to make sense of, and better integrate, my teaching values with those shaping the rest of my life. As a North American woman in her fifth decade, I am perhaps typical of my demographic in my desire to free up space — both literally and figuratively — rather than to fill it, to seek experiences rather than stuff, and to do more with less. It is an impulse that, for me, at any rate, is ethical and spiritual, as well as aesthetic and practical, and so it’s no surprise that it has leaked into my thinking about teaching.


I notice, for example, that I’m increasingly eager to impose an order and structure on coursework and course design up front that severely reduces the need for daily decision-making along the way. Like wearing a sort of uniform each day — which I also do — having the class details laid out in advance saves me from having to fuss, dither, and scramble on a daily basis. My online class, in particular, is set up to run like clockwork, so that, barring catastrophes, I know precisely what I and my students should be doing each day. And our work occurs in a repetitive cycle that creates a breath-like rhythm that (I hope) allows us to focus on substance rather than the minutiae of instructions for clever, new assignments or changes in the order of readings.

I also see that I spend less and less time churning out expansive written feedback on individual student work. Rather than scribbling out detailed paragraphs on exams or essays, my process is increasingly spare and stylized. So, for example, I rely more on thoughtful rubrics or grading worksheets that include specific criteria, forcing me to be clearer about expectations up front. And, of course, though it requires work in advance, it saves me time and grief during the busy flow of the semester. Though I’ve used rubrics for a while, aware of both their limitations and perks, I now see them as analogous to a capsule wardrobe. This practice of creating a painstakingly curated small collection of clothes, rather than limiting our choices, can, it seems, help free us up to focus on higher priority matters.

My final observation arises as I continue to minimize paper usage in terms of the number of handouts I supply, work to be submitted, and physical texts I assign. This is partly an influence of my online teaching, in which physical paper plays almost no role, and also resonates with my efforts at home to eliminate messy paper subscriptions, bills, receipts, etc. Some of my satisfaction results from the supposed environmental and money-saving aspects, of course, but minimizing paper also fits better with an aesthetic in which unnecessary props and accessories are cleared away. And, of course, the practical benefit of being able to access class texts or student assignments without schlepping a heavy backpack is magical.


I know that such streamlining practices come with a cost. Adhering to a cyclical schedule of assignments entails a loss of spontaneity, and relying on stylized feedback structures like rubrics can feel impersonal. Having a structured, planned living situation, too, has its disadvantages. I find myself eating a lot of boiled eggs and wearing just a couple of black shirts because of my commitment to routine. Though it’s not for everyone, for me, adding such bits of structure creates flexibility in other areas. As the famously routinized Kant argued, imposing form and discipline can, paradoxically, increase the quality of one’s freedom. In the time I’m not fiddling with my clothes, I can walk, rather than drive, to work. Because I wasn’t up half the night scrawling comments on term papers, I am well rested when I connect with students, rather than resentful and grumpy.

Still, I am not proselytizing — I don’t think minimalism offers the best framework for teaching (or living) — nor do I think there’s one right way to be minimalist. Some of my fondest memories as a college student include explosively spontaneous professors who seemed barely affected by clocks, calendars, and no smoking signs. What I can report with confidence, though, is that minimalism is doing for my teaching what it does for my life. As my hiding places are cleared away, I am encouraged to be more honest with myself about how I spend my time and energy. With fewer opportunities for the seductive, distracting busywork that claims our hours, our days, and our lives, I occasionally get a glimpse of something that might really and truly matter.

Let’s take those anti-college Republicans at their word

Maverick educator though he was, Plato’s Socrates fretted about a newfangled technology known as writing. Relying on quill and papyrus, he worried, could wreck men’s memories and send his beloved Athens into a spiral of dull-witted decline. His concerns seem quaint, even silly, until we consider the recent Pew Research Center report suggesting that most Republicans now think that college is bad for society. Certainly, it captures something about the red-blue divide since, at the same time, 70ish percent of “liberals” still think higher ed is pretty nifty.

On the one hand, there’s nothing to see here. Conservatives, especially religious fundamentalists, have long made a hobby of vilifying education, aware enough of its radicalizing potential to pursue radical means to control it. After all, Socrates died for his supposed heresies, to say nothing of poor Giordano Bruno, the long house arrest of Galileo, and the beatings inflicted on enslaved Africans learning to read. There are, unfortunately, endless examples of outraged conservatives silencing intellectuals and creatives in the name of God and country. The current anti-intellectualism in the U.S., too, is grounded in a values divide with unbearably high stakes, including attitudes and policies about climate change, the rights of people of color, women, and immigrants, and what it means to be a free citizen.

If this weighs especially heavy on my mind, it is partly because I am a professor from a red town in a red state in a flamingly red region. I am, ostensibly, a veritable case study of the kid who went off to college and emerged unrepentantly and permanently dangerous to society. For anyone who thought I should have married a local boy, become a P.E. teacher (my mother’s early vision for me), and raised a few blond kids, college did, in fact, wreck me. From the moment I arrived on campus — supported and encouraged by my father and stepmother — worlds opened, intellectually, creatively, and socially. Although I avoided the freshman weight gain, college helped me expand in every other respect. New paths led to new roads of experience and perspective that made me and my hometown ever stranger to one another. There was never much chance I would return to it or that it would welcome me if I did.


The narrative of the Republican far right — with much help from the kajillionaire Koch brothers and their ilk — is that colleges are left-wing cults, inculcating young people into extreme political liberalism and libertine lifestyles. And I guess the supposed divide between the values of small town America and the dangerous “college type” is perfectly realized in me, a lesbian in a Subaru who eats organic, reads a ton — I am a philosopher — and hasn’t set foot in a church since MC Hammer rocked those iconic pants. As a professor who teaches such “politically charged” courses as LGBT Studies and Queer Theory, I am the poster child of what conservatives object to about higher ed today, a threat to their very way of life.

Except that, as an independent-minded critic of unearned social and economic privilege, my hard-working father helped radicalize me long before I went off to college. And my uneducated mother’s eclectic and open-minded approach to friends, food, and books set me up to embrace the ideological and aesthetic challenges I encountered on campus. Anyone who blames college for ruining me has no idea how annoyingly philosophical and incipiently political I already was before college had its way with me. It’s probably just as fair to say that college made me a more mature version of myself than that it fundamentally changed me. I suspect this is true for most college students though, of course, I can’t say. But it does seem that those extreme, anti-college Republicans both underestimate and overestimate the influence that the experience has on actual young people.

Anyone truly surprised by this “new” anti-college stance underestimates the power and tenacity of America’s grand tradition of anti-intellectualism, its ties to religious fundamentalism, and the impact of economic disparity and the public disinvestment in higher ed (which is, of course, partly a product of anti-intellectualism). When one adds in the concerted anti-college media campaigns of college-educated fat cats, it is a miracle that all of red America is not disgusted by professors like me. And, no surprise, it turns out that it’s mostly the non-college-educated Republicans who are so vehemently against it, like those home-bound Americans who insist with great authority that Europe is overrated. The way to get more popular support for college, as for most worthwhile experiences, is almost certainly to make it more available, which is, perhaps, partly why so many Republican fundamentalists fight to make it inaccessible.


At any rate, this current flare of anti-intellectualist religious fundamentalism does us professors and society great harm. It can result in our being harassed, fired, and much, much worse. But what it cannot do is compel us to reason with it, or, in hand-wringing fashion, to psychologize it in some pseudo-compassionate attempt to understand those benighted red-staters. We need not debase ourselves or our critics by second-guessing or applying deeper motives to such proud ignorance. There is nothing shameful about ignorance, of course, but I can say with perfect ease that the proudly ignorant should damn well be ashamed of themselves. Though I can strive to understand the climate deniers, conspiracy theorists, and the new crop of flat-earthers as a sort of sociologist or anthropologist might, it is not as one citizen respectfully engaging with another in healthy, authentic dialogue.

The fundamentalists burned Giordano Bruno at the stake, but they could not compel him to make apologies for their murderous behavior. If the Republican fundamentalists wish to scapegoat higher ed, then let us college types respect them enough to take them at their word. We do them no favors by talking about them or to them as if they were children or fools to be placated. Such pious “understanding,” of course, is the very bleeding-heart liberal strategy that they despise. Instead of trying to argue with them about how awesome we are, we should continue to do our jobs well and focus on higher ed accessibility. Those who go to college may not fall in love with the ivory towers and ivy-covered walls, but very few will leave concluding it is professors, or knowledge itself, that is responsible for the rising tide of greed, nastiness, and national insecurity.

Goodbye, 2017! The endings that define a professor’s life

The cyclical rhythm of the academic year is one of the greatest perks of being a professor. Many of us still know the buzz of possibility when summer ends and the campus parking lots begin filling up again. The winter break, too, brief as it is, can be as restorative as a deep, purifying breath. The poignancy of the fall semester’s ending is enhanced for me, no doubt, by the darkening days leading to and from the solstice and into uncharted seas. I suspect that even the most cynical among us can hear the whispered promises each new year brings.

As a unit of time, the semester has great power even though its parameters are arbitrary. Whatever rationale the academic calendar may once have had means nothing to most of us now. And, of course, despite our fealty to semesters, we routinely violate their boundaries by working with students on long-term projects, as well as our own ongoing service commitments and scholarly work. Summer teaching too, especially when done online, disrupts the familiar stop-and-start school year rhythm. Most of us are not, it is clear, rigidly wedded to traditional academic rhythms and could not flourish in our jobs if we were.

And it’s also not true that we are on break in any normal sense. I gently bit the head off the last person who asked me if I had big plans for my winter “vacation.” The truth is that, though last semester’s grades have been submitted, I’m busily retooling my courses for next term, revising a paper to submit early in the new year, and fielding questions and requests from previous and upcoming students. There are all sorts of deadlines and due dates that have no respect for the fact that I am on “break.” But even so, there is this peaceful sense of closure, of a chapter ending in my professional life, that most workers never enjoy.


I suspect that this stop-and-start rhythm may also help account for the effective teacher-student relationships many of us form. I can give my students fifteen weeks of more or less steady energy, and expect them to pony up some in return, partly because our time together is well-bounded and brief. For better and worse, it is often easier to lend one’s best self to a new acquaintance with whom one is merely taking a short journey than to, say, a life partner. The pressure that the semester’s looming, inevitable ending places on the participants, an ending that is always visible on the horizon, can nourish the productive intensity of the classroom experience.

A school term can perform this framing function, I think, whether it’s a “good” semester, full of energetic and capable students, or a “bad” one in which things never fully click. Surely there’s some broader lesson in impermanence and detachment here? How, I wonder, can we move into each new term, with hands more fully open, embracing the knowledge that, whatever else these upcoming months bring, it will all end? Whether we and our students love and respect one another or are mutually repelled, soon it will be over. Surely, in light of this benevolent, merciless constraint of time, I can offer my full presence and attention, right? And, besides, isn’t the school term just a microcosm of the structure of time that bounds and gives shape to our very lives?


It is no wonder, then, that, for fortunate professors like me, these thrumming beginnings and solemn denouements form the basic architecture of our professional identities. It’s a rhythm that mirrors the habitual relief of falling into bed each night and the hopeful possibility that can nudge one out of bed each morning. This drip-drip-drip of time is almost comforting in its honesty, and the response it invites from me is just as straightforward: I am to accept the weight of yet another snowy Midwestern winter as I clear my desktop and my driveway in these waning days of 2017. As I move forward, my left foot finds the oblique, diffuse light of the future while, for a long moment, my right one remains in the shadowy past.

I wish I were fool enough to believe that this unwritten, upcoming calendar year might magically wash away the grinding horror of our national circumstances, not to mention the stains of a remarkably difficult year for me personally. But though 2017 brought me a heaping portion of death, illness, and disruption, I do not want to rush through its ending. Instead, I choose to bask in this caesura, this animated liminality, like a hibernating frog on a muddy pond bottom. And I dare anyone to repeat the tired accusation that professors are eternal schoolchildren, never having matured enough to enter the “real world.” What could be more real, more elemental and momentous, than letting go and starting over again and again and again?

Fake News, Willful Ignorance and Critical Thinking

Against the much-maligned backdrop of contemporary higher education, a bright light has been trained on the problem of fake news. Critical thinking is hot right now, suddenly spoken of as an urgent necessity rather than an abstract or faraway good. Those of us who’ve long been worried about folks’ capacity for basic reasoning and factual discrimination are justified in feeling newly energized. It has been confirmed, we are told, that many Americans cannot tell the difference between fact and fiction. Democracy is in danger and we must commit to developing students’ critical thinking skills with renewed vigor.

A philosopher by training, I’m grumpier than many about the ubiquity of poor thinking skills. In fact, some years ago my grief over anti-intellectualism and the disregard of science, facts and common sense drove me to a crisis of faith that impacted my scholarship, teaching, and sense of place in the world. The persistence of racism, embarrassingly literal strains of religious fundamentalism, and climate change denial were among the bullies that pushed me to reappraise my naive confidence in reason, facts and intellectual self-scrutiny. Shortly after 9/11, during a period of sorrow and brittle fury, several tentative conclusions took shape in the jingoistic, anti-Muslim, anti-gay miasma that surrounded me.


For one thing, I noticed that many who failed to distinguish between this or that fact also couldn’t appreciate the more general differences between fact and fiction. Addressing the problem, then, wouldn’t merely require adding information so as to improve a discrete skill — like teaching someone to identify a Bobolink, say — but would require the awareness and development of at least a few basic points about the stubbornness of reality and of our accountability to it. I also had to accept that many who systematically blurred fact with fiction, and logic with wishful thinking, did so quite happily. They were, then, unmoved by arguments that assumed that they did or should care about being accurate or reasonable. Their blithe disregard makes sense given how often we are rewarded for the size of our enthusiasm rather than the defensibility of our positions. One’s football team wins and one’s political candidate comes to power in the midst of a righteous, passionate, confirming din.

When it comes to sports teams and rock stars it’s easy enough to appreciate that emotional forces will be more determinative than rational ones. But what if our beliefs about most things are determined to some extent by how holding those beliefs makes us feel? To paraphrase William James, what if we are inclined to believe that which makes us feel good? And what if this isn’t so much an individual failure — people going astray — but reflects something of our nature? If we are such deeply affective creatures, then emotions must be front and center as we address the problem of fake news and critical thinking.


If students’ orientation to reason and facts is nested so intimately into their psychological and social selves, then it’s not enough to think of critical thinking as a mere skill. In fact, I’ve come to think of it as like learning a new language. It is open ended, often tentative and halting, and progresses in fits and starts. Further, the greatest strides occur through immersion into a culture where the contextual relevance, including concrete rewards and penalties associated with mastering it, emerge. Of course, most native English-speaking U.S. students who study another language in school never really learn it, just as they may never really learn critical thinking, even from courses focused on precisely that.

The capacity for critical thinking is also like second language fluency in that many who claim to value it do not, not really, and may not even know that they don’t. There’s an unfortunate circularity here in that being able to identify the bad consequences of poor thinking relies on the very reasoning skill and ethos that is missing in the first place. The problem, then, isn’t just a lack of critical thinking skills, but a lack of sufficient critical thinking skills to even recognize the initial lack. Similarly, poor language speakers often overestimate their ability precisely because their poor skills blind them to their missteps. Adding more lessons in logic, or new lists of vocabulary words, important as this may be, is unlikely to effectively combat this vortex of ignorance, especially since its swirls are invisible to many who are drowning in it.

All this to say that, while learning discrete critical thinking skills is important — just as learning to conjugate verbs is helpful — genuine leaps of ability probably can’t occur without serious attention to underlying emotional and social motivations. Apparently, we are creatures who must sometimes be jolted into noticing and caring about the size and shape of our own ignorance. I am reminded that those yanked from Plato’s cave did not rejoice in the harsh daylight, but were initially pained. The fundamental danger of fake news, then, may not result primarily from a deficit of intellectual skill — though, of course, this matters — but a lack of will. With this hypothesis in mind, I try to focus as much on helping students want to think better as on improving thinking skills. As I explore in an upcoming post, I’m trying to work with them to more viscerally connect intellectual mastery with their personal and professional goals. Though, as we know, thinking patiently and well can bring its own satisfaction, its pleasures and rewards can be lost — to all of us — in the bright lights and deafening roar of a self-satisfied crowd.

On being a professor with regrets

Admitting that we wish we’d done things differently has come to be seen as a mark of spiritual immaturity. Perhaps as a reaction to the guilt-inducing traditional religions of childhood, many have adopted a policy of embracing whatever has occurred as a way of celebrating the present moment. While banishing regret may be fine as an absolute orientation towards the deepest meaning of life — on this view, what IS is good precisely because it is — on a mundane level, I think regret can be a useful ally.

Regret is especially relevant to me as a professor in this twixt time between the fall and spring terms. I look back on Fall with one eye as I look ahead to Spring with the other. The invitation to ruthlessly inspect my courses, to locate both the gems and dross, the tangled thickets and the open clearings, is too loud to be ignored. But still so close to the beauty and the wreckage of classes I’m just now completing, my vision is both sharpened and distorted. Learning to take a critical perspective on a past that is only just barely past demands that I move quickly away from defensive self-justification and make friends with regret.

Specifically, constructive regret requires that I be:

  • secure enough in my identity as a competent teacher that I can afford to have been mistaken about this or that; insecurity about my basic ability will lead me to defend and justify rather than honestly scrutinize;
  • invested not just in improving this or that particular skill or product, but in growing as a whole human being. Then, the motive towards general excellence can become habitual and irresistible; if I am satisfied with coasting dumbly along, either as a teacher, or as a moral, intellectual animal, then I won’t be motivated enough to make deep, lasting changes in any part of my life, including my teaching.

If I can make room for constructive regret in my teaching life — if I can see that that one assignment, the one I really loved, turned out to be a flop — then maybe I can also have a freer, more responsible relationship with the people and events that make up my whole life. If I can see failures — large and small — as messengers, and avoid identifying with them, then I can take better advantage of regret. Seeking and finding my own missteps and shortcomings — like consulting a map at a rest stop — can increasingly become a neutral habit rather than a shaming interlude that I avoid at all costs.

The pitfall of regret, then, is that it can so quickly become an implement for ruthless self-flagellation. One’s personal history and insecurities rise up so powerfully that the prospect of being vulnerable to self-examination becomes intolerable and so, instead, one moves fluidly into self-justification and rationalization. “I had to do it that way, because…” we tell ourselves, instead of authentically reflecting on the details of our motives or the consequences we set into motion. Rationalization becomes as automatic as a gag reflex, neutralizing the natural curiosity that would have us inspect and learn from our past.

There isn’t much that we do, whether in our classrooms or our larger lives, that absolutely had to be precisely the way it was. In most cases, we had viable alternative routes. Whether it’s about permitting a student to make up a quiz or speaking harshly to the person we love most, we usually could have done otherwise. And though we cannot, of course, know absolutely what the future would have been, our limited capacity to anticipate the consequences of our actions should, I think, sometimes lead us toward regret. How can we, I wonder, become more at home in the lively, tense knowledge that we could have, and perhaps should have, done it differently?


Extra credit as expiation and discount

In these last frenzied weeks of the semester, undergrads everywhere begin genuflecting madly at the feet of the extra credit god. In this end-of-term ritual, instructors usher these harried students into our office-cum-confessional and bear witness as they perform rites of self-flagellation and earnest piety. “If only I could do some extra credit,” they muse with exaggerated innocence, eyes tilted skyward like an El Greco Jesus. “Isn’t there anything I could do to improve my grade?”

My response to this desperate query is usually an unsatisfyingly simple: “Would you be willing to study for the final exam? Could you attend the remaining class periods so that you don’t continue to miss out on precious participation and quiz points?” My unglamorous reply disappoints, because this student is here to talk about extra credit and often has no real interest in mastering the material or doing better on the remaining class assignments. It is, rather, the indulgence and favor associated with the special opportunity of extra credit that has drawn her to my door.

If you’ve ever watched an enthusiastic, practiced buyer wrangle with a car salesman, you’ll know what I mean. Such a buyer gets a thrill from feeling that, by dint of his own personal initiative, charm, and force of character, he has earned a special deal. Those other chumps might take the dealership’s offer at face value, but not this especially empowered, skillful consumer. And whether the actual price he pays for his new vehicle is better than what most purchasers end up paying becomes almost beside the point. What matters most to him is the heady satisfaction of feeling that he has played the game and won. It is similar with some students in that their negotiation for extra credit is often only obliquely related to the goal of improving their grade. And to many of them, too, our course syllabus is to be taken about as seriously as a new car’s window sticker.


This scenario would be more manageable if it were not that this competitive consumerist impulse converges with a quasi-moral one. Humble, but firm, requests for extra credit are also meant to demonstrate one’s deep earnestness, as a person and as a student. But that this is primarily a mere performance of diligence and responsibility is often belied by the fact that the student may ignore the obvious, unsexy, built-in paths to success, say, to actually learn the material and excel on the remaining assignments. “Extra credit” functions as an incantation, one that, like a priest’s blessing, is meant to erase twelve weeks of distraction or sloth. It’s hardly a surprise, then, that I meet few students as impassioned about working with me to excel on final projects as they are about concocting opportunities for extra credit.

I am prepared, as I write this, to hear from colleagues with objections to my flippancy about, and suspicion of, extra credit. There are lots of good reasons to encourage it, they will assure me. Perhaps so. In a complicated higher education economy nourished and impelled by some combination of desperation and dispensation, extra credit may well be one of the best games in town. And, certainly, each instructor must negotiate her own way in this imperfect, contradictory pedagogical environment, one in which student-consumers are unevenly, and unpredictably, empowered.

For now, though, I think I will continue to play the extra credit gadfly. I’ll still include flexibility in my syllabus, in the form of dropped quiz scores and allowances for reasonable absences, but I will explicitly exclude extra credit. And I will almost certainly continue to look just a little puzzled when students plead in week twelve or thirteen, “Isn’t there anything I can do, anything at all, to earn more points?” I’ll reply, as I do now, “Could you, perhaps, come to class and study for the final exam?”

On the uses and abuses of gratitude

We know we’re supposed to be grateful. It’s a year-round pressure that culminates on Thanksgiving: to count our blessings, look on the positive side, and remember how very fortunate we are. It’s even become a sort of medical prescription, with mental health professionals claiming that gratitude is the key to happiness, long life, and success. I don’t doubt it, but I also recall Karl Marx’s warnings about apparently anodyne feel-good ideologies that function like opium to help keep workers, including professors, cowed and complacent.

Even before the puddle of cranberry sauce dries on my plate, then, I think about how injunctions to be grateful, including those that come from oneself, can become fodder for quietism and bland self-satisfaction. When I consider, for example, the salary hit I will take as the result of huge increases to my insurance, I vacillate between relief — my situation is still much better than that of most people in the U.S. — and anger. How long am I supposed to suck it up and smile as my standard of living is eroded so that fat cats can get even fatter? Am I to compare myself only to those worse off than I am to avoid feeling, and being perceived as, elitist?


This gratitude double-bind is familiar, including to those of us in higher ed. On the one hand, we are aware enough of how tough times are to be grateful for full time faculty jobs. After all, this is an environment in which endangered faculty positions are being hunted down and casually ground into cheap instructor labor. And we mid-career professors watch with horror and sadness as newly minted PhDs continue to roll off the academic assembly line with little prospect of finding jobs half as secure as those we enjoy. We watch as the dignity of our profession is stripped away and, unless we are utterly obtuse, we can’t help but feel gratitude for our own good fortune.

But we are rightfully critical too, and aware of the distance between where public higher education is and where, in a prosperous, enlightened society, it might be. We wince and gnash our teeth at polls reporting that Republicans blame higher ed for the nation’s woes, and we see the writing on the wall. Whatever the future of public higher education holds in store, it is hard to believe it will survive in a form most academics would recognize or prefer.


Gratitude, then, like so many spiritually tinged notions, is double-edged. On the one hand, it is a vitally necessary and beautifully human impulse. Surely there is no one more miserable or pathetic than one who constantly complains, the perennial victim who is unable to access any sense of appreciation or agency. But in the quest to be that optimistic, spiritual person, it can be tempting to settle permanently into the narcotizing arms of gratitude, especially when others are urging us to “lighten up” and “count your blessings.” We desperately need, though, the sort of vigorous social protest that often emerges from visceral, contagious dissatisfaction.

If I am to be grateful, then, let me be fiercely, and not complacently, so. Let my gratitude for my own good fortune galvanize me into fighting for the same benefits for others that I now enjoy. Let me freely express my discontent and desire for a better world, impelled by appreciation for what is beautiful and good in my life, and let me not be shamed into silence by fear that I will be seen as just another whining, overindulged academic.