Goodbye, 2017! The endings that define a professor’s life

The cyclical rhythm of the academic year is one of the greatest perks of being a professor. Many of us still know the buzz of possibility when summer ends and the campus parking lots begin filling up again. The winter break, too, brief as it is, can be as restorative as a deep, purifying breath. The poignancy of the fall semester’s ending is enhanced for me, no doubt, by the darkening days leading toward the solstice and by the sense of setting out into uncharted seas. I suspect that even the most cynical among us can hear the whispered promises each new year brings.

As a unit of time, the semester has great power even though its parameters are arbitrary. Whatever rationale the academic calendar may once have had means nothing to most of us now. And, of course, despite our fealty to semesters, we routinely violate their boundaries by working with students on long-term projects and by carrying on our own ongoing service commitments and scholarly work. Summer teaching too, especially when done online, disrupts the familiar stop-and-start school year rhythm. Most of us are not, it is clear, rigidly wedded to traditional academic rhythms and could not flourish in our jobs if we were.

And it’s also not true that we are on break in any normal sense. I gently bit the head off the last person who asked me if I had big plans for my winter “vacation.” The truth is that, though last semester’s grades have been submitted, I’m busily retooling my courses for next term, revising a paper to submit early in the new year, and fielding questions and requests from previous and upcoming students. There are all sorts of deadlines and due dates that have no respect for the fact that I am on “break.” But even so, there is this peaceful sense of closure, of a chapter ending in my professional life, that most workers never enjoy.


I suspect that this stop-and-start rhythm may also help account for the effective teacher-student relationships many of us form. I can give my students fifteen weeks of more or less steady energy, and expect them to pony up some in return, partly because our time together is well-bounded and brief. For better and worse, it is often easier to lend one’s best self to a new acquaintance with whom one is merely taking a short journey than to, say, a life partner. The pressure that the semester’s looming, inevitable ending places on the participants, an ending that is always visible on the horizon, can nourish the productive intensity of the classroom experience.

A school term can perform this framing function, I think, whether it’s a “good” semester, full of energetic and capable students, or a “bad” one in which things never fully click. Surely there’s some broader lesson in impermanence and detachment here? How, I wonder, can we move into each new term with hands more fully open, embracing the knowledge that, whatever else these upcoming months bring, it will all end? Whether we and our students love and respect one another or are mutually repelled, soon it will be over. Surely, in light of this benevolent, merciless constraint of time, I can offer my full presence and attention, right? And, besides, isn’t the school term just a microcosm of the structure of time that bounds and gives shape to our very lives?


It is no wonder, then, that, for fortunate professors like me, these thrumming beginnings and solemn denouements form the basic architecture of our professional identities. It’s a rhythm that mirrors the habitual relief of falling into bed each night and the hopeful possibility that can nudge one out of bed each morning. This drip-drip-drip of time is almost comforting in its honesty, and the response it invites from me is just as straightforward: I am to accept the weight of yet another snowy Midwestern winter as I clear my desktop and my driveway in these waning days of 2017. As I move forward, my left foot finds the oblique, diffuse light of the future while, for a long moment, my right one remains in the shadowy past.

I wish I were fool enough to believe that this unwritten, upcoming calendar year might magically wash away the grinding horror of our national circumstances, not to mention the stains of a remarkably difficult year for me personally. But though 2017 brought me a heaping portion of death, illness, and disruption, I do not want to rush through its ending. Instead, I choose to bask in this caesura, this animated liminality, like a hibernating frog on a muddy pond bottom. And I dare anyone to repeat the tired accusation that professors are eternal schoolchildren, never having matured enough to enter the “real world.” What could be more real, more elemental and momentous, than letting go and starting over again and again and again?

Fake News, Willful Ignorance and Critical Thinking

Against the much-maligned backdrop of contemporary higher education, a bright light has been trained on the problem of fake news. Critical thinking is hot right now, suddenly spoken of as an urgent necessity rather than an abstract or faraway good. Those of us who’ve long been worried about folks’ capacity for basic reasoning and factual discrimination are justified in feeling newly energized. It has been confirmed, we are told, that many Americans cannot tell the difference between fact and fiction. Democracy is in danger and we must commit to developing students’ critical thinking skills with renewed vigor.

A philosopher by training, I’m grumpier than many about the ubiquity of poor thinking skills. In fact, some years ago my grief over anti-intellectualism and the disregard of science, facts and common sense drove me to a crisis of faith that impacted my scholarship, teaching, and sense of place in the world. The persistence of racism, embarrassingly literal strains of religious fundamentalism, and climate change denial were among the bullies that pushed me to reappraise my naive confidence in reason, facts and intellectual self-scrutiny. Shortly after 9/11, during a period of sorrow and brittle fury, several tentative conclusions took shape in the jingoistic, anti-Muslim, anti-gay miasma that surrounded me.


For one thing, I noticed that many who failed to distinguish between this or that fact also couldn’t appreciate the more general differences between fact and fiction. Addressing the problem, then, wouldn’t merely require adding information so as to improve a discrete skill — like teaching someone to identify a Bobolink, say — but would require developing an awareness of at least a few basic points about the stubbornness of reality and of our accountability to it. I also had to accept that many who systematically blurred fact with fiction, and logic with wishful thinking, did so quite happily. They were, then, unmoved by arguments that assumed that they did or should care about being accurate or reasonable. Their blithe disregard makes sense given how often we are rewarded for the size of our enthusiasm rather than the defensibility of our positions. One’s football team wins and one’s political candidate comes to power in the midst of a righteous, passionate, confirming din.

When it comes to sports teams and rock stars it’s easy enough to appreciate that emotional forces will be more determinative than rational ones. But what if our beliefs about most things are determined to some extent by how holding those beliefs makes us feel? To paraphrase William James, what if we are inclined to believe that which makes us feel good? And what if this isn’t so much an individual failure — people going astray — but reflects something of our nature? If we are such deeply affective creatures, then emotions must be front and center as we address the problem of fake news and critical thinking.


If students’ orientation to reason and facts is nested so intimately into their psychological and social selves, then it’s not enough to think of critical thinking as a mere skill. In fact, I’ve come to think of it as like learning a new language. It is open ended, often tentative and halting, and progresses in fits and starts. Further, the greatest strides occur through immersion in a culture where its contextual relevance, including the concrete rewards and penalties associated with mastering it, emerges. Of course, most native English-speaking U.S. students who study another language in school never really learn it, just as they may never really learn critical thinking, even from courses focused on precisely that.

The capacity for critical thinking is also like second language fluency in that many who claim to value it do not, not really, and may not even know that they don’t. There’s an unfortunate circularity here in that being able to identify the bad consequences of poor thinking relies on the very reasoning skill and ethos that are missing in the first place. The problem, then, isn’t just a lack of critical thinking skills, but a lack of sufficient critical thinking skills to even recognize the initial lack. Similarly, poor speakers of a language often overestimate their ability precisely because their weak skills blind them to their missteps. Adding more lessons in logic, or new lists of vocabulary words, important as this may be, is unlikely to effectively combat this vortex of ignorance, especially since its swirls are invisible to many who are drowning in it.

All this to say that, while learning discrete critical thinking skills is important — just as learning to conjugate verbs is helpful — genuine leaps of ability probably can’t occur without serious attention to underlying emotional and social motivations. Apparently, we are creatures who must sometimes be jolted into noticing and caring about the size and shape of our own ignorance. I am reminded that those yanked from Plato’s cave did not rejoice in the harsh daylight, but were initially pained. The fundamental danger of fake news, then, may not result primarily from a deficit of intellectual skill — though, of course, this matters — but a lack of will. With this hypothesis in mind, I try to focus as much on helping students want to think better as on improving thinking skills. As I explore in an upcoming post, I’m trying to work with them to more viscerally connect intellectual mastery with their personal and professional goals. Though, as we know, thinking patiently and well can bring its own satisfaction, its pleasures and rewards can be lost — to all of us — in the bright lights and deafening roar of a self-satisfied crowd.

On being a professor with regrets

Admitting that we wish we’d done things differently has come to be seen as a mark of spiritual immaturity. Perhaps as a reaction to the guilt-inducing traditional religions of childhood, many have adopted a policy of embracing whatever has occurred as a way of celebrating the present moment. While banishing regret may be fine as an absolute orientation towards the deepest meaning of life — on this view, what IS is good precisely because it is — on a mundane level, I think regret can be a useful ally.

Regret is especially relevant to me as a professor in this twixt time between the fall and spring terms. I look back on Fall with one eye as I look ahead to Spring with the other. The invitation to ruthlessly inspect my courses, to locate both the gems and dross, the tangled thickets and the open clearings, is too loud to be ignored. But still so close to the beauty and the wreckage of classes I’m just now completing, my vision is both sharpened and distorted. Learning to take a critical perspective on a past that is only just barely past demands that I move quickly away from defensive self-justification and make friends with regret.

Specifically, constructive regret requires that I be:

  • secure enough in my identity as a competent teacher that I can afford to have been mistaken about this or that; insecurity about my basic ability will lead me to defend and justify rather than honestly scrutinize;
  • invested not just in improving this or that particular skill or product, but in growing as a whole human being. Then, the motive towards general excellence can become habitual and irresistible; if I am satisfied with coasting dumbly along, either as a teacher or as a moral, intellectual animal, then I won’t be motivated enough to make deep, lasting changes in any part of my life, including my teaching.

If I can make room for constructive regret in my teaching life — if I can see that that one assignment, the one I really loved, turned out to be a flop — then maybe I can also have a freer, more responsible relationship with the people and events that make up my whole life. If I can see failures — large and small — as messengers, and avoid identifying with them, then I can take better advantage of regret. Seeking and finding my own missteps and shortcomings — like consulting a map at a rest stop — can increasingly become a neutral habit rather than a shaming interlude that I avoid at all costs.

The pitfall of regret, then, is that it can so quickly become an implement for ruthless self-flagellation. One’s personal history and insecurities rise up so powerfully that the prospect of being vulnerable to self-examination becomes intolerable and so, instead, one moves fluidly into self-justification and rationalization. “I had to do it that way, because…” we tell ourselves, instead of authentically reflecting on the details of our motives or the consequences we set into motion. Rationalization becomes as automatic as a gag reflex, neutralizing the natural curiosity that would have us inspect and learn from our past.

There isn’t much that we do, whether in our classrooms or our larger lives, that absolutely had to be precisely the way it was. In most cases, we had viable alternative routes. Whether it’s about permitting a student to make up a quiz or speaking harshly to the person we love most, we usually could have done otherwise. And though we cannot, of course, know absolutely what the future would have been, our limited capacity to anticipate the consequences of our actions should, I think, sometimes lead us toward regret. How can we, I wonder, become more at home in the lively, tense knowledge that we could have, and perhaps should have, done it differently?


Extra credit as expiation and discount

In these last frenzied weeks of the semester, undergrads everywhere begin genuflecting madly at the feet of the extra credit god. In this end-of-term ritual, instructors usher these harried students into our office-cum-confessional and bear witness as they perform rites of self-flagellation and earnest piety. “If only I could do some extra credit,” they muse with exaggerated innocence, eyes tilted skyward like an El Greco Jesus. “Isn’t there anything I could do to improve my grade?”

My response to this desperate query is usually an unsatisfyingly simple one: “Would you be willing to study for the final exam? Could you attend the remaining class periods so that you don’t continue to miss out on precious participation and quiz points?” My unglamorous reply disappoints, because this student is here to talk about extra credit and often has no real interest in mastering the material or doing better on the remaining class assignments. It is, rather, the indulgence and favor associated with the special opportunity of extra credit that has drawn her to my door.

If you’ve ever watched an enthusiastic, practiced buyer wrangle with a car salesman, you’ll know what I mean. Such a buyer gets a thrill from feeling that, by dint of his own personal initiative, charm, and force of character, he has earned a special deal. Those other chumps might take the dealership’s offer at face value, but not this especially empowered, skillful consumer. And whether the actual price he pays for his new vehicle is better than what most purchasers end up paying becomes almost beside the point. What matters most to him is the heady satisfaction of feeling that he has played the game and won. It is similar with some students in that their negotiation for extra credit is often only obliquely related to the goal of improving their grade. And to many of them, too, our course syllabus is to be taken about as seriously as a new car’s window sticker.


This scenario would be more manageable if it were not that this competitive consumerist impulse converges with a quasi-moral one. Humble, but firm, requests for extra credit are also meant to demonstrate one’s deep earnestness, as a person and as a student. But that this is primarily a mere performance of diligence and responsibility is often belied by the fact that the student may ignore the obvious, unsexy, built-in paths to success, say, to actually learn the material and excel on the remaining assignments. “Extra credit” functions as an incantation, one that, like a priest’s blessing, is meant to erase twelve weeks of distraction or sloth. It’s hardly a surprise, then, that I meet few students as impassioned about working with me to excel on final projects as they are about concocting opportunities for extra credit.

I am prepared, as I write this, to hear from colleagues with objections to my flippancy about, and suspicion of, extra credit. There are lots of good reasons to encourage it, they will assure me. Perhaps so. In a complicated higher education economy nourished and impelled by some combination of desperation and dispensation, extra credit may well be one of the best games in town. And, certainly, each instructor must negotiate her own way in this imperfect, contradictory pedagogical environment, one in which student-consumers are unevenly, and unpredictably, empowered.

For now, though, I think I will continue to play the extra credit gadfly. I’ll still include flexibility in my syllabus, in the form of dropped quiz scores and allowances for reasonable absences, but I will explicitly exclude extra credit. And I will almost certainly continue to look just a little puzzled when students plead in week twelve or thirteen, “Isn’t there anything I can do, anything at all, to earn more points?” I’ll reply, as I do now, “Could you, perhaps, come to class and study for the final exam?”

On the uses and abuses of gratitude

We know we’re supposed to be grateful. It’s a year-round pressure that culminates on Thanksgiving: to count our blessings, look on the positive side, and remember how very fortunate we are. It’s even become a sort of medical prescription, with mental health professionals claiming that gratitude is the key to happiness, long life, and success. I don’t doubt it, but I also recall Karl Marx’s warnings about apparently anodyne feel-good ideologies that function like opium to help keep workers, including professors, cowed and complacent.

Even before the puddle of cranberry sauce dries on my plate, then, I think about how injunctions to be grateful, including those that come from oneself, can become fodder for quietism and bland self-satisfaction. When I consider, for example, the salary hit I will take as the result of huge increases to my insurance, I vacillate between relief — my situation is still much better than that of most people in the U.S. — and anger. How long am I supposed to suck it up and smile as my standard of living is eroded so that fat cats can get even fatter? Am I to compare myself only to those worse off than I am to avoid feeling, and being perceived as, elitist?


This gratitude double-bind is familiar, including to those of us in higher ed. On the one hand, we are aware enough of how tough times are to be grateful for full time faculty jobs. After all, this is an environment in which endangered faculty positions are being hunted down and casually ground into cheap instructor labor. And we mid-career professors watch with horror and sadness as newly minted PhDs continue to roll off the academic assembly line with little prospect of finding jobs half as secure as those we enjoy. We watch as the dignity of our profession is stripped away and, unless we are utterly obtuse, we can’t help but feel gratitude for our own good fortune.

But we are rightfully critical too, and aware of the distance between where public higher education is and where, in a prosperous, enlightened society, it might be. We wince and gnash our teeth at polls reporting that Republicans blame higher ed for the nation’s woes, and we see the writing on the wall. Whatever the future of public higher education holds in store, it is hard to believe it will survive in a form most academics would recognize or prefer.


Gratitude, then, like so many spiritually tinged notions, is double-edged. On the one hand, it is a vitally necessary and beautifully human impulse. Surely there is no one more miserable or pathetic than one who constantly complains, the perennial victim who is unable to access any sense of appreciation or agency. But in the quest to be that optimistic, spiritual person, it can be tempting to settle permanently into the narcotizing arms of gratitude, especially when others are urging us to “lighten up” and “count your blessings.” We desperately need, though, the sort of vigorous social protest that often emerges from visceral, contagious dissatisfaction.

If I am to be grateful, then, let me be fiercely, and not complacently, so. Let my gratitude for my own good fortune galvanize me into fighting for the same benefits for others that I now enjoy. Let me freely express my discontent and desire for a better world, impelled by appreciation for what is beautiful and good in my life, and not to be shamed into silence by fear that I will be seen as just another whining, overindulged academic.

What the ignorant know

The Zen masters warn that when the cup is full the student cannot learn. In the same vein, Socrates described himself as wise only in the sense that he knew that he did not know. Ignorance, like the open space in a photograph, has far more constructive and creative power than we generally acknowledge. Predictably, though, universities, and the knowledge factories of popular culture, place far more emphasis on the acquisition and juxtaposition of facts and data than on expanding the gaps in what we confidently feel we know.

Still, most instructors have probably griped about students with overly full cups. We likely recognize a student’s reflexive confidence in her social, religious or political views as limiting and immature. We may even conclude that such rigid certainty correlates well with the student’s limited critical thinking skills. But, of course, it isn’t only, or even primarily, college students who can be parochially solipsistic. It’s a habit pretty much all of us fall into at least some of the time, though we may develop elaborate self-concepts and justificatory schemes that prevent us from noticing such unearned confidence.


In my own case, for example, the fact that I have been professionally and chronically focused on epistemological uncertainty — even my decades-old dissertation explored feminist critiques of objectivity — has not kept me from glomming onto paradigms and opinions with a tenacity they do not deserve. Like most people, I often leap from one apparent rock of certainty to another, reflexively avoiding the roiling water between for long stretches. In my case, though, and I think this is partly because I’m a philosopher in (voluntary) exile, I eventually judge my confident perches to be nearly as unsettling as the chasms below. I’m faced with questions, often from generously (or arrogantly) critical others, sometimes from life itself, that I become unwilling and unable to wave away.

It isn’t that I always abandon a cherished position in light of hard questions, but, rather, that such critiques can remind me of the fragility of ideologies as such, and of the intellectual and spiritual potential of epistemological humility. And in such moments of (wretched or blissful) uncertainty, I recall that, like a too-hot sauna or a deliciously assertive massage, uncertainty too is an experience I can sink into and savor. It’s a liminal zone in which my beliefs can come and go like weather, a place where concepts and opinions do not warrant or require my sycophantic allegiance. And it’s when I’m least certain about what I know that I feel myself to be a true intellectual, teacher, and spiritual traveler. Whenever I feel able to genuinely entertain the possibility that I may be wrong — and this does not happen as often as I would like — I know that I am on to something.


Of course, the implications of falling newly and repeatedly in love with ignorance are far reaching, in the classroom, the synagogue, and the cultural milieus of science and politics. It matters if there are gaps in people’s attachment to ideas about drugs, immigrants, fiscal policy, or how to interpret that iconic Frost poem. But it also matters in more personal contexts, for example, in relationships with other people. Just as it is a sort of sin against liberal education to stubbornly attach myself to ossified beliefs about reality, so too I am failing if I become addicted to my perceptions, opinions, and judgments about other people.

Once I have committed to an interpretive framework in which someone is stupid, fabulous, deluded, princely, or just plain evil, I will find all the evidence in the world to support my view. Everything that obtuse idiot (or paragon of virtue) says or does will be slotted into the preexisting interpretive boxes I have built. And while it may be a reliable boon to my own ego to have my beliefs about others magically and endlessly confirmed, it also guarantees that I will never truly encounter actual others. Rather, like the narcissist entranced by her own fantasy or nightmare, I will engage primarily with my own mental constructs, repeatedly finding objective proof of another’s sin or saintliness in projections of my own creation.

Professors who aspire to be teachers

Like nursing, teaching implies selfless maternalism. We imagine the underpaid young elementary school teacher, spending her weekends and salary to buy construction paper and flash cards, compelled, like a mother, from sheer devotion to her young charges. “Professor,” by contrast, is a decidedly manly word, and connotes, not service, but authority and expertise. Young people flock to sit at his feet, even if he is quirky and distant, because they admire him and are drawn to his genius. Is it any wonder that many professors balk at being referred to as mere “teachers”?

In a society that has long feminized, denigrated and devalued the teacher and that is now energetically denigrating and devaluing the professor, we have an ever more complicated relationship to these labels. Here on the Virtual Pedagogue, I regularly slip between “teacher,” “instructor,” and “professor,” not from sloppiness (usually), but because I resist solidifying my own self-understanding into any one of these labels. I aim to use these terms intentionally, both to call attention to the similarities and differences among all of us who do this sort of work, and to subtly challenge stereotypes that surround them.

“Instructor” is perhaps the most generic and seems to apply to anyone habitually engaged in showing another how to do something, be it to fly a plane or solve quadratic equations. It’s a sterile word, without the ethical import of the other two, but can be useful when emphasizing functional commonalities, say, among teaching assistants, tenured professors, and high school coaches. Despite the trend of diminishing respect for higher ed, “professor” is a status word, weighed down by advanced degrees, heady scholarship, and a workload that may actually include no instruction whatsoever. Though “teacher” is, perhaps, the most common word, I also find it to be the most nuanced, rich and attractive.


When I refer to those who’ve helped me change my life — for example, the passionate, brilliant women with whom I studied yoga in Minnesota — I call them my teachers. It’s one way I (lovingly) highlight that I didn’t primarily learn facts or strategies from them, but, rather, was supported in developing my whole self. So, when at some jagged point in my own pedagogical career I felt called to work more holistically with my students, I experienced a dramatic shift of consciousness and my labor became both more humble and momentous. It was, I determined, my serious and joyful responsibility to support students on their human journey while disguised as a feminist philosophy professor discussing Kant.

I am, then, both despite and partly because of its feminized humility, quite taken with the term “teacher,” though I appreciate the other ones too. When I get my hair cut at a new salon, I answer the “what do you do?” question with “professor.” I am happy to help dispel stereotypes about women’s work by claiming the full measure of my teetering professional status. But in the realest beating heart of my life, I am happiest and proudest being a teacher, sitting alongside my students, trying to find even one small way that our time together might make us all more inquisitive, daring, and demanding of ourselves and one another.