Pandemic 2020: Are universities treating the disease but killing the patient?

The virus in our midst is especially deadly for those with existing underlying health risks, though most of us who develop symptoms will eventually emerge relatively unscathed. So too, though nearly all U.S. colleges and universities are being touched by the pandemic, only some have already died or are languishing on life support. And while financial ruin may be what actually kills some, for far too many others, the true cause of death will be their failure to respond ethically and sustainably to the crisis rather than the crisis itself. Predictably, universities built on a genuine foundation of equity, respect, and sustainability are likely to survive and, eventually, to thrive, while those infused with hierarchy, secrecy, and reactivity began to teeter and crumble almost the very day that stay-at-home orders went into effect.

To take one example, my university has been at the national forefront with respect to cutting personnel and planning radical restructuring schemes, changes that may permanently reshape the university’s instructional and research capability, despite the unknown long-term impact of the crisis. It was as if, one day, an abstract budget target number appeared in the sky, like the Bat-Signal, with provosts and deans rushing to create and implement slash-and-burn plans that have included little or no substantive input from faculty, staff, or students. Whether my institution, Western Michigan University, is actually financially worse off than universities that have taken a more measured, holistic, stakeholder-based approach, I cannot say. As is (not incidentally) the case with many large, hierarchically run organizations, budgets are often so complicated and opaque that it can be impossible to separate fact from fiction, accuracy from exaggeration.

Though the actual underlying rationale for the budget targets may be murky, the impact on our campus community has been clear: early layoffs of hundreds, with, we are told, many more to come soon. In addition, we are seeing the early signs of marginalization, merger, or elimination of academic departments, according to factors that seem to have nothing to do with metrics of productivity or cost-benefit analyses. For example, my university has also been among the first in the nation to take steps to merge or eliminate its successful and inexpensive Gender and Women’s Studies Department (my tenure home), which focuses on LGBTQ youth, students of color, women, and sexual assault survivors. It is the sort of move that, though sadly unsurprising to those who do diversity work, has recently earned the condemnation of the National Women’s Studies Association for its apparent opportunism. NWSA chides universities “that are using the crisis to implement cuts they have been unable to make in the past because of faculty and student opposition and organizing.”

The true devastation of top-down ad hoc slash and burn policies isn’t primarily the intrinsic suffering they cause to campus constituents, especially to students, but the fact that this suffering is inflicted from above, for arbitrary or implausible reasons, and that it falls so disproportionately on the most vulnerable. Though, to be sure, elite administrators have agreed to symbolic cuts to their sometimes jaw-droppingly high salaries, their lives and livelihoods remain largely untouched. It is as if there had been a treaty signed by administrators from the outset that, whatever cuts and restructuring might occur, the university’s existing salary inequities, power structures, and habitual priorities must remain unchallenged. Rather than serving as an opportunity to reaffirm our commitments to compassion and equity, at too many organizations, this crisis is being leveraged to further erode such values.

It is not higher education’s financial crisis alone that will ultimately determine which universities will live and which will languish and eventually die. Consider, for example, two equally resourced families of equal size, faced with the same dire economic news. The patriarchal head of the first family sighs regretfully and immediately drowns two of his children in the well. Problem solved! Fewer mouths to feed! The second family, though, sits down together to study the situation. Are these numbers accurate? Is there any way to challenge the apparent facts of the new reality? If not, are there parts of the family budget that can be trimmed to free up more money for necessities? Because this second family’s approach is values-based, participatory, and deliberate, they are better off even if, ultimately, their shared suffering is great. For one thing, their children, though skinnier, will probably still be alive to help harvest the fields when, one day, the corn grows again. But also, having remained true to their core values, the heart and soul of this family remain intact.

Those who rush to drown their children, or otherwise sell their souls to the devil to address a crisis — no matter how apparently grave — have already lost the war against this latest scourge. When the dust clears, we may celebrate the buildings and privileged employees still left standing, but we will also speculate about what transpired behind closed doors and in secret meetings in exchange for such physical survival. In the long aftermath of catastrophe, we will ask: Who collaborated and who resisted? Who stood by and watched while others were pitched into the well? Such a climate of suspicion and resentment will stain our universities well into the future, undermining our will to inspire students or give our best to our disciplines or institutions. The problem universities are facing right now, then, is a financial one, to be sure, but it is just as much an ethical one. We are in the midst of a moral trial that invites us to peek behind the facade of classrooms, cafeterias and football stadiums to find out who we really are. And it does us not one bit of good if, in a frenzy to keep the patient’s body alive, we kill the very thing that makes that life worth preserving in the first place.

Pandemic 2020: Let the university hunger games begin!

Does anyone wax longer or louder about respect, transparency, diversity, and equality than university presidents, provosts, and deans? For decades, at commencements, convocations, retirement ceremonies, and ribbon cuttings, we have been serenaded by one misty-eyed official after another reminding us of the unutterably precious value of our unique voices. These are not just pretty words, we have long been assured, but values rooted deeply in the shared governance structures that underlie our universities in the form of faculty senates, collective bargaining units, and enough faculty committees to make our heads spin. Our universities, with their enlightened and compassionate leaders, their egalitarian and rational decision-making processes, are oases in the midst of the nation’s MAGA barbarity, right? Sure, we have our ethical challenges, but no one can question the basic decency of our institutions, can they? No wonder it has been a shock for many of us that the moment times got really tough, some of our universities set out to stage their very own hunger games.

The premise is simple enough: A powerful, centralized oligarchy forces subjects to “volunteer” for an elaborate killing game intended both to solidify dependence and obedience, and to entertain the elites. Not only are subjects compelled to send their children into these orchestrated killing fields year after year, but they are expected to do so willingly, to dress up, smile, and join in the festivities surrounding the games. They are required not only to surrender their lives, then, but their own consciences and voices of protest as well. As deadly as the games are, their larger purpose has more to do with killing people’s spirits than their bodies. Though I have read lots of dystopian novels, I was especially moved by this aspect of The Hunger Games when I finally got around to reading it a few months ago. I could not shake the image of otherwise proud people coerced by artificially induced scarcity into killing one another while pampered elites looked on, sipping champagne and placing bets on who would be left standing at the end.

I was primed by my reading of The Hunger Games, then, to pay special attention when my institution, Western Michigan University, began listing and picking off its “non-essential” employees just a few weeks into the pandemic crisis, the first of many devastating personnel decisions that have emerged since. Hundreds of “expendable” employees have now been laid off, and hundreds more of us have been told to expect our marching orders in the coming weeks, according to lists that have already been compiled and are being scrutinized by inner-circle administrators behind tightly closed doors. Carefully choreographed, stylized messaging from presidents, provosts, and deans insists that this is all necessary for the good of the whole, and that we must do our duty and somberly accept these edicts. After all, these decisions have not been easy. In fact, they have kept the president up at night and been heartbreaking for the deans. Can’t we see the terrible position they are in, under extraordinary pressure from even higher ups, huddled in their private chambers, compiling human elimination lists to be shared with us when they’ve decided it’s the right time for us to know?

As with the hunger games of fiction, the damage here isn’t only to people’s lives and livelihoods, but to their hearts and minds. We, the remaining subjects of this newly authoritarian realm, are expected not just to live with whatever decisions spew forth from our “leaders,” but to get on board. In the spirit of shared sacrifice, we are expected to return as cheerleaders for our university in the Fall once the bodies of our faculty and staff colleagues have been cleared away. After all, didn’t the president and deans themselves accept voluntary pay cuts of five or ten percent? Well, no, those symbolically small cuts haven’t actually gone into effect yet, but they will in a few months. You know, probably. Meanwhile, like the traumatized subjects of the twelve districts outside the pampered Capitol, we remaining university faculty and staff whisper among ourselves, knowing we should speak up, but terrified that it might be our own head next on the chopping block.

For example, though I belong to one of the most “protected” employee groups on campus, I assume that the letters of concern I sent recently to administrators have placed my career in even greater danger. After all, their decision not even to acknowledge my messages was surely not intended to reassure me that my voice is still needed at this university, if, in fact, it ever was. And though I know, as we all do, that these administrators are, themselves, being pressed by even higher level “bosses,” this does not erase their basic ethical responsibility to me and the other faculty and staff entrusted to their stewardship. Partly because so many professors routinely remind our students that “just following orders” is a poor excuse, we have a hard time buying this when it comes from our intelligent, remarkably well-compensated administrators.

It isn’t just those who have drunk the Kool-Aid who now serve as apologists for these clear-cutting sprees by administrators desperate to meet budget targets whose rationales, handed down from higher up, are so obscure that even they themselves may barely understand them. As is nearly always the case with systematic injustice, elite administrators must leverage longstanding inequities between employees to meet their goals. At universities, there is often a sort of petty bourgeoisie of middle managers who help rationalize elite excess and soften resistance from below. Such complicity and accommodationism are critical because they help obscure the fact that there is no actual necessity to the cruelty unfolding on our campuses. Our very real budget crises don’t require us to suddenly devolve into a Game of Thrones bloodbath. For example, my colleague, Charlie Kurth, describes a progressive furlough approach that could help us weather this situation and emerge even stronger in our fundamental social justice values than before. But try sharing these more progressive, compassionate, egalitarian strategies with your university administrators. Their responses, or lack of them, may be the quickest way possible to learn what, deep down, this horrific spectacle we’re being required to enact is really all about.
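
To make the arithmetic behind such an idea concrete, here is a minimal sketch of one way a progressive furlough could work, with tiered rates that rise with salary. The bands, rates, and payroll figures below are invented for illustration only; they are not Kurth's actual proposal or my university's numbers.

```python
# Hypothetical sketch of a progressive furlough: higher salaries absorb larger
# percentage cuts. All bands, rates, and payroll figures are invented for
# illustration only.

def furlough_rate(salary: float) -> float:
    """Return the fraction of salary furloughed; the rate rises with pay."""
    if salary < 50_000:
        return 0.00   # the lowest-paid employees are spared entirely
    elif salary < 100_000:
        return 0.03
    elif salary < 200_000:
        return 0.08
    else:
        return 0.15   # the best-compensated absorb the deepest cuts

def total_savings(salaries: list) -> float:
    """Total payroll savings produced by the tiered rates."""
    return sum(s * furlough_rate(s) for s in salaries)

if __name__ == "__main__":
    # A toy payroll: many modestly paid staff, a handful of highly paid administrators.
    payroll = ([35_000] * 200 + [60_000] * 300 + [90_000] * 150
               + [150_000] * 40 + [300_000] * 10)
    print(f"Total payroll:  ${sum(payroll):,.0f}")
    print(f"Tiered savings: ${total_savings(payroll):,.0f}")
```

On these made-up numbers, the tiers recover nearly two million dollars from a roughly $47.5 million payroll without a single layoff, and the burden falls most heavily on those best positioned to bear it.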

Professors in the pandemic: The painful truth about how much universities actually value teaching and learning

At the university where I work, the directives and decisions trickling from on high are dire and draconian. Even the best budget forecasts present a grim scenario. We must all sacrifice. The viability of our institution, and of higher education itself, depends on our ability to make anguishing choices now. I do not doubt the urgency of current circumstances, but when I talk to faculty colleagues at my university and across the nation, we’re asking the same question as always: When it comes time to hack and saw at university budgets, why do so many institutions fail so utterly to prioritize academics?

Because the academic function of higher education has faced amputations for years, faculty are now perfectly primed to ask: Why do supposedly non-essential extras — including unprofitable, wildly expensive Division I sports programs — seem always to rise higher on the safe list than the instructors, advisors, and support staff who make teaching and learning possible? University responses to the pandemic, including cuts to instructional staff, rub salt into a long-festering wound as, once again, athletic programs and administrative excess are mostly left off the table.

It should hardly be surprising that, in a nation that has long nursed anti-intellectual resentments, the academic portion of universities has been portrayed as the real drain on university budgets. After all, conservative extremism has managed to vilify public school teachers while celebrating greedy billionaires, so it’s hardly a challenge to scapegoat supposedly whiny, entitled professors. When times get tough, then, it has become quite natural for university administrations to penalize those closest to the academic mission. Of course, in addition to being steeped in the same anti-intellectual miasma that has gripped much of the nation for decades, administrations often face extraordinary pressure from football-loving conservative governing boards to “trim the fat.”

Amid all the apparently self-evident calls for sacrifice, how easy it is to forget that university science training and labs make it possible to study and treat disease. And that it is years of university study that has permitted us to model and predict epidemics, to properly use ventilators, to manage critical supply chains, to respond rationally to economic crisis, and to rebuild urban and rural infrastructure. So too, our research and teaching help our society refine its understanding of social and political evils, for example, white nationalism, environmental racism, structural inequality, and the like. In addition, focused work in creative fields has expanded human sensitivity and imagination, helping us to envision innovative futures and to honestly and courageously face the human condition, in both its beauty and horror.

While many will applaud this laundry list of why universities matter, when it comes time for sacrifice, where will the knife actually fall? To quote a wise old friend: “The boyfriend who tells you he loves you, but treats you like an afterthought or burden, doesn’t love you.” As devastating as this pandemic is, then, it’s also an opportunity to revisit questions about core university values and priorities. And when we examine our institutions, let’s bypass their high-flown mission statements and elaborate strategic plans. Let’s ignore the pretty rhetoric of chancellors, presidents, provosts, and deans altogether. This is a terrible time in many respects, but it is the very, very best time to discover how much we’re actually worth to well-paid administrators who have been serenading us for years with assurances of how much we, and our departments, matter.

I do not think there is a single faculty member, advisor, or librarian who expects to be exempted from the consequences of this crisis. But we are also keenly aware of who has been marked as safe, and of the order in which the rest of us are being pushed down the gangplank. Under cover of urgency, universities will, no doubt, succeed to some degree at fulfilling longstanding budgetary wishlists, e.g., reducing “academic bloat” through reorganization and elimination schemes they’ve fantasized about for years. Whatever happens next, though, may we never forget that we are seeing the truth that lies beyond the rhetoric. Each time you drive by your university’s two-million-dollar football scoreboard, remember that bad boyfriend, the one who insisted you were his sun and moon but could never manage to remember your birthday.

Entitled and out of touch: The danger of anti-professor stereotypes in the pandemic

The stereotype of university professors as entitled babies who are oblivious to the “real world” takes on new urgency as the pandemic rages. Encouraged for decades by well-funded conservative extremists, it’s become pretty standard for pundits and politicians to dismiss professors as spoiled, elitist, and selfish. Not surprisingly, it’s a stereotype that many university functionaries, including administrators, have accepted as well. Worse still, some professors have themselves come to internalize it, and are thereby discouraged from asking questions about anything “administrative,” including apparently hasty top-down decisions that may bypass our contracts or cripple our institutions’ academic viability.

For decades, then, professors have been getting the message that they are barely tolerated by many in the state capitol, and by variously titled chairs, deans, provosts, and presidents who, increasingly, assert their own managerial identities by differentiating themselves from us. Faculty members who are occasionally privy to administrative conversations often express surprise and distaste at the degree to which supposed faculty obliviousness and incompetence feature in them. It starts to seem as if many administrative types don’t merely believe anti-faculty stereotypes but also bond with one another over them. There is perhaps no more effective way for rookie administrators to perform their new bureaucratic identity than to join in the familiar banter about impractical, coddled, and lazy faculty.

In the midst of higher education’s pandemic response, then, is it any wonder so many university administrations plow ahead with critical decisions, making little effort to substantively collaborate with faculty? After all, haven’t professors exempted themselves from the right to participate by virtue of being self-exiled prima donnas who care far more about their arcane research than balance sheets or the public good? Is it any wonder that even those of us who are the object of these stereotypes may still feel shamed and silenced by them? “Maybe it’s true,” we may think. “Perhaps a professor of English (or geography or music or mathematics) has no business speaking up given the life and death urgency of the moment.”

Except, of course, that the dismissal of professors’ voices is mostly based on an impressive pile of half-truth and hooey. Yes, some small percentage of U.S. professors come from elite backgrounds, land plum positions, and go on to live and work in “splendid isolation from the world.” In most cases, though, professors are actual flesh-and-blood people. Often, we have taken on staggering student loan debt and struggled for years, working as waitresses, census takers and retail clerks in the increasingly desperate hope of snagging tenure-track positions at humble regional universities in Pennsylvania or Ohio or Kentucky.

When we join these institutions, we are required to fully immerse ourselves in increasingly bureaucratic university service, provide individual attention to understandably beleaguered students, and research and publish in our areas of academic expertise, many of which are not arcane in the least. We spend our workdays teaching, lobbying for critical research equipment, making cold calls to prospective students, working through piles of accreditation forms, and writing tons of student recommendation letters. This, mind you, is if we are one of the lucky ones. For the majority of instructors, who are adjuncts or otherwise undervalued academic laborers, work demands and anxieties are usually far greater.

Only vanishingly few of us, then, ever catch a glimpse of anything resembling an ivory tower into which we might retreat with quill and parchment while kingdoms rise and fall around us. We are, rather, members of the communities in which we live, often small towns where we buy our groceries, fall in love, get mammograms, and send our children to school. We anguish along with our neighbors about gun violence, climate change, access to medical care, and the opportunistic fascism and viral pathogens sweeping through our nation.

Yes, the vast majority of instructors in higher education are privileged by race and class, a reflection of the unacceptable stratification that deforms all of U.S. culture and society, and not just higher education. Only when compared to the most shamelessly exploited members of society — especially the essential service workers now required to put their lives at risk for peanuts — do professors, as a whole class, appear to be an especially entitled, elite group. It is no accident that, with respect to pay, status, and the other factors that insulate a group from the pains of the world, professors are rarely compared by critics to CEOs, hedge fund managers, or even university administrators. Evidently, there is something especially appealing and effective about scapegoating professors and other educators for the hideous erosion of the American middle class.

It has long been clear that U.S. professors have been targeted for derision and elimination by conservative extremists. Just as evident is the fact that anti-professor stereotypes are rooted in the assumption that, while folks in private business, technology, medicine, entertainment, and sports might deserve some degree of prestige and pay, professors and K-12 teachers generally do not. This is in no small measure a result of concerted conservative efforts to exploit the longstanding American love affair with anti-intellectualism. In the U.S., it seems, it has never been especially difficult for unscrupulous plutocrats to funnel populist outrage toward books and those who love them.

But the tensions and exigencies of the pandemic make it ever clearer that it’s not just conservative extremists who use stereotypes to justify vilifying and marginalizing professors. It is also a growing cadre of professionalized university bureaucrats for whom professors’ supposed impracticality and pampered entitlement rationalize our exclusion from critical decision-making. At best, the scenario that unfolds is one in which faculty are hapless children with wise and benevolent parents. At worst, we are self-centered nincompoops who must be flattered and manipulated into accepting policies that we have had no voice in creating. If, in the midst of crisis, we consent to such treatment — perhaps persuading ourselves that university administrators really do know best — will we ever again be allowed to sit at the big table with the grown-ups?

Professors in the pandemic: Getting intimate with our fears about online education

When I originally began The Virtual Pedagogue some years ago, it was to explore my own ambivalence about teaching online. Though the circumstances were far less dramatic than the crisis we now face, my initial experience as an online teacher fifteen or so years ago was also rushed and born of necessity. Predictably, it left such a bad taste in my mouth that it wasn’t until many years later that I felt any inclination to dip my toes in those waters again. Happily, my more recent experiences have been far more positive and, over the past five years, I’ve taught many of my courses online while also reflecting on my experience in papers like this, in workshops with colleagues, and here on The Virtual Pedagogue. With most instruction now being pushed online, this seems like a good time to reconsider issues I’ve been ruminating about for a while from my limited perspective as a tenured, mid-career liberal arts faculty member. Not surprisingly, most of my concerns have turned out to be reducible to fear, in one form or another, which does not, of course, make them any less legitimate.

The first fear is systemic. In fact, it is huge. It is that, in agreeing to teach online, we are participating in a fast-food model of education that enables crass corporatism and hastens the demise of our brick-and-mortar institutions. As I discuss in many places here on the VP, there is, undeniably, cause for concern, but I see it less as a function of the technological shift than of the extreme inequality shaping higher education in the U.S. To be sure, online education must not become the default modality for the poor while privileged students and faculty at elite institutions continue to hold debates in lovely ivory towers. The challenge is real and entrenched given that, for many vulnerable students, who may have multiple jobs, mental or physical disability, and child or elder care responsibilities, online classes are the only feasible access point to college. Though it may be tempting to identify online education as the culprit, then, the real enemy is even more daunting: structural barriers that fundamentally limit students’ options about the kind of educational experience they can have.

Especially for more senior faculty members like me, online educational technology itself can also be intimidating, particularly given the proliferation of auxiliary bells and whistles that we may feel pressured to include in our classes. Many of us know what it’s like to have been brought to our knees by a computer program at some point — be it Quickbooks, Photoshop, or our university’s online advising system — and we may have little inclination to seek out more such demoralizing experiences. This may be especially true with respect to teaching, which, for some of us, may be the one arena in which we feel utterly competent.

It is undoubtedly true that poorly utilized online technology can be clunky and unwieldy, serving to distract more than to enable learning. But if one focuses on the basics — and what this means will vary a lot from discipline to discipline — it is no more intrinsically difficult than other programs or apps that most of us routinely use, for example, while we shop, communicate with long-distance grandchildren, or download audiobooks from our public library. And though some learning discomfort is unavoidable, anyone who still refuses to engage with online technology at all — even to supplement their courses — is, at this point, more like that telephone-averse butler on Downton Abbey than a hero fighting for traditional education. As time and technology march inexorably onward, at some point one becomes less of a lovable curmudgeon and more of a cranky Luddite.

Perhaps the most insidious fear, and the one I explore most frequently here on The Virtual Pedagogue, is the threat that online teaching can represent to our deepest identities as competent, respected, valued professionals. Though it’s not something we professors usually like to admit, there can be tremendous ego satisfaction in traditional face-to-face classroom performance. After all, we have been assigned the featured role in a pedagogical drama, one that many of us have, over decades, honed to perfection. It is no wonder that many of us have come to relish and rely upon the adoring faces of students as they bask in our brilliance.

How often, when we extol the “fire,” “energy,” and “magic” of the classroom, might we actually be referring to the ego satisfaction that we ourselves derive from students’ attention and praise? I think this is not necessarily because we are shallow or narcissistic, but, rather, a perhaps inevitable consequence of engaging in this sort of intensely human labor. For many instructors, the physical university, with its hallowed halls and ivory towers, is a beloved backdrop that allows us to enact hard-won, lovingly cultivated identities that seem to require the nurturing attention of students. The loss of that sea of shining faces can feel like an erasure of our professorial identity altogether, as though we have been replaced by a mere machine.

While there are, of course, lots of good reasons for prioritizing face-to-face education — I will never write a love letter to online-only institutions — it is critically important to get deeply honest, especially with ourselves, about what, precisely, our fears and misgivings about online education are. This is especially urgent now that, for most of us, online teaching has suddenly become an unavoidable reality rather than a mere pedagogical possibility or abstraction. To be sure, some of our complaints about online education may turn out to reflect intrinsic weaknesses of the online modality itself, but some, surely, are based on other fears and anxieties.

How much of our discomfort about online education is really about our anger, fear and sorrow over economic injustice, anti-intellectualism, public disinvestment in higher education, and the radical communication shifts that have fundamentally reshaped human relationships and institutions? Whatever happens next in the development of universities’ relationship to online education — and this is a train that left the station long ago — faculty must be in the driver’s seat. But we cannot guide this process wisely and effectively if we are not relentlessly honest with ourselves about where our fears and misgivings about it lie.

Below are links to a few of the many posts on this site that explore questions about online education:
Are online classes the fast food of higher ed?

Are online teachers lazy sellouts?

Is anybody out there? The loneliness of the online teacher

Telling the truth about online education

The sweet ego boost of teaching face-to-face

Plunging into online teaching: It’s not what I thought it would be

Online teaching: The joy of tedious planning

Could online teaching be a path to enlightenment?


Plunging into Online Teaching: It’s not what I thought it would be

The first time I taught online was over a decade ago, when I got pulled in like a tug-of-war contestant dragged into a mud pit. A mid-career philosophy professor, I was a good teacher, a popular teacher, content with my pedagogical approach and buoyed by the energy of the face-to-face classroom.

I approached the challenge of online teaching like a translation problem: how to interpret my existing course into a virtual one. Back then there weren’t many online education resources to save me from this error, but even if there had been, I doubt I would have paid much attention. My real weakness was that I didn’t fully get that my classroom teaching represented a particular modality, one with its own accidental logic and underlying values. I couldn’t fundamentally rethink my strategy — lecture, discuss, exam, repeat — because it all seemed too basic and fundamental to deeply question. It’s no surprise, then, that this first foray into the virtual classroom was less than successful. I left with my ego bruised, feeling bad for my students, and resentful that I’d been nudged into participating.

Fast forward and I am now deeply immersed in online teaching. Instead of fighting the waves, and tightening my grip on long-standing pedagogical habits and commitments, I am beginning to relax into the unfamiliarity of it. I can accept, at least sometimes, that this is not merely a shadow version of being a “real professor,” but, rather, a fundamentally different enterprise. I had been like the traveler unable to appreciate new vistas until she recognizes the biases she carries with her. I couldn’t see what online teaching had to offer until I could view my traditional teaching values and practices from a distance. At some point, I began to recognize my habitual way of teaching as involving particular, and changeable, assumptions, values and strategies. I still hold onto some of my traditional ways, and there are others whose loss I will probably always mourn. But for all of that, I am moving forward.

I won’t sugarcoat this. My experiences with online teaching and my feelings about it are complicated. But the project of engaging with it has transformed not just my teaching, but also my relationship to change itself. In ways I painstakingly explore in this blog, I am not only a better online teacher than I used to be, but I think I’m a better teacher, period. Certainly, I am less ego-focused, less change-averse, and less nostalgic than I used to be. While I’m not an uncritical cheerleader for online education — I still rail against its worst tendencies — I have warmed to it enough that it is working for me and my students. And even if I never taught another online class, I would still be enriched by having looked back on my pedagogical values and commitments from the shore of this new virtual land.

Mission critical thinking: Preparing students and ourselves for catastrophic times

Most liberal arts professors have known for years that the greatest good we can do for many of our students probably isn’t to immerse them in the advanced esoterica of our particular disciplines but to help develop their critical reading, writing and thinking skills. In the disastrous age of MAGA, I have begun to more fully appreciate this lesson: Part of my job is to help prepare students to locate and respond to catastrophic social, political, and ethical problems, only some of which we are now even able to fully imagine.

“Critical thinking,” that darling term we educators have been kissing and cuddling for decades, no longer cuts it when we face the full horror and possibility of what we are collectively facing. In past decades, “critical” has signaled ways of thinking, reading, and writing that occur from a questioning and investigative mode, a disinterested evaluation of facts, logical relationships between claims, and the biases of all concerned, including oneself. This is all to the good, especially the importance of challenging claims that happen to suit one’s preexisting expectations or preferences. Certainly, we would all be much better off if “critical thinking” of this sort could dislodge the irrational mob-think and craven consumerist claptrap that passes for much of current social and political discourse.

Teaching critical analysis as a fairly narrowly cognitive approach is evidently not enough, though. What we need is a reclamation of “critical” that is bolder, more dramatic, and far more socially and emotionally urgent than any we may have ever before used. In short, we must train students and ourselves to function as intellectual and psychological EMTs, prepared to move into the disaster zone with the skills, judgment, and nerve necessary for both triage and long-term, sustainable healing and repair. We need proactive, brave, pliable first responders who are also long-term strategic solution-seekers capable of evaluating and rearranging the big picture. The “critical thinking” values that must underlie our teaching work today are “critical” in the sense of “mission critical” and of “critical condition.” The symbol for this might include a pen and inkwell, but also a blood red armband and a sturdy multi-tool.

This more urgent, red-alert version of critical thinking obviously must include much of what has always mattered about this traditional skillset, including close reading, basic logic, the analysis of evidence, and evaluation of perspective. But it must place greater explicit emphasis on qualities of individual motivation, self-care and character development, including the cultivation of:
– a healthy combination of confidence, humility, self-efficacy and self-reflection
– an unwavering commitment to empathy and compassion that does not slide into paternalistic pity or overwhelmed quietism
– a bias toward positive, productive action in the service of deep communal values, including for example, participatory democracy and racial equality
– an ability to make tough, real-world decisions in the face of incomplete information and general uncertainty
– the courage to go against the grain, to swim upstream from groupthink while still respecting the legitimate needs of the community

Even this cursory, general list serves as a cautionary guide for me: As a feminist philosopher, I have for decades emphasized a cognitively based, moderate notion of critical thinking that has reflected both a (perhaps naive) confidence in human reason and a (legitimate) concern about alienating students. I have, then, often ended up focusing on tweaking reading, writing and thinking skills, careful not to be “too normative” or “too directive” with respect to the social and emotional values surrounding these supposedly “neutral” cognitive standards. I haven’t avoided real world issues — this would not even be possible in the courses I teach — but I have sometimes highlighted the intellectual “toolbox” aspect of critical thinking in order to sidestep the messier social and ethical facets that give cognitive values their sense and power.

For better and worse, I know that I am not the only instructor who has been dancing carefully among the demanding arms of cognitive, emotional, social and ethical competence. Unfortunately, there is extraordinary pressure on professors to treat students like desperately needed, precious, fickle customers. Further, the long, determined march from tenured to contingent faculty has eroded the secure ground from which some faculty can be expected to engage in difficult dialogues. It is surely no accident that the academic freedom necessary to engage in authentically holistic critical thinking has been hacked away by conservative extremists at the very time it is most urgently needed. Regardless, we can no longer afford any semblance of the fantasy that liberal arts professors are debate coaches meant to lead students through “what if” puzzles to achieve oblique insights or incrementally improved logical skills. The most privileged among us professors, at least, surely ought to rethink our relationship to “critical thinking.”

So, though I still push my students to wrestle constructively, directly and intellectually with texts — this humanistic work matters! — I engage with them in ever more practical, particular, personal, and socially urgent terms. And I am more prepared than ever to acknowledge my astonishing ignorance, because, like so many well-trained, smart professors, I have been caught off guard by the scale and doggedness of the retrograde cruelty and naked greed of conservative extremists. And so I commit as much to the pedagogical power of empathy, ethical sensitivity, and self-empowerment as to more specifically cognitive values. This isn’t a self-esteem based pedagogical gimmick, but, instead, a matter of necessity: It will take the empowered, compassionate, creative strategizing of all of us — young and old — to MacGyver our way out of this mess.

Super Mario in a one-room schoolhouse: The myth of a singular college experience

I have mastered my shield and sword and become familiar with the labyrinth. More confident than ever, I sneak up behind an ogre, weapon drawn. But in the split second before I strike, the creature steps backward, knocking me into a chasm I’d taken great care to sidestep. The fizzling “game over” music that accompanies my death mocks me. I have been hacked, zapped, and crushed to death, and, each time, I have tried again, determined to complete this sequence. This time, though, I save and quit, eager to play something easier. But five minutes into the “relaxing” tedium of a new game in which I scoop up gems while summarily dispatching lethargic foes, I have had it. I have gone from feeling demoralized by the challenges of the first game to annoyed by the childish ease of the second.

My fickle petulance in the face of such shifting levels of challenge invites me to think about the critical role that “appropriate difficulty” plays in creating satisfyingly rich learning experiences in general. Of course, successful video game designers have mastered the nuances of manipulating obstacles, rewards and pacing to create engaging challenges. They know how to offer guidance that does not devolve into handholding, and small, consistent rewards along the way, such as new weapons or abilities. In short, they create a world in which patient hard work will be rewarded. Though they may sometimes be very difficult, these challenges still feel ultimately fair. Because conscientious video game designers must so closely consider individual user engagement, they can provide key insights for instructors and students of all sorts. How many of us have stewed in the frustration of classes that felt rudimentary and plodding? And haven’t we also been left floundering in our own stupidity by courses pitched too far over our heads?

As a professor at an increasingly open access, mid-tier public university, calibrating difficulty is a task I find more daunting each year. While my strongest students’ level of preparation seems to be about the same as always, the college-readiness of everyone else is more and more of a mixed bag. My introductory classes are a motley blend of motivated readers, writers, and problem solvers combined with folks who lack basic skills, resources, and persistence. In recent years I have even begun thinking of myself as a plucky teacher in a one-room rural schoolhouse, charged with simultaneously facilitating grades K-12. I must stoke the fire and help the young’uns learn their letters while still ensuring that the older kids are pushing through their geometry problems. In short, I must be sensitive to individual ability and opportunity but in a fairly uniform environment.

A similar principle seems to underlie successful video game design, in that games are typically aimed at cultivating individual interests and abilities, focusing on self-paced success and exploration. Games with mass appeal create a single world in which noobs can progress in their dawdling way while hard core gamers leap along, experiencing facets of play of which novices might never even become aware. In short, it is the layers of possibilities for individuals — of both reward and frustration — that allow one and the same gaming experience to be appropriately challenging and satisfying to a wide range of players. Such game design is possible only because no one is pretending that players will, should, or could leave with the same “results” or rewards; certainly, the success of the game does not depend on all players gleaning the same “benefits.”

By contrast, the notion persists that college classrooms can and should aim for the same reproducible outcome for each student, though this goal has perhaps never been more elusive at non-selective publics. And, though, of course it has always been the case that individual learners’ outcomes vary wildly, universities have also continued to prioritize assessment methods that treat our classes functionally and our students as interchangeable variables. The professor’s success continues, by and large, to be measured by the degree to which she impacts students across a narrow set of uniform assessment goals/outcomes despite the fact that professors at open access publics are increasingly being called upon to facilitate one-room schoolhouses.

Instead of continuing to pretend that there is one definition of college-readiness and a singular college experience, we would be better off acknowledging that, by and large, many of our college classes are, at best, like Super Mario Odyssey, a game that attracts and entertains a remarkable gamut of players, from small children, to bored subway commuters, to deadly serious gamers. A casual player with sluggish reflexes might while away many satisfying hours, exploring here, butt stomping there, but unlocking only a tiny fraction of the game’s secrets and leaving many of its rewards unclaimed. In a way, it may not even make sense to say that the noob and the skilled gamer are playing the “same game” though they are operating in the same facilitated virtual space.

To be sure, I am appalled that our public education system has been so stratified along economic class lines for so long that it is a simple fact that lots of students arrive at college not at all what we like to call “college ready.” But even as we fight for saner, more egalitarian K-12 public education policies, we must deal with the astonishing mix of abilities, motivations, and resources streaming into our college classrooms. After all, our universities have a pretty good idea what these students’ capabilities are and have accepted their tuition payments, invited them in, and made lots of promises. Rather than wringing our hands over the impossibility of teaching across such a broad range of ability, maybe we can imagine new ways for Mario to progress, whether he bounds, rolls or crawls. The reality is that, whether I like it or not, I have been charged with lighting the wood stove, clapping the erasers, and preparing to die again and again and again.

Maybe it’s healthy to be ambivalent about online education

As I grow older, I’m better able to accept that living well requires making choices between imperfect alternatives. This more pragmatic orientation also feels more mature — think of a toddler who refuses any treat that falls short of ideal — and it also helps me appreciate how I’ve misused ambivalence in the past. As valuable and unavoidable as some ambivalence is, I now see that some of what I’d attributed to admirable, intellectually honest uncertainty probably had more to do with fear.

Of course there are different kinds of ambivalence and some matter more than others. For example, because I’m merely a coffee addict and not a connoisseur, when offered the choice between light or dark roast, I usually say “whichever’s freshest.” I’ve learned to say this rather than admit I don’t care because a bald expression of ambivalence can paralyze the cafe staff. Because they know and care about coffee, such naked ambivalence must seem irresponsible or disingenuous. “How can you not care?” they must be thinking.

Ambivalence like this is pretty trivial unless the choice is thought to be expressive or constitutive of one’s identity, e.g., “I’m the kind of person who only wears black.” This is a kind of lifestyle identity politics based on allying oneself with this kind of music, or clothing style, or football team rather than that one. When identity is, implicitly or explicitly, thought to be at issue, then too much ambivalence can seem like a wishy-washy abdication of one’s very self.

Before I uneasily embraced online education, I was swirling in ambivalence that I couldn’t fully articulate. I was, in fact, more likely to voice my really substantive (ethical, political, social) misgivings about it than my more mundane concerns. In retrospect, though, I see that my merely practical worries drove my aversion to online teaching at least as much as my deeper misgivings: Would I be overwhelmed by the amount of work? Was I too set in my ways to master the technology? How would I meaningfully connect with students without the crutch of my charismatic schtick?

My ambivalence about the substantive issues hasn’t really changed: I am still as deeply troubled as ever by how online education enables an increasingly corporatist higher ed, even as it provides invaluable access for some students. I still hate that I am contributing to a more impersonal, interchangeably modular version of education, even as I am proud of my new efforts to engage with students in this flexible, open-ended virtual space.

My ambivalence is genuine and important, and I live with the tension of it as I more or less happily go about my online work. It is a low-grade discomfort that informs my choices and practices but does not disable me. Clearly, I did not need to wait until I had moved past my ambivalence to embrace online teaching, but nor did I need to pretend that those mixed feelings had been resolved. In fact, I think my ethical discomfort is healthy and points to problems within higher ed, a system with failings that, though I am implicated in them, also need to be reckoned with. It would be a disservice to my integrity and to my vocation if I were to paint my criticisms pink and become a mere cheerleader for online education.

On the other hand, I wonder where I would be headed had I remained aloof from online ed out of respect for my supposedly noble ambivalence. I am reminded of a former senior colleague who, in the early days of email, proudly refused to use it. He had all sorts of important, and probably legitimate, gripes: It was too impersonal, too ambiguous, too informal, and so on. But it was evident that his aversion was also rooted in his fear of being unable to master this new game, and being an anti-email crank came to define him. I’ve always hoped that his righteous confidence turned out to be warm company, because as email continued its inexorable march, he became increasingly isolated from his students and colleagues.

Gamification: Seductive gold stars and pats on the back

In the third grade, I was rewarded for being the fastest to complete a series of long division problems on the blackboard. My prize, a Flintstones eraser, wasn’t even a good likeness of Dino, but I carried it with me for weeks. These days the reward I crave is the happy jingle from my iPad when I’ve completed the daily New York Times crossword. My awareness that I’m only sort of joking when I admit it’s my favorite song helps explain my ambivalence about incorporating similarly trivial rewards into my own classes. Frankly, it’s a little embarrassing to be so eager for such superficial affirmations.

Gamification, using elements of reward and friendly competition to encourage effort and engagement, is both simple and intuitively appealing. That it effectively lights fires — at least in some learners — is clear enough. Nudged onward by the promise of leveling up or of earning a virtual ribbon, we do sometimes perform more diligently and enthusiastically with these dangling carrots in sight. And so I created a badge icon for students who improve their quiz scores, one that automatically pops up on these users’ home pages. I plan to add consistency and perseverance badges as I seek more ways to exploit these easily implemented gamification strategies.
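
The release logic behind a badge like this is simple enough to be worth spelling out. Here is a minimal sketch in generic Python, not the interface of any particular learning management system, of the improvement rule I describe: the badge is released when a student's latest quiz score beats their previous best.

```python
# Toy sketch of a quiz-improvement badge rule. Generic Python for illustration,
# not the API of any actual learning management system.

from dataclasses import dataclass, field

@dataclass
class StudentRecord:
    name: str
    quiz_scores: list = field(default_factory=list)
    badges: set = field(default_factory=set)

def record_quiz(student: StudentRecord, score: float) -> None:
    """Record a new score; release the badge if it beats the student's previous best."""
    if student.quiz_scores and score > max(student.quiz_scores):
        student.badges.add("Quiz Improver")  # what "pops up" on the student's home page
    student.quiz_scores.append(score)

if __name__ == "__main__":
    s = StudentRecord("Pat")
    for score in (72, 68, 81):   # 81 beats the earlier best of 72, so the badge is released
        record_quiz(s, score)
    print(s.badges)              # {'Quiz Improver'}
```

Consistency and perseverance badges would simply swap in a different condition, say, a streak of on-time submissions, while the reward mechanism stays the same.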

I’ve become willing to experiment with such cheap tactics partly because of my own recent experience as an online student; I was surprised by the tiny thrills of satisfaction I came to anticipate as my badges appeared. And I suspect that gamification has a similarly primal effect, not only on millennial video gamers, but on many of us who earned prizes as children: for the number of books read, a class spelling bee, or a math club competition. But I also know that some experts caution against linking worthwhile activities to crass rewards, noting that, for example, children may no longer color for sheer enjoyment when prizes become part of the mix. While this consequence might not be so worrisome for straightforwardly “outcome-based” courses, it would be anathema for teachers intent on cultivating joyfully authentic life-practices such as close reading and thoughtful discussion.

So, even as I create the release conditions for my virtual badges, imagining my students’ pleasure at receiving them, I’m a little sheepish. Is this all just a tawdry gimmick? Am I trying to bribe these precious human companions with trivial ego boosts, coaxing them to learn material that, as it happens, actually has both intrinsic value and relevance to their lives? Am I reinforcing a consumerist, credentialist view of learning as merely extrinsically valuable, with grades and prizes to be collected in exchange for a diploma and job? They are urgent questions for me because I’ve never meant for my students merely, or even primarily, to learn “information” or discrete “skill sets” associated with my “content area.”

As I continue to explore using badges and other rewards, I remind myself that what I’m up to — leveraging behaviorist elements of learning without sacrificing the ethos of learning for its own sake — is a very old pedagogical conundrum. It certainly didn’t arise with online teaching, even if online modalities have made us more self-conscious about the perils and promises of gamification. In online classes, the affinity of gamification to electronic gaming becomes obvious. And, of course, we all know, or imagine we do, how addictive and empty that activity can be. But, again, some of my most enduring memories as an elementary school student in the ’70s, long before Super Mario or Minecraft, also involved “gamification.” And they are memories that, for better and worse, still bring me vibrations of shame and satisfaction.

As a child, I was motivated by the promise and fear of prizes awarded and withheld, but this probably also compromised my ability to take learning risks because I did not want to be a loser. Gamification, then, is complicated and fraught, and it occurs to me that I should use it more thoughtfully. What if, for example, I invited students to explicitly reflect upon their own perceived susceptibility or aversion to gold stars and pats on the back? Could gamification then become a tool for deeper self-reflection and whole-person development? After all, much of life occurs against a competitive backdrop, a humming swirl of conditional, often arbitrary, ego affirmations and insults. A little more awareness of what’s driving the quest for that promotion, that house, or that anti-wrinkle cream is probably not such a bad idea.