Covid-19 and the university: Professors are not Dorothy and the administration is not our Oz

Though the university is frequently characterized as a liberal hotbed, professors have always had to fight, sometimes even within our own ranks, for our right to speak up. This is especially so during times of national or global crisis when, predictably, efforts to silence supposed malcontents may reach a fever pitch. Even at universities, and even within the professoriate, our habitual pleas for academic freedom and the need to be robust critical thinkers may fade. What’s more, it’s not unusual for those asking difficult questions to be scolded, smoothing the way for administrative overreach and excess.

Critics should expect to encounter efforts to silence them — both subtle and gross — culminating in accusations of disloyalty, to the institution, to the nation, even to humanity itself. These may begin as a gentle form of ostracism where the critic is simply ignored, even by those who suspect, or know, that the warning is more than just someone crying wolf. This passive strategy of shunning may escalate into more overt shaming, with squeaky wheels being called out for betrayal of the common good. Perhaps because I am a gender studies professor, I can never hear such admonishments outside the framework of the silencing politics of sexual violence. Keep it to yourself, the victim may be urged, or the police will come and take daddy away.

Even basic questions of leadership competence and accountability may be automatically turned back on the critic, dismissed as potentially treasonous. When commanded to jump by a president, provost, or dean — some of whom until very recently were mere mortals, just professors like ourselves — otherwise staunch faculty advocates may now reflexively reply, “How high?” Obviously, this creates the perfect conditions for the most egregious forms of administrative overreach, especially when rumors are unleashed that employees will be lucky to have jobs come Fall. In the blink of an eye, proudly empowered members of the professoriate may be reduced to begging for scraps, perhaps volunteering to give back their salaries with no idea of what the financial exigencies actually are.

Too often, as a distraction during crisis times, difficult nuts-and-bolts conversations are bypassed, and, instead, we are urged by leaders to “take deep breaths” and “be grateful for what we’ve got.” In the service of compassion, privileged tenure-line faculty, who have relative job security, may especially be urged to make “sacrifices.” Such humanistic values are, of course, well and good, but quickly turn sour when used to paint those who persist in demanding institutional accountability, or even rudimentary shared governance, as crass or unspiritual.

Not incidentally, vague calls for sacrifice and compassion from the professoriate distract from the obvious and egregious economic disparities that we have long known exist between elite administrators and almost everyone else. Against this backdrop, the critically outspoken professor may still be painted as too privileged, naive, or narcissistic to appreciate the gravity of the situation. It is as if the horror of the fact that people are dying around the world — and that we all have a moral imperative to respond — somehow erases, rather than intensifies, our ongoing duty to think for ourselves and insist that our institution live up to its basic commitments, including to campus employees far more vulnerable than most professors.

Professors’ special responsibility to be critical thinkers and outspoken members of our campus communities — including on behalf of our staff employee colleagues — surely doesn’t end because we are in the midst of crisis, regardless of what paternalistic higher-ups or even terrorized colleagues may imply. If anything, the need for brave, questioning professorial voices is more urgent than ever and we must resist the temptation to glorify the authority or magical abilities of administrative colleagues as if we had suddenly been transformed into Dorothy and Toto, wandering haplessly in an unknown world.

As usual, there is a practical benefit to our continuing to behave as the flexible intellectuals, incisive social critics, and responsible, skeptical adults that we are. If we permit our fear to overtake us, and start behaving like dazed, frightened children, then we are inviting our presidents and provosts to function as decisive authoritarians, no matter how much (as is evidently the case) they may be flailing. Only with a collegial relationship based on mutual respect and fierce accountability can we both meet this crisis and make it more likely that, together — faculty, students, staff, and administration — we will thrive in the aftermath.

Pandemic 2020: Are universities treating the disease but killing the patient?

The virus in our midst is especially deadly for those with existing underlying health risks, though most of us who develop symptoms will eventually emerge relatively unscathed. So too, though nearly all U.S. colleges and universities are being touched by the pandemic, only some have already died or are languishing on life support. And while financial ruin may be what actually kills some, for far too many others, the true cause of death will be their failure to respond ethically and sustainably to the crisis rather than the crisis itself. Predictably, universities built on a genuine foundation of equity, respect, and sustainability are likely to survive and, eventually, to thrive, while those infused with hierarchy, secrecy, and reactivity began to teeter and crumble almost the very day that stay-at-home orders went into effect.

To take one example, my university has been at the national forefront with respect to cutting personnel and planning radical restructuring schemes, changes that may permanently reshape the university’s instructional and research capability, despite the unknown long term impact of the crisis. It was as if, one day, an abstract budget target number appeared in the sky, like the bat signal, with provosts and deans rushing to create and implement slash and burn plans that have included little or no substantive input from faculty, staff, or students. Whether my institution, Western Michigan University, is actually financially worse off than universities that have taken a more measured, holistic, stakeholder-based approach, I cannot say. As is (not incidentally) the case with many large, hierarchically run organizations, budgets are often so complicated and opaque that it can be impossible to separate fact from fiction, accuracy from exaggeration.

Though the actual underlying rationale for the budget targets may be murky, the impact on our campus community has been clear: early layoffs of hundreds, with, we are told, many more to come soon. In addition, we are seeing the early signs of marginalization, merger, or elimination of academic departments, according to factors that seem to have nothing to do with metrics of productivity or cost-benefit analyses. For example, my university has also been among the first in the nation to take steps to merge or eliminate its successful and cheap Gender and Women’s Studies Department (my tenure home) which focuses on LGBTQ youth, students of color, women, and sexual assault survivors. It is the sort of move that, though sadly unsurprising to those who do diversity work, has recently earned the condemnation of the National Women’s Studies Association for its apparent opportunism. NWSA chides universities “that are using the crisis to implement cuts they have been unable to make in the past because of faculty and student opposition and organizing.”

The true devastation of top-down ad hoc slash and burn policies isn’t primarily the intrinsic suffering they cause to campus constituents, especially to students, but the fact that this suffering is inflicted from above, for arbitrary or implausible reasons, and that it falls so disproportionately on the most vulnerable. Though, to be sure, elite administrators have agreed to symbolic cuts to their sometimes jaw-droppingly high salaries, their lives and livelihoods remain largely untouched. It is as if there had been a treaty signed by administrators from the outset that, whatever cuts and restructuring might occur, the university’s existing salary inequities, power structures, and habitual priorities must remain unchallenged. Rather than serving as an opportunity to reaffirm our commitments to compassion and equity, at too many organizations, this crisis is being leveraged to further erode such values.

It is not higher education’s financial crisis alone that will ultimately determine which universities will live and which will languish and eventually die. Consider, for example, two equally resourced families of equal size, faced with the same dire economic news. The patriarchal head of the first family sighs regretfully and immediately drowns two of his children in the well. Problem solved! Fewer mouths to feed! The second family, though, sits down together to study the situation. Are these numbers accurate? Is there any way to challenge the apparent facts of the new reality? If not, are there parts of the family budget that can be trimmed to free up more money for necessities? Because this second family’s approach is values-based, participatory, and deliberate, they are better off even if, ultimately, their shared suffering is great. For one thing, their children, though skinnier, will probably still be alive to help harvest the fields when, one day, the corn grows again. But also, having remained true to their core values, the heart and soul of this family remain intact.

Those who rush to drown their children, or otherwise sell their souls to the devil to address a crisis — no matter how apparently grave — have already lost the war against this latest scourge. When the dust clears, we may celebrate the buildings and privileged employees still left standing, but we will also speculate about what transpired behind closed doors and in secret meetings in exchange for such physical survival. In the long aftermath of catastrophe, we will ask: Who collaborated and who resisted? Who stood by and watched while others were pitched into the well? Such a climate of suspicion and resentment will stain our universities well into the future, undermining our will to inspire students or give our best to our disciplines or institutions. The problem universities are facing right now, then, is a financial one, to be sure, but it is just as much an ethical one. We are in the midst of a moral trial that invites us to peek behind the facade of classrooms, cafeterias and football stadiums to find out who we really are. And it does us not one bit of good if, in a frenzy to keep the patient’s body alive, we kill the very thing that makes that life worth preserving in the first place.

Pandemic 2020: Let the university hunger games begin!

Does anyone wax longer or louder about respect, transparency, diversity, and equality than university presidents, provosts, and deans? For decades, at commencements, convocations, retirement ceremonies, and ribbon cuttings, we have been serenaded by one misty-eyed official after another reminding us of the unutterably precious value of our unique voices. These are not just pretty words, we have long been assured, but values rooted deeply in the shared governance structures that underlie our universities in the form of faculty senates, collective bargaining units, and enough faculty committees to make our heads spin. Our universities, with their enlightened and compassionate leaders, their egalitarian and rational decision-making processes, are oases in the midst of the nation’s MAGA barbarity, right? Sure, we have our ethical challenges, but no one can question the basic decency of our institutions, can they? No wonder it has been a shock for many of us that the moment times got really tough, some of our universities set out to stage their very own hunger games.

The premise is simple enough: A powerful, centralized oligarchy forces subjects to “volunteer” for an elaborate killing game intended both to solidify dependence and obedience, and to entertain the elites. Not only are subjects compelled to send their children into these orchestrated killing fields year after year, but they are expected to do so willingly, to dress up, smile, and join in the festivities surrounding the games. They are required not only to surrender their lives, then, but their own consciences and voices of protest as well. As deadly as the games are, their larger purpose has more to do with killing people’s spirits than their bodies. Though I have read lots of dystopian novels, I was especially moved by this aspect of The Hunger Games when I finally got around to reading it a few months ago. I could not shake the image of otherwise proud people coerced by artificially induced scarcity into killing one another while pampered elites looked on, sipping champagne and placing bets on who would be left standing at the end.

I was primed by my reading of The Hunger Games, then, to pay special attention when my institution, Western Michigan University, began listing and picking off its “non-essential” employees just a few weeks into the pandemic crisis, the first of many devastating personnel decisions that have emerged since. Hundreds of “expendable” employees have now been laid off and hundreds more of us have been told to expect our marching orders in the coming weeks, according to lists that have already been compiled and are being scrutinized by other inner-circle administrators behind tightly closed doors. Carefully choreographed, stylized messaging from presidents, provosts, and deans insists that this is all necessary for the good of the whole, and that we must do our duty and somberly accept these edicts. After all, these decisions have not been easy. In fact, they have kept the president up at night and been heartbreaking for the deans. Can’t we see the terrible position they are in, under extraordinary pressure from even higher-ups, huddled in their private chambers, compiling human elimination lists to be shared with us when they’ve decided it’s the right time for us to know?

As with the hunger games of fiction, the damage here isn’t only to people’s lives and livelihoods, but to their hearts and minds. We, the remaining subjects of this newly authoritarian realm, are expected not just to live with whatever decisions spew forth from our “leaders,” but to get on board. In the spirit of shared sacrifice, we are expected to return as cheerleaders for our university in the Fall once the bodies of our faculty and staff colleagues have been cleared away. After all, didn’t the president and deans themselves accept voluntary pay cuts of five or ten percent? Well, no, those symbolically small cuts haven’t actually gone into effect yet but they will in a few months. You know, probably. Meanwhile, like the traumatized subjects of the twelve districts outside the pampered Capitol, we remaining university faculty and staff whisper among ourselves, knowing we should speak up, but terrified that it might be our own head next on the chopping block.

For example, though I belong to one of the most “protected” employee groups on campus, I assume that the letters of concern I sent recently to administrators have placed my career in even greater danger. After all, their decisions not even to acknowledge my messages were surely not intended to reassure me that my voice is still needed at this university, if, in fact, it ever was. And though I know, as we all do, that these administrators are, themselves, being pressed by even higher-level “bosses,” this does not erase their basic ethical responsibility to me and the other faculty and staff entrusted to their stewardship. Partly because so many professors routinely remind our students that “just following orders” is a poor excuse, we have a hard time buying this when it comes from our intelligent, remarkably well-compensated administrators.

It isn’t just those who have drunk the Kool-Aid who are now apologists for these clear-cutting sprees by administrators desperate to meet budget targets based on rationales from higher up so obscure that even they themselves may barely understand them. As is nearly always the case with systematic injustice, elite administrators must leverage longstanding inequities between employees to meet their goals. At universities, there is often a sort of petty bourgeoisie of middle managers who help rationalize elite excess and soften resistance from below. Such complicity and accommodationism are critical because they help obscure the fact that there is no actual necessity to the cruelty unfolding on our campuses. Our very real budget crises don’t require us to suddenly devolve into a Game of Thrones bloodbath. For example, my colleague, Charlie Kurth, describes a progressive furlough approach that could help us weather this situation and emerge with our fundamental social justice values even stronger than before. But try sharing these more progressive, compassionate, egalitarian strategies with your university administrators. Their responses, or lack of them, may be the quickest way possible to learn what, deep down, this horrific spectacle we’re being required to enact is really all about.

Professors in the pandemic: The painful truth about how much universities actually value teaching and learning

At the university where I work, the directives and decisions trickling from on high are dire and draconian. Even the best budget forecasts present a grim scenario. We must all sacrifice. The viability of our institution, and of higher education itself, depends on our ability to make anguishing choices now. I do not doubt the urgency of current circumstances, but when I talk to faculty colleagues at my university and across the nation, we’re asking the same question as always: When it comes time to hack and saw at university budgets, why do so many institutions fail so utterly to prioritize academics?

Because the academic function of higher education has faced amputations for years, faculty are now perfectly primed to ask: Why do supposedly non-essential extras — including unprofitable, wildly expensive Division I sports programs — seem always to rise higher on the safe list than the instructors, advisors, and support staff who make teaching and learning possible? University responses to the pandemic, including cuts to instructional staff, rub salt into a long-festering wound as, once again, athletic programs and administrative excess are mostly left off the table.

It should hardly be surprising that, in a nation that has long nursed anti-intellectual resentments, the academic portion of universities has been portrayed as the real drain on university budgets. After all, conservative extremism has managed to vilify public school teachers while celebrating greedy billionaires, so it’s hardly a challenge to scapegoat supposedly whiny, entitled professors. When times get tough, then, it has become quite natural for university administrations to penalize those closest to the academic mission. Of course, in addition to being steeped in the same anti-intellectual miasma that has gripped much of the nation for decades, administrations often face extraordinary pressure from football-loving conservative governing boards to “trim the fat.”

Amid all the apparently self-evident calls for sacrifice, how easy it is to forget that university science training and labs make it possible to study and treat disease. And that it is years of university study that have permitted us to model and predict epidemics, to use ventilators properly, to manage critical supply chains, to respond rationally to economic crises, and to rebuild urban and rural infrastructure. So too, our research and teaching help our society refine its understanding of social and political evils, for example, white nationalism, environmental racism, structural inequality, and the like. In addition, focused work in creative fields has expanded human sensitivity and imagination, helping us to envision innovative futures and to honestly and courageously face the human condition, in both its beauty and horror.

While many will applaud this laundry list of why universities matter, when it comes time for sacrifice, where will the knife actually fall? To quote a wise old friend: “The boyfriend who tells you he loves you, but treats you like an afterthought or burden, doesn’t love you.” As devastating as this pandemic is, then, it’s also an opportunity to revisit questions about core university values and priorities. And when we examine our institutions, let’s bypass their high-flown mission statements and elaborate strategic plans. Let’s ignore the pretty rhetoric of chancellors, presidents, provosts, and deans altogether. This is a terrible time in many respects, but it is the very, very best time to discover how much we’re actually worth to well-paid administrators who have been serenading us for years with assurances of how much we, and our departments, matter.

I do not think there is a single faculty member, advisor, or librarian who expects to be exempted from the consequences of this crisis. But we are also keenly aware of who has been marked as safe, and the order of those being pushed down the gangplank. Under cover of urgency, universities will, no doubt, succeed to some degree at fulfilling longstanding budgetary wishlists, e.g., reducing “academic bloat” through reorganization and elimination schemes they’ve fantasized about for years. Whatever happens next, though, may we never forget that we are seeing the truth that lies beyond the rhetoric. Each time you drive by your university’s two-million-dollar football scoreboard, remember that bad boyfriend, the one who insisted you were his sun and moon but could never manage to remember your birthday.

Plunging into Online Teaching: It’s not what I thought it would be

The first time I taught online was over a decade ago, when I got pulled in like a tug-of-war contestant dragged into a mud pit. A mid-career philosophy professor, I was a good teacher, a popular teacher, content with my pedagogical approach and buoyed by the energy of the face-to-face classroom.

I approached the challenge of online teaching like a translation problem: how to translate my existing course into a virtual one. Back then there weren’t many online education resources to save me from this error, but even if there had been, I doubt I would have paid much attention. My real weakness was that I didn’t fully get that my classroom teaching represented a particular modality, one with its own accidental logic and underlying values. I couldn’t fundamentally rethink my strategy — lecture, discuss, exam, repeat — because it all seemed too basic and fundamental to deeply question. It’s no surprise, then, that this first foray into the virtual classroom was less than successful. I left with my ego bruised, feeling bad for my students, and resentful that I’d been nudged into participating.

Fast forward and I am now deeply immersed in online teaching. Instead of fighting the waves and tightening my grip on long-standing pedagogical habits and commitments, I am beginning to relax into the unfamiliarity of it. I can accept, at least sometimes, that this is not merely a shadow version of being a “real professor,” but, rather, a fundamentally different enterprise. I had been like the traveler who cannot appreciate new vistas until she recognizes the biases she carries with her. I couldn’t see what online teaching had to offer until I could view my traditional teaching values and practices from a distance. At some point, I began to recognize my habitual way of teaching as involving particular, and changeable, assumptions, values, and strategies. I still hold onto some of my traditional ways, and there are others whose loss I will probably always mourn. But for all of that, I am moving forward.

I won’t sugarcoat this. My experiences with online teaching and my feelings about it are complicated. But the project of engaging with it is one that has transformed not just my teaching, but also my relationship to change itself. In ways I painstakingly explore in this blog, I am not only a better online teacher than I used to be, but I think I’m a better teacher period. Certainly, I am less ego-focused, less change-averse, and less nostalgic than I used to be. While I’m not an uncritical cheerleader for online education — I still rail against its worst tendencies — I have warmed to it enough so that it is working for me and my students. And even if I never taught another online class, I would still be enriched by having looked back on my pedagogical values and commitments from the shore of this new virtual land.

Mission critical thinking: Preparing students and ourselves for catastrophic times

Most liberal arts professors have known for years that the greatest good we can do for many of our students probably isn’t to immerse them in the advanced esoterica of our particular disciplines but to help develop their critical reading, writing and thinking skills. In the disastrous age of MAGA, I have begun to more fully appreciate this lesson: Part of my job is to help prepare students to locate and respond to catastrophic social, political, and ethical problems, only some of which we are now even able to fully imagine.

“Critical thinking,” that darling term we educators have been kissing and cuddling for decades, no longer cuts it in the face of the full horror and possibility of what we are collectively confronting. In past decades, “critical” has signaled ways of thinking, reading, and writing that occur from a questioning and investigative mode, a disinterested evaluation of facts, logical relationships between claims, and the biases of all concerned, including oneself. This is all to the good, especially the importance of challenging claims that happen to suit one’s preexisting expectations or preferences. Certainly, we would all be much better off if “critical thinking” of this sort could dislodge the irrational mob-think and craven consumerist claptrap that passes for much of current social and political discourse.

Teaching critical analysis as a narrowly cognitive exercise is evidently not enough, though. What we need is a reclamation of “critical” that is bolder, more dramatic, and far more socially and emotionally urgent than any we may have ever before used. In short, we must train students and ourselves to function as intellectual and psychological EMTs, prepared to move into the disaster zone with the skills, judgment, and nerve necessary for both triage and long-term, sustainable healing and repair. We need proactive, brave, pliable first responders who are also long-term strategic solution-seekers capable of evaluating and rearranging the big picture. The “critical thinking” values that must underlie our teaching work today are “critical” in the sense of “mission critical” and of “critical condition.” The symbol for this might include a pen and inkwell, but also a blood red armband and a sturdy multi-tool.

This more urgent, red-alert version of critical thinking obviously must include much of what has always mattered about this traditional skillset, including close reading, basic logic, the analysis of evidence, and evaluation of perspective. But it must place greater explicit emphasis on qualities of individual motivation, self-care and character development, including the cultivation of:
– a healthy combination of confidence, humility, self-efficacy and self-reflection
– an unwavering commitment to empathy and compassion that does not slide into paternalistic pity or overwhelmed quietism
– a bias toward positive, productive action in the service of deep communal values, including, for example, participatory democracy and racial equality
– an ability to make tough, real-world decisions in the face of incomplete information and general uncertainty
– the courage to go against the grain, to swim upstream from groupthink while still respecting the legitimate needs of the community

Even this cursory, general list serves as a cautionary guide for me: As a feminist philosopher, I have for decades emphasized a cognitively based, moderate notion of critical thinking that has reflected both a (perhaps naive) confidence in human reason and a (legitimate) concern about alienating students. I have, then, often ended up focusing on tweaking reading, writing and thinking skills, careful not to be “too normative” or “too directive” with respect to the social and emotional values surrounding these supposedly “neutral” cognitive standards. I haven’t avoided real-world issues — this would not even be possible in the courses I teach — but I have sometimes highlighted the intellectual “toolbox” aspect of critical thinking in order to sidestep the messier social and ethical facets that give cognitive values sense and power.

For better and worse, I know that I am not the only instructor who has been dancing carefully among the demanding arms of cognitive, emotional, social and ethical competence. Unfortunately, there is extraordinary pressure on professors to treat students like desperately needed, precious, fickle customers. Further, the long, determined march from tenured to contingent faculty has eroded the secure ground from which some faculty can be expected to engage in difficult dialogues. It is surely no accident that the academic freedom necessary to engage in authentically holistic critical thinking has been hacked away by conservative extremists at the very time it is most urgently needed. Regardless, we can no longer afford any semblance of the fantasy that liberal arts professors are debate coaches meant to lead students through “what if” puzzles to achieve oblique insights or incrementally improved logical skills. The most privileged of professors, surely, might at least rethink our relationship to “critical thinking.”

So, though I still push my students to wrestle constructively, directly and intellectually with texts — this humanistic work matters! — I engage with them in ever more practical, particular, personal, and socially urgent terms. And I am more prepared than ever to acknowledge my astonishing ignorance, because, like so many well-trained, smart professors, I have been caught off guard by the scale and doggedness of the retrograde cruelty and naked greed of conservative extremists. And so I commit as much to the pedagogical power of empathy, ethical sensitivity, and self-empowerment as to more specifically cognitive values. This isn’t a self-esteem based pedagogical gimmick, but, instead, a matter of necessity: It will take the empowered, compassionate, creative strategizing of all of us — young and old — to MacGyver our way out of this mess.

Super Mario in a one-room schoolhouse: The myth of a singular college experience

I have mastered my shield and sword and become familiar with the labyrinth. More confident than ever, I sneak up behind an ogre, weapon drawn. But in the split second before I strike, the creature steps backward, knocking me into a chasm I’d taken great care to sidestep. The fizzling, “game over” music that accompanies my death mocks me. I have been hacked, zapped, and crushed to death, and, each time, I have tried again, determined to complete this sequence. This time, though, I save and quit, eager to play something easier. But five minutes into the “relaxing” tedium of a new game in which I scoop up gems while summarily dispatching lethargic foes, I have had it. I have gone from feeling demoralized by the challenges of the first game to annoyed by the childish ease of the second.

My fickle petulance in the face of such shifting levels of challenge invites me to think about the critical role that “appropriate difficulty” has in creating satisfyingly rich learning experiences in general. Of course, successful video game designers have mastered the nuances of manipulating obstacles, rewards and pacing to create engaging challenges. They know how to offer guidance that does not devolve into hand-holding, and to provide small, consistent rewards along the way, such as new weapons or abilities. In short, they create a world in which patient hard work will be rewarded. Though they may sometimes be very difficult, these challenges still feel ultimately fair. Because conscientious video game designers must so closely consider individual user engagement, they can provide key insights for instructors and students of all sorts. How many of us have stewed in the frustration of classes that felt rudimentary and plodding? And haven’t we also been left floundering in our own stupidity by courses pitched too far over our heads?

As a professor at an increasingly open-access, mid-tier public university, calibrating difficulty is a task I find more daunting each year. While my strongest students’ level of preparation seems to be about the same as always, the college-readiness of everyone else is more and more of a mixed bag. My introductory classes are a motley blend of motivated readers, writers, and problem solvers alongside folks who lack basic skills, resources, and persistence. In recent years I have even begun thinking of myself as a plucky teacher in a one-room rural schoolhouse, charged with simultaneously facilitating grades K-12. I must stoke the fire and help the young’uns learn their letters while still ensuring that the older kids are pushing through their geometry problems. In short, I must be sensitive to individual ability and opportunity but within a fairly uniform environment.

The same principle seems to underlie successful video game design, which is typically aimed at cultivating individual interests and abilities, focusing on self-paced success and exploration. Games with mass appeal create a single world in which noobs can progress in their dawdling way while hard core gamers leap along, experiencing facets of play of which novices might never even become aware. In short, it is the layers of possibilities for individuals — of both reward and frustration — that allow one and the same gaming experience to be appropriately challenging and satisfying to a wide range of players. Such game design is possible only because no one is pretending that players will, should, or could leave with the same “results” or rewards; certainly, the success of the game does not depend on all players gleaning the same “benefits.”

By contrast, the notion persists that college classrooms can and should aim for the same reproducible outcome for each student, though this goal has perhaps never been more elusive at non-selective publics. And, though, of course, it has always been the case that individual learners’ outcomes vary wildly, universities have also continued to prioritize assessment methods that treat our classes functionally and our students as interchangeable variables. The professor’s success continues, by and large, to be measured by the degree to which she impacts students across a narrow set of uniform assessment goals/outcomes despite the fact that professors at open-access publics are increasingly being called upon to facilitate one-room schoolhouses.

Instead of continuing to pretend that there is one definition of college-readiness and a singular college experience, we would be better off acknowledging that, by and large, many of our college classes are, at best, like Super Mario Odyssey, a game that attracts and entertains a remarkable gamut of players, from small children, to bored subway commuters, to deadly serious gamers. A casual player with sluggish reflexes might while away many satisfying hours, exploring here, butt-stomping there, but unlocking only a tiny fraction of the game’s secrets and leaving many of its rewards unclaimed. In a way, it may not even make sense to say that the noob and the skilled gamer are playing the “same game,” though they are operating in the same facilitated virtual space.

To be sure, I am appalled that our public education system has been so stratified along economic class lines for so long that it is a simple fact that lots of students arrive at college not at all what we like to call “college ready.” But even as we fight for saner, more egalitarian K-12 public education policies, we must deal with the astonishing mix of abilities, motivations, and resources streaming into our college classrooms. After all, our universities have a pretty good idea what these students’ capabilities are and have accepted their tuition payments, invited them in, and made lots of promises. Rather than wringing our hands over the impossibility of teaching across such a broad range of ability, maybe we can imagine new ways for Mario to progress, whether he bounds, rolls or crawls. The reality is that, whether I like it or not, I have been charged with lighting the wood stove, clapping the erasers, and preparing to die again and again and again.

Maybe it’s healthy to be ambivalent about online education

As I grow older, I’m better able to accept that living well requires making choices between imperfect alternatives. This more pragmatic orientation also feels more mature — think of a toddler who refuses any treat that falls short of ideal — and it helps me appreciate how I’ve misused ambivalence in the past. As valuable and unavoidable as some ambivalence is, I now see that some of what I’d attributed to admirable, intellectually honest uncertainty probably had more to do with fear.

Of course there are different kinds of ambivalence and some matter more than others. For example, because I’m merely a coffee addict and not a connoisseur, when offered the choice between light or dark roast, I usually say “whichever’s freshest.” I’ve learned to say this rather than admit I don’t care because a bald expression of ambivalence can paralyze the cafe staff. Because they know and care about coffee, such naked ambivalence must seem irresponsible or disingenuous. “How can you not care?” they must be thinking.

Ambivalence like this is pretty trivial unless the choice is thought to be expressive or constitutive of one’s identity, e.g., “I’m the kind of person who only wears black.” This is a kind of lifestyle identity politics that’s based on allying oneself with this kind of music, or clothing style, or football team rather than that one. When identity is, implicitly or explicitly, thought to be at issue, then too much ambivalence can seem like a wishy-washy abdication of one’s very self.

Before I uneasily embraced online education, I was swirling in ambivalence that I couldn’t fully articulate. I was, in fact, more likely to voice my really substantive (ethical, political, social) misgivings about it than my more mundane concerns. In retrospect, though, I see that my merely practical worries drove my aversion to online teaching at least as much as my deeper misgivings: Would I be overwhelmed by the amount of work? Was I too set in my ways to master the technology? How would I meaningfully connect with students without the crutch of my charismatic schtick?

My ambivalence about the substantive issues hasn’t really changed: I am still as deeply troubled as ever by how online education enables an increasingly corporatist higher ed even as it provides invaluable access for some students. I still hate that I am contributing to a more impersonal, interchangeably modular version of education, even as I am proud of my new efforts to engage with students in this flexible, open-ended virtual space.

My ambivalence is genuine and important, and I live with the tension of it as I more or less happily go about my online work. It is a low-grade discomfort that informs my choices and practices but which does not disable me. Clearly, I did not need to wait until I had moved past my ambivalence to embrace online teaching, nor did I need to pretend that those mixed feelings had been resolved. In fact, I think my ethical discomfort is healthy and points to problems within higher ed, a system with failings that, though I am implicated in them, also need to be reckoned with. It would be a disservice to my integrity and to my vocation if I were to paint my criticisms pink and become a mere cheerleader for online education.

On the other hand, I wonder where I would be headed had I remained aloof from online ed out of respect for my supposedly noble ambivalence. I am reminded of a former senior colleague who, in the early days of email, proudly refused to use it. He had all sorts of important, and probably legitimate, gripes: It was too impersonal, too ambiguous, too informal, and so on. But it was evident that his aversion was also rooted in his fear of being unable to master this new game, and being an anti-email crank came to define him. I’ve always hoped that his righteous confidence turned out to be warm company, because as email continued its inexorable march, he became increasingly isolated from his students and colleagues.

Gamification: Seductive gold stars and pats on the back

In the third grade, I was rewarded for being the fastest to complete a series of long division problems on the blackboard. My prize, a Flintstones eraser, wasn’t even a good likeness of Dino, but I carried it with me for weeks. These days the reward I crave is the happy jingle from my iPad when I’ve completed the daily New York Times crossword. My awareness that I’m only sort of joking when I admit it’s my favorite song helps explain my ambivalence about incorporating similarly trivial rewards into my own classes. Frankly, it’s a little embarrassing to be so eager for such superficial affirmations.

Gamification, using elements of reward and friendly competition to encourage effort and engagement, is both simple and intuitively appealing. That it effectively lights fires — at least in some learners — is clear enough. Nudged onward by the promise of leveling up or of earning a virtual ribbon, we do sometimes perform more diligently and enthusiastically with these dangling carrots in sight. And so I created a badge icon for students who improve their quiz scores, one that automatically pops up on these users’ home pages. I plan to add consistency and perseverance badges as I seek more ways to exploit these easily implemented gamification strategies.
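
For readers curious what a “release condition” like mine actually amounts to, here is a minimal, purely hypothetical sketch in Python. The names are mine, invented for illustration, not anything built into my university’s learning platform: the improvement badge is simply released whenever a student’s newest quiz score beats the previous one.

```python
# Purely illustrative: a toy version of the "improvement badge" rule described
# above. The function and parameter names are hypothetical, not taken from any
# actual learning management system.

def improvement_badge_earned(quiz_scores: list[float]) -> bool:
    """Release the badge when the most recent quiz score improves on the previous one."""
    if len(quiz_scores) < 2:
        return False  # improvement can't be measured from a single quiz
    return quiz_scores[-1] > quiz_scores[-2]

# Example: a student who scores 6.5 and then 8.0 earns the badge.
print(improvement_badge_earned([6.5, 8.0]))  # True
print(improvement_badge_earned([8.0, 7.5]))  # False
```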


I’ve become willing to experiment with such cheap tactics partly because of my own recent experience as an online student; I was surprised by the tiny thrills of satisfaction I came to anticipate as my badges appeared. And I suspect that gamification has a similarly primal effect, not only on millennial video gamers, but on many of us who earned prizes as children: for the number of books read, a class spelling bee, or a math club competition. But I also know that some experts caution against linking worthwhile activities to crass rewards, noting that, for example, children may no longer color for sheer enjoyment when prizes become part of the mix. While this consequence might not be so worrisome for straightforwardly “outcome-based” courses, it would be anathema for teachers intent on cultivating joyfully authentic life-practices such as close reading and thoughtful discussion.

So, even as I create the release conditions for my virtual badges, imagining my students’ pleasure at receiving them, I’m a little sheepish. Is this all just a tawdry gimmick? Am I trying to bribe these precious human companions with trivial ego boosts, coaxing them to learn material that, as it happens, actually has both intrinsic value and relevance to their lives? Am I reinforcing a consumerist, credentialist view of learning as merely extrinsically valuable, with grades and prizes to be collected in exchange for a diploma and job? They are urgent questions for me because I’ve never meant for my students merely, or even primarily, to learn “information” or discrete “skill sets” associated with my “content area.”

As I continue to explore using badges and other rewards, I remind myself that what I’m up to — leveraging behaviorist elements of learning without sacrificing the ethos of learning for its own sake — is a very old pedagogical conundrum. It certainly didn’t arise with online teaching, even if online modalities have made us more self-conscious about the perils and promises of gamification. In online classes, the affinity of gamification to electronic gaming becomes obvious. And, of course, we all know, or imagine we do, how addictive and empty that activity can be. But, again, some of my most enduring memories as an elementary school student in the ’70s, long before Super Mario or Minecraft, also involved “gamification.” And they are memories that, for better and worse, still bring me vibrations of shame and satisfaction.

As a child, I was motivated by the promise and fear of prizes awarded and withheld, but this probably also compromised my ability to take learning risks because I did not want to be a loser. Gamification, then, is complicated and fraught, and it occurs to me that I should use it more thoughtfully. What if, for example, I invited students to explicitly reflect upon their own perceived susceptibility or aversion to gold stars and pats on the back? Could gamification then become a tool for deeper self-reflection and whole-person development? After all, much of life occurs against a competitive backdrop, a humming swirl of conditional, often arbitrary, ego affirmations and insults. A little more awareness of what’s driving the quest for that promotion, that house, or that anti-wrinkle cream is probably not such a bad idea.

Claiming the right to make beauty: Inspiration, motivation, and basic worthiness

Like lots of the kids around me in my humble Midwestern elementary school, I started playing a band instrument just because. Because the instruments were shiny and mysterious and because it meant being singled out as special three days a week to converge in the lunchroom for a cacophonous 45 minutes. I chose the trumpet because it seemed a magnificent luxury, like something from Cinderella, and because my brother had started playing one a few years before, so I figured my parents had to say yes to me too.

Just to be perfectly clear, I chose neither band nor this particular instrument because I loved music or the sound of brass. In fact, all the way through high school, I continued to plug diffidently away at the trumpet as if it were any other task, like making my bed or mowing the lawn. At no point — neither in practice at home nor in public concerts — do I ever recall being moved by the actual experience of making music. Instead, I played out of habit and because it was something I’d agreed to do, giving it just enough time and energy to avoid totally embarrassing myself.

I ponder this now, because here in the throes of middle age, I have picked up the trumpet once again. It’s a used student model, very much like the one I had decades ago, cold heavy brass that is both strange and familiar in my adult hands. The scent of valve oil and the chill circle of the mouthpiece against my (still) slightly crooked front teeth propel me backwards in time, reminding me that I am both the same and different from the kid who once ran the chromatic scale with such habitual mediocrity.

Shockingly, after just a few months, I find that, in one important sense, I’m already playing better than I ever did as a distracted kid. Adult-me, it seems, is motivated by an actual desire to make actual music. Though I rarely have an audience, I find myself making an effort to play with heart, drawn to the promise of making beauty with my mouth, breath and hands. The irony is that, having fully embraced the low stakes amateurism of playing the trumpet late in life, I am actually getting good at it, at least by my admittedly low standards. And I know this is because playing has become more about creating meaning than about merely mastering a skill set in order to operate a shiny machine.

My childhood failure to connect to the music-making aspect of playing the trumpet was, no doubt, due partly to a relative lack of cultural or artistic appreciation in my working-class home. Like most of the kids around me, I grew up almost completely incapable of taking my creative potential seriously. It pretty much never occurred to me that I might be able to make beautiful music or art, because I simply could not fathom being special or worthy enough to approach these rarefied realms. Journalism? Maybe. Poetry? Never. Why open myself to ridicule, then, by exerting steady and sincere effort to achieve something so impossibly far out of reach?

I am left now with an incisive pedagogical lesson that I suspect most everyone else already knew: In many subject areas, especially those associated however obliquely with high culture, U.S. working-class kids may never make it out of the starting gate. After all, the admission price for even the bare possibility of genuine learning is a basic sense of one’s own belonging in the grand humanistic scheme of things. And how can those who cannot take themselves seriously as potential cultural creators ever embrace the requisite vulnerability? We must feel sure enough that we belong to throw ourselves into it again and again, failing spectacularly, without being overwhelmed by imposter syndrome or falling into what Tara Brach calls the “trance of unworthiness.”

In short, it’s pretty clear that great pedagogical potential is unleashed when we plug into our own sense of cultural self-worth. Though the energy that flows from such cultivated aesthetic self-regard may be no more magical or mysterious than electricity, it can be just as transformative. It can mean the difference between a lifetime of stepping self-consciously and disjointedly from one note to another and one spent making bona fide music. Permission to take oneself seriously as a human creator, then, can nudge the sidelined outsider into the heart of the ballroom, into the chaotic dance with the muses that has long nourished the human soul.