Pandemic 2020: Let the university hunger games begin!

Does anyone wax longer or louder about respect, transparency, diversity, and equality than university presidents, provosts, and deans? For decades, at commencements, convocations, retirement ceremonies, and ribbon cuttings, we have been serenaded by one misty-eyed official after another reminding us of the unutterably precious value of our unique voices. These are not just pretty words, we have long been assured, but values rooted deeply in the shared governance structures that underlie our universities in the form of faculty senates, collective bargaining units, and enough faculty committees to make our heads spin. Our universities, with their enlightened and compassionate leaders, their egalitarian and rational decision-making processes, are oases in the midst of the nation’s MAGA barbarity, right? Sure, we have our ethical challenges, but no one can question the basic decency of our institutions, can they? No wonder it has been a shock for many of us that the moment times got really tough, some of our universities set out to stage their very own hunger games.

The premise is simple enough: A powerful, centralized oligarchy forces subjects to “volunteer” for an elaborate killing game intended both to solidify dependence and obedience, and to entertain the elites. Not only are subjects compelled to send their children into these orchestrated killing fields year after year, but they are expected to do so willingly, to dress up, smile, and join in the festivities surrounding the games. They are required not only to surrender their lives, then, but their own consciences and voices of protest as well. As deadly as the games are, their larger purpose has more to do with killing people’s spirits than their bodies. Though I have read lots of dystopian novels, I was especially moved by this aspect of The Hunger Games when I finally got around to reading it a few months ago. I could not shake the image of otherwise proud people coerced by artificially induced scarcity into killing one another while pampered elites looked on, sipping champagne and placing bets on who would be left standing at the end.

I was primed by my reading of The Hunger Games, then, to pay special attention when my institution, Western Michigan University, began listing and picking off its “non-essential” employees just a few weeks into the pandemic crisis, the first of many devastating personnel decisions that have emerged since. Hundreds of “expendable” employees have now been laid off, and hundreds more have been told to expect their marching orders in the coming weeks, according to lists that have already been compiled and are being scrutinized by inner-circle administrators behind tightly closed doors. Carefully choreographed, stylized messaging from presidents, provosts, and deans insists that this is all necessary for the good of the whole, and that we must do our duty and somberly accept these edicts. After all, these decisions have not been easy. In fact, they have kept the president up at night and been heartbreaking for the deans. Can’t we see the terrible position they are in, under extraordinary pressure from even higher ups, huddled in their private chambers, compiling human elimination lists to be shared with us when they’ve decided it’s the right time for us to know?

As with the hunger games of fiction, the damage here isn’t only to people’s lives and livelihoods, but to their hearts and minds. We, the remaining subjects of this newly authoritarian realm, are expected not just to live with whatever decisions spew forth from our “leaders,” but to get on board. In the spirit of shared sacrifice, we are expected to return as cheerleaders for our university in the fall once the bodies of our faculty and staff colleagues have been cleared away. After all, didn’t the president and deans themselves accept voluntary pay cuts of five or ten percent? Well, no, those symbolically small cuts haven’t actually gone into effect yet but they will in a few months. You know, probably. Meanwhile, like the traumatized subjects of the twelve districts outside the pampered Capitol, we remaining university faculty and staff whisper among ourselves, knowing we should speak up, but terrified that it might be our own head next on the chopping block.

For example, though I belong to one of the most “protected” employee groups on campus, I assume that the letters of concern I sent recently to administrators have placed my career in even greater danger. After all, their decisions not even to acknowledge my messages were surely not intended to reassure me that my voice is still needed at this university, if, in fact, it ever was. And though I know, as we all do, that these administrators are themselves being pressed by even higher level “bosses,” this does not erase their basic ethical responsibility to me and the other faculty and staff entrusted to their stewardship. Partly because so many professors routinely remind our students that “just following orders” is a poor excuse, we have a hard time buying this when it comes from our intelligent, remarkably well-compensated administrators.

It isn’t just those who have drunk the Kool-Aid who are now apologists for these clear-cutting sprees by administrators desperate to meet budget targets based on rationales from higher up so obscure that even they themselves may barely understand them. As is nearly always the case with systematic injustice, elite administrators must leverage longstanding inequities between employees to meet their goals. At universities, there is often a sort of petty bourgeoisie of middle managers who help rationalize elite excess and soften resistance from below. Such complicity and accommodationism are critical because they help obscure the fact that there is no actual necessity to the cruelty unfolding on our campuses. Our very real budget crises don’t require us to suddenly devolve into a Game of Thrones bloodbath. For example, my colleague, Charlie Kurth, describes a progressive furlough approach that could help us weather this situation and emerge even stronger in our fundamental social justice values than before. But try sharing these more progressive, compassionate, egalitarian strategies with your university administrators. Their responses, or lack of them, may be the quickest way possible to learn what, deep down, this horrific spectacle we’re being required to enact is really all about.

Professors in the pandemic: The painful truth about how much universities actually value teaching and learning

At the university where I work, the directives and decisions trickling from on high are dire and draconian. Even the best budget forecasts present a grim scenario. We must all sacrifice. The viability of our institution, and of higher education itself, depends on our ability to make anguishing choices now. I do not doubt the urgency of current circumstances, but when I talk to faculty colleagues at my university and across the nation, we’re asking the same question as always: When it comes time to hack and saw at university budgets, why do so many institutions fail so utterly to prioritize academics?

Because the academic function of higher education has faced amputations for years, faculty are now perfectly primed to ask: Why do supposedly non-essential extras — including unprofitable, wildly expensive Division I sports programs — always seem to rise higher on the safe list than the instructors, advisors, and support staff who make teaching and learning possible? University responses to the pandemic, including cuts to instructional staff, rub salt into a long-festering wound as, once again, athletic programs and administrative excess are mostly left off the table.

It should hardly be surprising that, in a nation that has long nursed anti-intellectual resentments, the academic portion of universities has been portrayed as the real drain on university budgets. After all, conservative extremism has managed to vilify public school teachers while celebrating greedy billionaires, so it’s hardly a challenge to scapegoat supposedly whiny, entitled professors. When times get tough, then, it has become quite natural for university administrations to penalize those closest to the academic mission. Of course, in addition to being steeped in the same anti-intellectual miasma that has gripped much of the nation for decades, administrations often face extraordinary pressure from football-loving conservative governing boards to “trim the fat.”

Amid all the apparently self-evident calls for sacrifice, how easy it is to forget that university science training and labs make it possible to study and treat disease. And that it is years of university study that have permitted us to model and predict epidemics, to use ventilators properly, to manage critical supply chains, to respond rationally to economic crises, and to rebuild urban and rural infrastructure. So too, our research and teaching help our society refine its understanding of social and political evils — for example, white nationalism, environmental racism, structural inequality, and the like. In addition, focused work in creative fields has expanded human sensitivity and imagination, helping us to envision innovative futures and to honestly and courageously face the human condition, in both its beauty and horror.

While many will applaud this laundry list of why universities matter, when it comes time for sacrifice, where will the knife actually fall? To quote a wise old friend: “The boyfriend who tells you he loves you, but treats you like an afterthought or burden, doesn’t love you.” As devastating as this pandemic is, then, it’s also an opportunity to revisit questions about core university values and priorities. And when we examine our institutions, let’s bypass their high-flown mission statements and elaborate strategic plans. Let’s ignore the pretty rhetoric of chancellors, presidents, provosts, and deans altogether. This is a terrible time in many respects, but it is the very best time to discover how much we’re actually worth to well-paid administrators who have been serenading us for years with assurances of how much we, and our departments, matter.

I do not think there is a single faculty member, advisor, or librarian who expects to be exempted from the consequences of this crisis. But we are also keenly aware of who has been marked as safe, and of the order of those being pushed down the gangplank. Under cover of urgency, universities will, no doubt, succeed to some degree at fulfilling longstanding budgetary wish lists, e.g., reducing “academic bloat” through reorganization and elimination schemes they’ve fantasized about for years. Whatever happens next, though, may we never forget that we are seeing the truth that lies beyond the rhetoric. Each time you drive by your university’s two-million-dollar football scoreboard, remember that bad boyfriend, the one who insisted you were his sun and moon but could never manage to remember your birthday.

Professors in the pandemic: Getting intimate with our fears about online education

When I originally began The Virtual Pedagogue some years ago, it was to explore my own ambivalence about teaching online. Though the circumstances were far less dramatic than the crisis we now face, my initial experience as an online teacher fifteen or so years ago was also rushed and born of necessity. Predictably, it left such a bad taste in my mouth that it wasn’t until many years later that I felt any inclination to dip my toes in those waters again. Happily, my more recent experiences were far more positive and, over the past five years, I’ve taught many of my courses online while also reflecting on my experience in papers like this, in workshops with colleagues, and here on The Virtual Pedagogue. With most instruction now being pushed online, this seems like a good time to reconsider issues I’ve been ruminating about for a while from my limited perspective as a tenured, mid-career liberal arts faculty member. Not surprisingly, most of my concerns have turned out to be reducible to fear, in one form or another, which does not, of course, make them any less legitimate.

The first fear is systemic. In fact, it is huge. It is that, in agreeing to teach online, we are participating in a fast-food model of education that enables crass corporatism and hastens the demise of our brick and mortar institutions. As I discuss in many places here on the VP, there is, undeniably, cause for concern, but I see it less as a function of the technological shift than of the extreme inequality shaping higher education in the U.S. To be sure, online education must not become the default modality for the poor while privileged students and faculty at elite institutions continue to hold debates in lovely ivory towers. The challenge is real and entrenched given that, for many vulnerable students, who may have multiple jobs, mental or physical disabilities, and child or elder care responsibilities, online classes are the only feasible access point to college. Though it may be tempting to identify online education as the culprit, then, the real enemy is even more daunting: structural barriers that fundamentally limit students’ options about the kind of educational experience they can have.

Especially for more senior faculty members like me, online educational technology itself can also be intimidating, especially given the proliferation of auxiliary bells and whistles that we may feel pressured to include in our classes. Many of us know what it’s like to have been brought to our knees by a computer program at some point — be it Quickbooks, Photoshop, or our university’s online advising system — and we may have little inclination to seek out more such demoralizing experiences. This may be especially true with respect to teaching which, for some of us, may be the one arena in which we feel utterly competent.

It is undoubtedly true that poorly utilized online technology can be clunky and unwieldy, serving to distract more than to enable learning. But if one focuses on the basics — and what this means will vary a lot from discipline to discipline — it is no more intrinsically difficult than other programs or apps that most of us routinely use, for example, while we shop, communicate with long-distance grandchildren, or download audiobooks from our public library. And though some learning discomfort is unavoidable, anyone who still refuses to engage with online technology at all — even to supplement their courses — is, at this point, more like that telephone-averse butler on Downton Abbey than a hero fighting for traditional education. As time and technology march inexorably onward, at some point one becomes less of a lovable curmudgeon and more of a cranky Luddite.

Perhaps the most insidious fear, and the one I explore most frequently here on The Virtual Pedagogue, is the threat that online teaching can represent to our deepest identities as competent, respected, valued professionals. Though it’s not something we professors usually like to admit, there can be tremendous ego satisfaction in traditional face-to-face classroom performance. After all, we have been assigned the featured role in a pedagogical drama, one that many of us have, over decades, honed to perfection. It is no wonder that many of us have come to relish and rely upon the adoring faces of students as they bask in our brilliance.

How often, when we extol the “fire,” “energy,” and “magic” of the classroom, might we actually be referring to the ego satisfaction that we ourselves derive from students’ attention and praise? I think this is not necessarily because we are shallow or narcissistic, but, rather, a perhaps inevitable consequence of engaging in this sort of intensely human labor. For many instructors, the physical university, with its hallowed halls and ivory towers, is a beloved backdrop that allows us to enact hard-won, lovingly cultivated identities that seem to require the nurturing attention of students. The loss of that sea of shining faces can feel like an erasure of our professorial identity altogether, as though we have been replaced by a mere machine.

While there are, of course, lots of good reasons for prioritizing face-to-face education — I will never write a love letter to online-only institutions — it is critically important to get deeply honest, especially with ourselves, about what, precisely, our fears and misgivings about online education are. This is especially urgent now that, for most of us, online teaching has suddenly become an unavoidable reality rather than a mere pedagogical possibility or abstraction. To be sure, some of our complaints about online education may turn out to reflect intrinsic weaknesses of the online modality itself, but some, surely, are based on other fears and anxieties.

How much of our discomfort about online education is really about our anger, fear and sorrow over economic injustice, anti-intellectualism, public disinvestment in higher education, and the radical communication shifts that have fundamentally reshaped human relationships and institutions? Whatever happens next in the development of universities’ relationship to online education — and this is a train that left the station long ago — faculty must be in the driver’s seat. But we cannot guide this process wisely and effectively if we are not relentlessly honest with ourselves about where our fears and misgivings about it lie.

Below are links to a few of the many posts on this site that explore questions about online education:
Are online classes the fast food of higher ed?

Are online teachers lazy sellouts?

Is anybody out there? The loneliness of the online teacher

Telling the truth about online education

The sweet ego boost of teaching face-to-face

Plunging into online teaching: It’s not what I thought it would be

Online teaching: The joy of tedious planning

Could online teaching be a path to enlightenment?

Plunging into Online Teaching: It’s not what I thought it would be

The first time I taught online was over a decade ago, when I was dragged in like a tug-of-war contestant pulled into a mud pit. A mid-career philosophy professor, I was a good teacher, a popular teacher, content with my pedagogical approach and buoyed by the energy of the face-to-face classroom.

I approached the challenge of online teaching like a translation problem: how to interpret my existing course into a virtual one. Back then there weren’t many online education resources to save me from this error, but even if there had been, I doubt I would have paid much attention. My real weakness was that I didn’t fully get that my classroom teaching represented a particular modality, one with its own accidental logic and underlying values. I couldn’t fundamentally rethink my strategy — lecture, discuss, exam, repeat — because it all seemed too basic and fundamental to deeply question. It’s no surprise, then, that this first foray into the virtual classroom was less than successful. I left with my ego bruised, feeling bad for my students, and resentful that I’d been nudged into participating.

Fast forward and I am now deeply immersed in online teaching. Instead of fighting the waves, and tightening my grip on long-standing pedagogical habits and commitments, I am beginning to relax into the unfamiliarity of it. I can accept, at least sometimes, that this is not merely a shadow version of being a “real professor,” but, rather, a fundamentally different enterprise. I had been like the traveler unable to appreciate new vistas until she recognizes the biases she carries with her. I couldn’t see what online teaching had to offer until I could view my traditional teaching values and practices from a distance. At some point, I began to recognize my habitual way of teaching as involving particular, and changeable, assumptions, values and strategies. I still hold onto some of my traditional ways, and there are others whose loss I will probably always mourn. But for all of that, I am moving forward.

I won’t sugarcoat this. My experiences with online teaching and my feelings about it are complicated. But the project of engaging with it is one that has transformed not just my teaching, but also my relationship to change itself. In ways I painstakingly explore in this blog, I am not only a better online teacher than I used to be, but I think I’m a better teacher period. Certainly, I am less ego-focused, less change-averse, and less nostalgic than I used to be. While I’m not an uncritical cheerleader for online education — I still rail against its worst tendencies — I have warmed to it enough so that it is working for me and my students. And even if I never taught another online class, I would still be enriched from having looked back on my pedagogical values and commitments from the shore of this new virtual land.

Mission critical thinking: Preparing students and ourselves for catastrophic times

Most liberal arts professors have known for years that the greatest good we can do for many of our students probably isn’t to immerse them in the advanced esoterica of our particular disciplines but to help develop their critical reading, writing and thinking skills. In the disastrous age of MAGA, I have begun to more fully appreciate this lesson: Part of my job is to help prepare students to locate and respond to catastrophic social, political, and ethical problems, only some of which we are now even able to fully imagine.

“Critical thinking,” that darling term we educators have been kissing and cuddling for decades, no longer cuts it when we face the full horror and possibility of what we are collectively facing. In past decades, “critical” has signaled ways of thinking, reading, and writing that occur from a questioning and investigative mode, a disinterested evaluation of facts, logical relationships between claims, and the biases of all concerned, including oneself. This is all to the good, especially the importance of challenging claims that happen to suit one’s preexisting expectations or preferences. Certainly, we would all be much better off if “critical thinking” of this sort could dislodge the irrational mob-think and craven consumerist claptrap that passes for much of current social and political discourse.

Teaching critical analysis as a fairly narrowly cognitive approach is evidently not enough, though. What we need is a reclamation of “critical” that is bolder, more dramatic, and far more socially and emotionally urgent than any we may have ever before used. In short, we must train students and ourselves to function as intellectual and psychological EMTs, prepared to move into the disaster zone with the skills, judgment, and nerve necessary for both triage and long-term, sustainable healing and repair. We need proactive, brave, pliable first responders who are also long-term strategic solution-seekers capable of evaluating and rearranging the big picture. The “critical thinking” values that must underlie our teaching work today are “critical” in the sense of “mission critical” and of “critical condition.” The symbol for this might include a pen and inkwell, but also a blood red armband and a sturdy multi-tool.

This more urgent, red-alert version of critical thinking obviously must include much of what has always mattered about this traditional skillset, including close reading, basic logic, the analysis of evidence, and evaluation of perspective. But it must place greater explicit emphasis on qualities of individual motivation, self-care and character development, including the cultivation of:
– a healthy combination of confidence, humility, self-efficacy and self-reflection
– an unwavering commitment to empathy and compassion that does not slide into paternalistic pity or overwhelmed quietism
– a bias toward positive, productive action in the service of deep communal values, including for example, participatory democracy and racial equality
– an ability to make tough, real-world decisions in the face of incomplete information and general uncertainty
– the courage to go against the grain, to swim upstream from groupthink while still respecting the legitimate needs of the community

Even this cursory, general list serves as a cautionary guide for me: As a feminist philosopher, I have for decades emphasized a cognitively based, moderate notion of critical thinking that has reflected both a (perhaps naive) confidence in human reason and a (legitimate) concern about alienating students. I have, then, often ended up focusing on tweaking reading, writing and thinking skills, careful not to be “too normative” or “too directive” with respect to the social and emotional values surrounding these supposedly “neutral” cognitive standards. I haven’t avoided real world issues — this would not even be possible in the courses I teach — but I have sometimes highlighted the intellectual “toolbox” aspect of critical thinking in order to sidestep the messier social and ethical facets that give cognitive values sense and power.

For better and worse, I know that I am not the only instructor who has been dancing carefully among the demanding arms of cognitive, emotional, social and ethical competence. Unfortunately, there is extraordinary pressure on professors to treat students like desperately needed, precious, fickle customers. Further, the long, determined march from tenured to contingent faculty has eroded the secure ground from which some faculty could be expected to engage in difficult dialogues. It is surely no accident that the academic freedom necessary to engage in authentically holistic critical thinking has been hacked away by conservative extremists at the very time it is most urgently needed. Regardless, we can no longer afford any semblance of the fantasy that liberal arts professors are debate coaches meant to lead students through “what if” puzzles to achieve oblique insights or incrementally improved logical skills. The most privileged of professors, at least, surely must rethink our relationship to “critical thinking.”

So, though I still push my students to wrestle constructively, directly and intellectually with texts — this humanistic work matters! — I engage with them in ever more practical, particular, personal, and socially urgent terms. And I am more prepared than ever to acknowledge my astonishing ignorance, because, like so many well-trained, smart professors, I have been caught off guard by the scale and doggedness of the retrograde cruelty and naked greed of conservative extremists. And so I commit as much to the pedagogical power of empathy, ethical sensitivity, and self-empowerment as to more specifically cognitive values. This isn’t a self-esteem based pedagogical gimmick, but, instead, a matter of necessity: It will take the empowered, compassionate, creative strategizing of all of us — young and old — to MacGyver our way out of this mess.

Can we learn something from our excuses for not meditating?

Partly because I sometimes write and teach about Buddhism and mindfulness, people are inclined to tell me about their experiments with meditation. And it almost always begins with “I’m really bad at it,” or, “My mind just won’t stop,” or, “I tried but I just can’t sit still.” Almost always they volunteer rationalizations that feature guilt, and that also imply that they themselves are almost uniquely unsuited to the practice because they are so freakishly impatient and busy-headed.

And while they may be claiming to be especially bad at meditation, it’s still an assertion of specialness, and one that may have special appeal for academics. Many professors, after all, adore thinking, and so being bad at meditation can become a kind of boast, proof of one’s insatiable tendency to critically assess. It’s a rationalization, then, that can help shore up one’s mundane, ego-based identity story — a self-understanding that includes personality and profession — the very tale that a consistent meditation practice might eventually lead one to scrutinize.

To be fair, we Western academics also operate in a broader societal context that encourages and prizes constant busyness and endless mental chatter. It will probably surprise no one, then, that Buddhist meditation was long described by Western critics as a form of escapism for lazy quietists. In a capitalist, rationalist milieu that places a premium on constant mental and physical “productivity,” what can it mean to be a faithful meditator except that one is content to sit on one’s ass and zone out? To supply reasons why one doesn’t meditate, then, may function both as a quintessentially intellectualist badge of honor and an implicit endorsement of American capitalist virtues.

Although I disagree (of course) with the tired, colonialist caricatures of Buddhism, I’m not here to sell meditation either. In fact, outside of classrooms explicitly featuring the topic, it’s something I hardly ever discuss. I find that sitting meditation supports my own sense of peace, efficacy, and well being. But partly as a result of meditation, I’ve become unwilling to assert that this is true for others. I notice, though, that many non-meditators themselves describe meditation as something they should be doing, which makes their excuses for avoiding it stand out in sharper relief. What does it mean to offer rationalizations for not doing something that no one is monitoring and that one has no obligation to do? Our relationship to meditation, perhaps especially when we put energy into describing how we avoid it, turns out to be kind of interesting.

Could it be that the real action lies less in meditation itself than in learning to hear the stories we volunteer about why we do or don’t do this or that? After all, if there is a point to meditation, it is probably the promise of increased awareness that leads to greater peace, equanimity and self-knowledge. On this score, it is perhaps more important to become cognizant of the rationalizations we use to fortify our habitual identities — including that of being a “non-meditator” — than to meditate for the sake of being a good meditator. Paradoxically, though, meditation may well be the most efficient path for learning to actually hear the endless verbal storms that ravage our minds and often pour unbidden from our mouths, including, perhaps, the excuses we make for why we don’t meditate.

Super Mario in a one-room schoolhouse: The myth of a singular college experience

I have mastered my shield and sword and become familiar with the labyrinth. More confident than ever, I sneak up behind an ogre, weapon drawn. But in the split second before I strike, the creature steps backward, knocking me into a chasm I’d taken great care to sidestep. The fizzling “game over” music that accompanies my death mocks me. I have been hacked, zapped, and crushed to death, and, each time, I have tried again, determined to complete this sequence. This time, though, I save and quit, eager to play something easier. But five minutes into the “relaxing” tedium of a new game in which I scoop up gems while summarily dispatching lethargic foes, I have had it. I have gone from feeling demoralized by the challenges of the first game to annoyed by the childish ease of the second.

My fickle petulance in the face of such shifting levels of challenge invites me to think about the critical role that “appropriate difficulty” plays in creating satisfyingly rich learning experiences in general. Of course, successful video game designers have mastered the nuances of manipulating obstacles, rewards and pacing to create engaging challenges. They know how to offer guidance that does not devolve into handholding, and small, consistent rewards along the way, such as new weapons or abilities. In short, they create a world in which patient hard work will be rewarded. Though they may sometimes be very difficult, these challenges still feel ultimately fair. Because conscientious video game designers must so closely consider individual user engagement, they can provide key insights for instructors and students of all sorts. How many of us have stewed in the frustration of classes that felt rudimentary and plodding? And haven’t we also been left floundering in our own stupidity by courses pitched too far over our heads?

As a professor at an increasingly open access, mid-tier public university, calibrating difficulty is a task I find more daunting each year. While my strongest students’ level of preparation seems to be about the same as always, the college-readiness of everyone else is more and more of a mixed bag. My introductory classes are a motley blend of motivated readers, writers, and problem solvers combined with folks who lack basic skills, resources, and persistence. In recent years I have even begun thinking of myself as a plucky teacher in a one-room rural schoolhouse, charged with simultaneously facilitating grades K-12. I must stoke the fire and help the young’uns learn their letters while still ensuring that the older kids are pushing through their geometry problems. In short, I must be sensitive to individual ability and opportunity but in a fairly uniform environment.

It’s a principle that seems to underlie successful video game design as well: games are typically aimed at cultivating individual interests and abilities, focusing on self-paced success and exploration. Games with mass appeal create a single world in which noobs can progress in their dawdling way while hardcore gamers leap along, experiencing facets of play of which novices might never even become aware. In short, it is the layers of possibilities for individuals — of both reward and frustration — that allow one and the same gaming experience to be appropriately challenging and satisfying to a wide range of players. Such game design is possible only because no one is pretending that players will, should, or could leave with the same “results” or rewards; certainly, the success of the game does not depend on all players gleaning the same “benefits.”

By contrast, the notion persists that college classrooms can and should aim for the same reproducible outcome for each student, though this goal has perhaps never been more elusive at non-selective publics. And, though, of course it has always been the case that individual learners’ outcomes vary wildly, universities have also continued to prioritize assessment methods that treat our classes functionally and our students as interchangeable variables. The professor’s success continues, by and large, to be measured by the degree to which she impacts students across a narrow set of uniform assessment goals/outcomes despite the fact that professors at open access publics are increasingly being called upon to facilitate one-room schoolhouses.

Instead of continuing to pretend that there is one definition of college-readiness and a singular college experience, we would be better off acknowledging that, by and large, many of our college classes are, at best, like Super Mario Odyssey, a game that attracts and entertains a remarkable gamut of players, from small children, to bored subway commuters, to deadly serious gamers. A casual player with sluggish reflexes might while away many satisfying hours, exploring here, butt stomping there, but unlocking only a tiny fraction of the game’s secrets and leaving many of its rewards unclaimed. In a way, it may not even make sense to say that the noob and the skilled gamer are playing the “same game” though they are operating in the same facilitated virtual space.

To be sure, I am appalled that our public education system has been so stratified along economic class lines for so long that it is a simple fact that lots of students arrive at college not at all what we like to call “college ready.” But even as we fight for saner, more egalitarian K-12 public education policies, we must deal with the astonishing mix of abilities, motivations, and resources streaming into our college classrooms. After all, our universities have a pretty good idea what these students’ capabilities are and have accepted their tuition payments, invited them in, and made lots of promises. Rather than wringing our hands over the impossibility of teaching across such a broad range of ability, maybe we can imagine new ways for Mario to progress, whether he bounds, rolls or crawls. The reality is that, whether I like it or not, I have been charged with lighting the wood stove, clapping the erasers, and preparing to die again and again and again.

Maybe it’s healthy to be ambivalent about online education

As I grow older, I’m better able to accept that living well requires making choices between imperfect alternatives. This more pragmatic orientation also feels more mature — think of a toddler who refuses any treat that falls short of ideal — and it also helps me appreciate how I’ve misused ambivalence in the past. As valuable and unavoidable as some ambivalence is, I now see that some of what I’d attributed to admirable, intellectually honest uncertainty probably had more to do with fear.

Of course there are different kinds of ambivalence and some matter more than others. For example, because I’m merely a coffee addict and not a connoisseur, when offered the choice between light or dark roast, I usually say “whichever’s freshest.” I’ve learned to say this rather than admit I don’t care because a bald expression of ambivalence can paralyze the cafe staff. Because they know and care about coffee, such naked ambivalence must seem irresponsible or disingenuous. “How can you not care?” they must be thinking.


Ambivalence like this is pretty trivial unless the choice is thought to be expressive or constitutive of one’s identity, i.e., “I’m the kind of person who only wears black.” This is a kind of lifestyle identity politics that’s based on allying oneself with this kind of music, or clothing style, or football team rather than that one. When identity is, implicitly or explicitly, thought to be at issue then too much ambivalence can seem like a wishy-washy abdication of one’s very self.

Before I uneasily embraced online education, I was swirling in ambivalence that I couldn’t fully articulate. I was, in fact, more likely to voice my really substantive (ethical, political, social) misgivings about it than my more mundane concerns. In retrospect, though, I see that my merely practical worries drove my aversion to online teaching at least as much as my deeper misgivings: Would I be overwhelmed by the amount of work? Was I too set in my ways to master the technology? How would I meaningfully connect with students without the crutch of my charismatic schtick?


My ambivalence about the substantive issues hasn’t really changed: I am as deeply troubled as ever by how online education enables an increasingly corporatist higher ed even as it provides invaluable access for some students. I still hate that I am contributing to a more impersonal, interchangeably modular version of education, even as I am proud of my new efforts to engage with students in this flexible, open-ended virtual space.

My ambivalence is genuine and important, and I live with the tension of it as I more or less happily go about my online work. It is a low-grade discomfort that informs my choices and practices but which does not disable me. Clearly, I did not need to wait until I had moved past my ambivalence to embrace online teaching, but neither did I need to pretend that those mixed feelings had been resolved. In fact, I think my ethical discomfort is healthy and points to problems within higher ed, a system with failings that, though I am implicated in them, also need to be reckoned with. It would be a disservice to my integrity and to my vocation if I were to paint my criticisms pink and become a mere cheerleader for online education.

On the other hand, I wonder where I would be headed had I remained aloof from online ed out of respect for my supposedly noble ambivalence. I am reminded of a former senior colleague who, in the early days of email, proudly refused to use it. He had all sorts of important, and probably legitimate, gripes: It was too impersonal, too ambiguous, too informal, and so on. But it was evident that his aversion was also rooted in his fear of being unable to master this new game, and being an anti-email crank came to define him. I’ve always hoped that his righteous confidence turned out to be warm company, because as email continued its inexorable march, he became increasingly isolated from his students and colleagues.

Gamification: Seductive gold stars and pats on the back

In the third grade, I was rewarded for being the fastest to complete a series of long division problems on the blackboard. My prize, a Flintstones eraser, wasn’t even a good likeness of Dino, but I carried it with me for weeks. These days the reward I crave is the happy jingle from my iPad when I’ve completed the daily New York Times crossword. My awareness that I’m only sort of joking when I admit it’s my favorite song helps explain my ambivalence about incorporating similarly trivial rewards into my own classes. Frankly, it’s a little embarrassing to be so eager for such superficial affirmations.

Gamification, using elements of reward and friendly competition to encourage effort and engagement, is both simple and intuitively appealing. That it effectively lights fires — at least in some learners — is clear enough. Nudged onward by the promise of leveling up or of earning a virtual ribbon, we do sometimes perform more diligently and enthusiastically with these dangling carrots in sight. And so I created a badge icon for students who improve their quiz scores, one that automatically pops up on these users’ home pages. I plan to add consistency and perseverance badges as I seek more ways to exploit these easily implemented gamification strategies.


I’ve become willing to experiment with such cheap tactics partly because of my own recent experience as an online student; I was surprised by the tiny thrills of satisfaction I came to anticipate as my badges appeared. And I suspect that gamification has a similarly primal effect, not only on millennial video gamers, but on many of us who earned prizes as children: for the number of books read, a class spelling bee, or a math club competition. But I also know that some experts caution against linking worthwhile activities to crass rewards, noting that, for example, children may no longer color for sheer enjoyment when prizes become part of the mix. While this consequence might not be so worrisome for straightforwardly “outcome-based” courses, it would be anathema for teachers intent on cultivating joyfully authentic life-practices such as close reading and thoughtful discussion.

So, even as I create the release conditions for my virtual badges, imagining my students’ pleasure at receiving them, I’m a little sheepish. Is this all just a tawdry gimmick? Am I trying to bribe these precious human companions with trivial ego boosts, coaxing them to learn material that, as it happens, actually has both intrinsic value and relevance to their lives? Am I reinforcing a consumerist, credentialist view of learning as merely extrinsically valuable, with grades and prizes to be collected in exchange for a diploma and job? They are urgent questions for me because I’ve never meant for my students merely, or even primarily, to learn “information” or discrete “skill sets” associated with my “content area.”

As I continue to explore using badges and other rewards, I remind myself that what I’m up to — leveraging behaviorist elements of learning without sacrificing the ethos of learning for its own sake — is a very old pedagogical conundrum. It certainly didn’t arise with online teaching, even if online modalities have made us more self-conscious about the perils and promises of gamification. In online classes, the affinity of gamification to electronic gaming becomes obvious. And, of course, we all know, or imagine we do, how addictive and empty that activity can be. But, again, some of my most enduring memories as an elementary school student in the ’70s, long before Super Mario or Minecraft, also involved “gamification.” And they are memories that, for better and worse, still bring me vibrations of shame and satisfaction.

As a child, I was motivated by the promise and fear of prizes awarded and withheld, but this probably also compromised my ability to take learning risks because I did not want to be a loser. Gamification, then, is complicated and fraught, and it occurs to me that I should use it more thoughtfully. What if, for example, I invited students to explicitly reflect upon their own perceived susceptibility or aversion to gold stars and pats on the back? Could gamification then become a tool for deeper self-reflection and whole-person development? After all, much of life occurs against a competitive backdrop, a humming swirl of conditional, often arbitrary, ego affirmations and insults. A little more awareness of what’s driving the quest for that promotion, that house, or that anti-wrinkle cream is probably not such a bad idea.

Claiming the right to make beauty: Inspiration, motivation, and basic worthiness

Like lots of the kids around me in my humble Midwestern elementary school, I started playing a band instrument just because. Because the instruments were shiny and mysterious and because it meant being singled out as special three days a week to converge in the lunchroom for a cacophonous 45 minutes. I chose the trumpet because it seemed a magnificent luxury, like something from Cinderella, and because my brother had started playing one a few years before, so I figured my parents had to say yes to me too.

Just to be perfectly clear, I chose neither band nor this particular instrument because I loved music or the sound of brass. In fact, all the way through high school, I continued to plug diffidently away at the trumpet as if it were any other task, like making my bed or mowing the lawn. At no point — neither in practice at home nor at public concerts — do I ever recall being moved by the actual experience of making music. Instead, I played out of habit and because it was something I’d agreed to do, giving it just enough time and energy to avoid totally embarrassing myself.


I ponder this now, because here in the throes of middle age, I have picked up the trumpet once again. It’s a used student model, very much like the one I had decades ago, cold heavy brass that is both strange and familiar in my adult hands. The scent of valve oil and the chill circle of the mouthpiece against my (still) slightly crooked front teeth propel me backwards in time, reminding me that I am both the same and different from the kid who once ran the chromatic scale with such habitual mediocrity.

Shockingly, after just a few months, I find that, in one important sense, I’m already playing better than I ever did as a distracted kid. Adult-me, it seems, is motivated by an actual desire to make actual music. Though I rarely have an audience, I find myself making an effort to play with heart, drawn to the promise of making beauty with my mouth, breath and hands. The irony is that, having fully embraced the low stakes amateurism of playing the trumpet late in life, I am actually getting good at it, at least by my admittedly low standards. And I know this is because playing has become more about creating meaning than about merely mastering a skill set in order to operate a shiny machine.

My childhood failure to connect to the music-making aspect of playing the trumpet was, no doubt, due partly to a relative lack of cultural or artistic appreciation in my working class home. Like most of the kids around me, I grew up almost completely incapable of taking my creative potential seriously. It pretty much never occurred to me that I might be able to make beautiful music or art, because I simply could not fathom being special or worthy enough to approach these rarefied realms. Journalism? Maybe. Poetry? Never. Why open myself to ridicule, then, by exerting steady and sincere effort to achieve something so impossibly far out of reach?


I am left now with an incisive pedagogical lesson that I suspect most everyone else already knew: In many subject areas, especially those associated however obliquely with high culture, U.S. working class kids may never make it out of the starting gate. After all, the admission price for even the bare possibility of genuine learning is a basic sense of one’s own belonging in the grand humanistic scheme of things. And how can those who cannot take themselves seriously as potential cultural creators ever embrace the requisite vulnerability? We must feel sure enough that we belong to throw ourselves into it again and again, failing spectacularly, without being overwhelmed by imposter syndrome or falling into what Tara Brach calls the “trance of unworthiness.”

In short, it’s pretty clear that great pedagogical potential is unleashed when we plug into our own sense of cultural self-worth. Though the energy that flows from such cultivated aesthetic self-regard may be no more magical or mysterious than electricity, it can be just as transformative. It can mean the difference between a lifetime of stepping self-consciously and disjointedly from one note to another and one spent making bona fide music. Permission to take oneself seriously as a human creator, then, can nudge the sidelined outsider into the heart of the ballroom, into the chaotic dance with the muses that has long nourished the human soul.