Pandemic 2020: Are universities treating the disease but killing the patient?

The virus in our midst is especially deadly for those with existing underlying health risks, though most of us who develop symptoms will eventually emerge relatively unscathed. So too, though nearly all U.S. colleges and universities are being touched by the pandemic, only some have already died or are languishing on life support. And while financial ruin may be what actually kills some, for far too many others, the true cause of death will be their failure to respond ethically and sustainably to the crisis rather than the crisis itself. Predictably, universities built on a genuine foundation of equity, respect, and sustainability are likely to survive and, eventually, to thrive, while those infused with hierarchy, secrecy, and reactivity began to teeter and crumble almost the very day that stay-at-home orders went into effect.

To take one example, my university has been at the national forefront with respect to cutting personnel and planning radical restructuring schemes, changes that may permanently reshape the university’s instructional and research capability despite the unknown long-term impact of the crisis. It was as if, one day, an abstract budget target appeared in the sky, like the Bat-Signal, with provosts and deans rushing to create and implement slash-and-burn plans that have included little or no substantive input from faculty, staff, or students. Whether my institution, Western Michigan University, is actually financially worse off than universities that have taken a more measured, holistic, stakeholder-based approach, I cannot say. As is (not incidentally) the case with many large, hierarchically run organizations, budgets are often so complicated and opaque that it can be impossible to separate fact from fiction, accuracy from exaggeration.

Though the actual underlying rationale for the budget targets may be murky, the impact on our campus community has been clear: early layoffs of hundreds, with, we are told, many more to come soon. In addition, we are seeing the early signs of marginalization, merger, or elimination of academic departments, according to factors that seem to have nothing to do with metrics of productivity or cost-benefit analyses. For example, my university has also been among the first in the nation to take steps to merge or eliminate its successful and inexpensive Gender and Women’s Studies Department (my tenure home), which focuses on LGBTQ youth, students of color, women, and sexual assault survivors. It is the sort of move that, though sadly unsurprising to those who do diversity work, has recently earned the condemnation of the National Women’s Studies Association for its apparent opportunism. NWSA chides universities “that are using the crisis to implement cuts they have been unable to make in the past because of faculty and student opposition and organizing.”

The true devastation of top-down ad hoc slash and burn policies isn’t primarily the intrinsic suffering they cause to campus constituents, especially to students, but the fact that this suffering is inflicted from above, for arbitrary or implausible reasons, and that it falls so disproportionately on the most vulnerable. Though, to be sure, elite administrators have agreed to symbolic cuts to their sometimes jaw-droppingly high salaries, their lives and livelihoods remain largely untouched. It is as if there had been a treaty signed by administrators from the outset that, whatever cuts and restructuring might occur, the university’s existing salary inequities, power structures, and habitual priorities must remain unchallenged. Rather than serving as an opportunity to reaffirm our commitments to compassion and equity, at too many organizations, this crisis is being leveraged to further erode such values.

It is not higher education’s financial crisis alone that will ultimately determine which universities will live and which will languish and eventually die. Consider, for example, two equally resourced families of equal size, faced with the same dire economic news. The patriarchal head of the first family sighs regretfully and immediately drowns two of his children in the well. Problem solved! Fewer mouths to feed! The second family, though, sits down together to study the situation. Are these numbers accurate? Is there any way to challenge the apparent facts of the new reality? If not, are there parts of the family budget that can be trimmed to free up more money for necessities? Because this second family’s approach is values-based, participatory, and deliberate, they are better off even if, ultimately, their shared suffering is great. For one thing, their children, though skinnier, will probably still be alive to help harvest the fields when, one day, the corn grows again. But also, having remained true to their core values, the heart and soul of this family remain intact.

Those who rush to drown their children, or otherwise sell their souls to the devil to address a crisis — no matter how apparently grave — have already lost the war against this latest scourge. When the dust clears, we may celebrate the buildings and privileged employees still left standing, but we will also speculate about what transpired behind closed doors and in secret meetings in exchange for such physical survival. In the long aftermath of catastrophe, we will ask: Who collaborated and who resisted? Who stood by and watched while others were pitched into the well? Such a climate of suspicion and resentment will stain our universities well into the future, undermining our will to inspire students or give our best to our disciplines or institutions. The problem universities are facing right now, then, is a financial one, to be sure, but it is just as much an ethical one. We are in the midst of a moral trial that invites us to peek behind the facade of classrooms, cafeterias, and football stadiums to find out who we really are. And it does us not one bit of good if, in a frenzy to keep the patient’s body alive, we kill the very thing that makes that life worth preserving in the first place.

Entitled and out of touch: The danger of anti-professor stereotypes in the pandemic

The stereotype of university professors as entitled babies who are oblivious to the “real world” takes on new urgency as the pandemic rages. Encouraged for decades by well-funded conservative extremists, it’s become pretty standard for pundits and politicians to dismiss professors as spoiled, elitist, and selfish. Not surprisingly, it’s a stereotype that many university functionaries, including administrators, have accepted as well. Worse still, some professors have come to internalize it themselves and are thereby discouraged from asking questions about anything “administrative,” including apparently hasty top-down decisions that may bypass our contracts or cripple our institutions’ academic viability.

For decades, then, professors have been getting the message that they are barely tolerated by many in the state capitol, and by variously titled chairs, deans, provosts, and presidents who, increasingly, assert their own managerial identities by differentiating themselves from us. Faculty members who are occasionally privy to administrative conversations often express surprise and distaste at the degree to which supposed faculty obliviousness and incompetence feature in the discussion. It starts to seem as if many administrative types don’t merely believe anti-faculty stereotypes but also bond with one another over them. There is perhaps no more effective way for rookie administrators to perform their new bureaucratic identity than to join in the familiar banter about impractical, coddled, and lazy faculty.

In the midst of higher education’s pandemic response, then, is it any wonder so many university administrations plow ahead with critical decisions, making little effort to substantively collaborate with faculty? After all, haven’t professors exempted themselves from the right to participate by virtue of being self-exiled prima donnas who care far more about their arcane research than balance sheets or the public good? Is it any wonder that even those of us who are the object of these stereotypes may still feel shamed and silenced by them? “Maybe it’s true,” we may think. “Perhaps a professor of English (or geography or music or mathematics) has no business speaking up given the life and death urgency of the moment.”

Except, of course, that the dismissal of professors’ voices is mostly based on an impressive pile of half-truth and hooey. Yes, some small percentage of U.S. professors come from elite backgrounds, land plum positions, and go on to live and work in “splendid isolation from the world.” In most cases, though, professors are actual flesh-and-blood people. Often, we have taken on staggering student loan debt and struggled for years, working as waitresses, census takers and retail clerks in the increasingly desperate hope of snagging tenure-track positions at humble regional universities in Pennsylvania or Ohio or Kentucky.

When we join these institutions, we are required to fully immerse ourselves in increasingly bureaucratic university service, provide individual attention to understandably beleaguered students, and research and publish in our areas of academic expertise, many of which are not arcane in the least. We spend our workdays teaching, lobbying for critical research equipment, making cold calls to prospective students, working through piles of accreditation forms, and writing tons of student recommendation letters. This, mind you, is if we are one of the lucky ones. For the majority of instructors, who are adjuncts or otherwise undervalued academic laborers, work demands and anxieties are usually far greater.

Vanishingly few of us, then, ever catch a glimpse of anything resembling an ivory tower into which we might retreat with quill and parchment while kingdoms rise and fall around us. We are, rather, members of the communities in which we live, often small towns where we buy our groceries, fall in love, get mammograms, and send our children to school. We anguish along with our neighbors about gun violence, climate change, access to medical care, and the opportunistic fascism and viral pathogens sweeping through our nation.

Yes, the vast majority of instructors in higher education are privileged by race and class, a reflection of the unacceptable stratification that deforms all of U.S. culture and society, and not just higher education. Only when compared to the most shamelessly exploited members of society — especially the essential service workers now required to put their lives at risk for peanuts — do professors, as a whole class, appear to be an especially entitled, elite group. It is no accident that, with respect to pay, status, and the other factors that insulate a group from the pains of the world, professors are rarely compared by critics to CEOs, hedge fund managers, or even university administrators. Evidently, there is something especially appealing and effective about scapegoating professors and other educators for the hideous erosion of the American middle class.

It has long been clear that U.S. professors have been targeted for derision and elimination by conservative extremists. Just as evident is the fact that anti-professor stereotypes are rooted in the assumption that, while folks in private business, technology, medicine, entertainment, and sports might deserve some degree of prestige and pay, professors and K-12 teachers generally do not. This is in no small measure a result of concerted conservative efforts to exploit the longstanding American love affair with anti-intellectualism. In the U.S., it seems, it has never been especially difficult for unscrupulous plutocrats to funnel populist outrage toward books and those who love them.

But the tensions and exigencies of the pandemic make it ever clearer that it’s not just conservative extremists who use stereotypes to justify vilifying and marginalizing professors. It is also a growing cadre of professionalized university bureaucrats for whom professors’ supposed impracticality and pampered entitlement rationalize our exclusion from critical decision-making. At best, the scenario that unfolds is one in which faculty are hapless children with wise and benevolent parents. At worst, we are self-centered nincompoops who must be flattered and manipulated into accepting policies that we have had no voice in creating. If, in the midst of crisis, we consent to such treatment — perhaps persuading ourselves that university administrators really do know best — will we ever again be allowed to sit at the big table with the grown-ups?

Plunging into Online Teaching: It’s not what I thought it would be

The first time I taught online was over a decade ago, when I got pulled in like a tug-of-war contestant dragged into a mud pit. A mid-career philosophy professor, I was a good teacher, a popular teacher, content with my pedagogical approach and buoyed by the energy of the face-to-face classroom.

I approached the challenge of online teaching like a translation problem: how to interpret my existing course into a virtual one. Back then there weren’t many online education resources to save me from this error, but even if there had been, I doubt I would have paid much attention. My real weakness was that I didn’t fully get that my classroom teaching represented a particular modality, one with its own accidental logic and underlying values. I couldn’t fundamentally rethink my strategy — lecture, discuss, exam, repeat — because it all seemed too basic and fundamental to deeply question. It’s no surprise, then, that this first foray into the virtual classroom was less than successful. I left with my ego bruised, feeling bad for my students, and resentful that I’d been nudged into participating.

Fast forward and I am now deeply immersed in online teaching. Instead of fighting the waves and tightening my grip on long-standing pedagogical habits and commitments, I am beginning to relax into the unfamiliarity of it. I can accept, at least sometimes, that this is not merely a shadow version of being a “real professor” but, rather, a fundamentally different enterprise. I had been like a traveler unable to appreciate new vistas until she recognizes the biases she carries with her. I couldn’t see what online teaching had to offer until I could view my traditional teaching values and practices from a distance. At some point, I began to recognize my habitual way of teaching as involving particular, and changeable, assumptions, values, and strategies. I still hold onto some of my traditional ways, and there are others whose loss I will probably always mourn. But for all of that, I am moving forward.

I won’t sugarcoat this. My experiences with online teaching and my feelings about it are complicated. But the project of engaging with it has transformed not just my teaching but also my relationship to change itself. In ways I painstakingly explore in this blog, I am not only a better online teacher than I used to be, but I think I’m a better teacher, period. Certainly, I am less ego-focused, less change-averse, and less nostalgic than I used to be. While I’m not an uncritical cheerleader for online education — I still rail against its worst tendencies — I have warmed to it enough that it is working for me and my students. And even if I never taught another online class, I would still be enriched by having looked back on my pedagogical values and commitments from the shore of this new virtual land.

Mission critical thinking: Preparing students and ourselves for catastrophic times

Most liberal arts professors have known for years that the greatest good we can do for many of our students probably isn’t to immerse them in the advanced esoterica of our particular disciplines but to help develop their critical reading, writing and thinking skills. In the disastrous age of MAGA, I have begun to more fully appreciate this lesson: Part of my job is to help prepare students to locate and respond to catastrophic social, political, and ethical problems, only some of which we are now even able to fully imagine.

“Critical thinking,” that darling term we educators have been kissing and cuddling for decades, no longer cuts it when we face the full horror and possibility of what we are collectively facing. In past decades, “critical” has signaled ways of thinking, reading, and writing that occur from a questioning and investigative mode, a disinterested evaluation of facts, logical relationships between claims, and the biases of all concerned, including oneself. This is all to the good, especially the importance of challenging claims that happen to suit one’s preexisting expectations or preferences. Certainly, we would all be much better off if “critical thinking” of this sort could dislodge the irrational mob-think and craven consumerist claptrap that passes for much of current social and political discourse.

Teaching critical analysis as a fairly narrowly cognitive approach is evidently not enough, though. What we need is a reclamation of “critical” that is bolder, more dramatic, and far more socially and emotionally urgent than any we have used before. In short, we must train students and ourselves to function as intellectual and psychological EMTs, prepared to move into the disaster zone with the skills, judgment, and nerve necessary for both triage and long-term, sustainable healing and repair. We need proactive, brave, pliable first responders who are also long-term strategic solution-seekers capable of evaluating and rearranging the big picture. The “critical thinking” values that must underlie our teaching work today are “critical” in the sense of “mission critical” and of “critical condition.” The symbol for this might include a pen and inkwell, but also a blood-red armband and a sturdy multi-tool.

This more urgent, red-alert version of critical thinking obviously must include much of what has always mattered about this traditional skillset, including close reading, basic logic, the analysis of evidence, and evaluation of perspective. But it must place greater explicit emphasis on qualities of individual motivation, self-care and character development, including the cultivation of:
– a healthy combination of confidence, humility, self-efficacy and self-reflection
– an unwavering commitment to empathy and compassion that does not slide into paternalistic pity or overwhelmed quietism
– a bias toward positive, productive action in the service of deep communal values, including, for example, participatory democracy and racial equality
– an ability to make tough, real-world decisions in the face of incomplete information and general uncertainty
– the courage to go against the grain, to swim upstream from groupthink while still respecting the legitimate needs of the community

Even this cursory, general list serves as a cautionary guide for me: As a feminist philosopher, I have for decades emphasized a cognitively based, moderate notion of critical thinking that has reflected both a (perhaps naive) confidence in human reason and a (legitimate) concern about alienating students. I have, then, often ended up focusing on tweaking reading, writing, and thinking skills, careful not to be “too normative” or “too directive” with respect to the social and emotional values surrounding these supposedly “neutral” cognitive standards. I haven’t avoided real-world issues — this would not even be possible in the courses I teach — but I have sometimes highlighted the intellectual “toolbox” aspect of critical thinking in order to sidestep the messier social and ethical facets that give cognitive values sense and power.

For better and worse, I know that I am not the only instructor who has been dancing carefully among the demanding arms of cognitive, emotional, social, and ethical competence. Unfortunately, there is extraordinary pressure on professors to treat students like desperately needed, precious, fickle customers. Further, the long, determined march from tenured to contingent faculty has eroded the secure ground from which some faculty can be expected to engage in difficult dialogues. It is surely no accident that the academic freedom necessary to engage in authentically holistic critical thinking has been hacked away by conservative extremists at the very time it is most urgently needed. Regardless, we can no longer afford any semblance of the fantasy that liberal arts professors are debate coaches meant to lead students through “what if” puzzles to achieve oblique insights or incrementally improved logical skills. The most privileged professors, at least, surely must rethink our relationship to “critical thinking.”

So, though I still push my students to wrestle constructively, directly, and intellectually with texts — this humanistic work matters! — I engage with them in ever more practical, particular, personal, and socially urgent terms. And I am more prepared than ever to acknowledge my astonishing ignorance, because, like so many well-trained, smart professors, I have been caught off guard by the scale and doggedness of the retrograde cruelty and naked greed of conservative extremists. And so I commit as much to the pedagogical power of empathy, ethical sensitivity, and self-empowerment as to more specifically cognitive values. This isn’t a self-esteem-based pedagogical gimmick but, instead, a matter of necessity: It will take the empowered, compassionate, creative strategizing of all of us — young and old — to MacGyver our way out of this mess.

Can we learn something from our excuses for not meditating?

Partly because I sometimes write and teach about Buddhism and mindfulness, people are inclined to tell me about their experiments with meditation. And it almost always begins with “I’m really bad at it,” or “My mind just won’t stop,” or “I tried, but I just can’t sit still.” Almost always they volunteer rationalizations that feature guilt and also imply that they themselves are almost uniquely unsuited to the practice because they are so freakishly impatient and busy-headed.

And while they may be claiming to be especially bad at meditation, it’s still an assertion of specialness, and one that may have special appeal for academics. Many professors, after all, adore thinking, and so being bad at meditation can become a kind of boast, proof of one’s insatiable tendency to critically assess. It’s a rationalization, then, that can help shore up one’s mundane, ego-based identity story — a self-understanding that includes personality and profession — the very tale that a consistent meditation practice might eventually lead one to scrutinize.

To be fair, we Western academics also operate in a broader societal context that encourages and prizes constant busyness and endless mental chatter. It will probably surprise no one, then, that Buddhist meditation was long described by Western critics as a form of escapism for lazy quietists. In a capitalist, rationalist milieu that places a premium on constant mental and physical “productivity,” what can it mean to be a faithful meditator except that one is content to sit on one’s ass and zone out? To supply reasons why one doesn’t meditate, then, may function both as a quintessentially intellectualist badge of honor and an implicit endorsement of American capitalist virtues.

Although I disagree (of course) with the tired, colonialist caricatures of Buddhism, I’m not here to sell meditation either. In fact, outside of classrooms explicitly featuring the topic, it’s something I hardly ever discuss. I find that sitting meditation supports my own sense of peace, efficacy, and well-being. But partly as a result of meditation, I’ve become unwilling to assert that this is true for others. I notice, though, that many non-meditators themselves describe meditation as something they should be doing, which makes their excuses for avoiding it stand out in sharper relief. What does it mean to offer rationalizations for not doing something that no one is monitoring and that one has no obligation to do? Our relationship to meditation, perhaps especially when we put energy into describing how we avoid it, turns out to be kind of interesting.

Could it be that the real action lies less in meditation itself than in learning to hear the stories we volunteer about why we do or don’t do this or that? After all, if there is a point to meditation, it is probably the promise of increased awareness that leads to greater peace, equanimity and self-knowledge. On this score, it is perhaps more important to become cognizant of the rationalizations we use to fortify our habitual identities — including that of being a “non-meditator” — than to meditate for the sake of being a good meditator. Paradoxically, though, meditation may well be the most efficient path for learning to actually hear the endless verbal storms that ravage our minds and often pour unbidden from our mouths, including, perhaps, the excuses we make for why we don’t meditate.

The Uses and Abuses of Ambivalence

As I grow older, I’m better able to accept that living well requires making choices between imperfect alternatives. This more pragmatic orientation also feels more mature — think of a toddler who refuses any treat that falls short of ideal — and it also helps me appreciate how I’ve misused ambivalence in the past. As valuable and unavoidable as some ambivalence is, I now see that some of what I’d attributed to admirable, intellectually honest uncertainty probably had more to do with fear.

Of course there are different kinds of ambivalence and some matter more than others. For example, because I’m merely a coffee addict and not a connoisseur, when offered the choice between light or dark roast, I usually say “whichever’s freshest.” I’ve learned to say this rather than admit I don’t care because a bald expression of ambivalence can paralyze the cafe staff. Because they know and care about coffee, such naked ambivalence must seem irresponsible or disingenuous. “How can you not care?” they must be thinking.

Ambivalence like this is pretty trivial unless the choice is thought to be expressive or constitutive of one’s identity, e.g., “I’m the kind of person who only wears black.” This is a kind of lifestyle identity politics based on allying oneself with this kind of music, or clothing style, or football team rather than that one. When identity is, implicitly or explicitly, thought to be at issue, then too much ambivalence can seem like a wishy-washy abdication of one’s very self.

Before I uneasily embraced online education, I was swirling in ambivalence that I couldn’t fully articulate. I was, in fact, more likely to voice my really substantive (ethical, political, social) misgivings about it than my more mundane concerns. In retrospect, though, I see that my merely practical worries drove my aversion to online teaching at least as much as my deeper misgivings: Would I be overwhelmed by the amount of work? Was I too set in my ways to master the technology? How would I meaningfully connect with students without the crutch of my charismatic schtick?

My ambivalence about the substantive issues hasn’t really changed: I am still as deeply troubled as ever by how online education enables an increasingly corporatist higher ed even as it provides invaluable access for some students. I still hate that I am contributing to a more impersonal, interchangeably modular version of education, even as I am proud of my new efforts to engage with students in this flexible, open-ended virtual space.

My ambivalence is genuine and important, and I live with the tension of it as I more or less happily go about my online work. It is a low-grade discomfort that informs my choices and practices but does not disable me. Clearly, I did not need to wait until I had moved past my ambivalence to embrace online teaching, nor did I need to pretend that those mixed feelings had been resolved. In fact, I think my ethical discomfort is healthy and points to problems within higher ed, a system with failings that, though I am implicated in them, also need to be reckoned with. It would be a disservice to my integrity and to my vocation if I were to paint my criticisms pink and become a mere cheerleader for online education.

On the other hand, I wonder where I would be headed had I remained aloof from online ed out of respect for my supposedly noble ambivalence. I am reminded of a former senior colleague who, in the early days of email, proudly refused to use it. He had all sorts of important, and probably legitimate, gripes: It was too impersonal, too ambiguous, too informal, and so on. But it was evident that his aversion was also rooted in his fear of being unable to master this new game, and being an anti-email crank came to define him. I’ve always hoped that his righteous confidence turned out to be warm company, because as email continued its inexorable march, he became increasingly isolated from his students and colleagues.

Gamification: Seductive gold stars and pats on the back

In the third grade, I was rewarded for being the fastest to complete a series of long-division problems on the blackboard. My prize, a Flintstones eraser, wasn’t even a good likeness of Dino, but I carried it with me for weeks. These days the reward I crave is the happy jingle from my iPad when I’ve completed the daily New York Times crossword. My awareness that I’m only sort of joking when I admit it’s my favorite song helps explain my ambivalence about incorporating similarly trivial rewards into my own classes. Frankly, it’s a little embarrassing to be so eager for such superficial affirmations.

Gamification, using elements of reward and friendly competition to encourage effort and engagement, is both simple and intuitively appealing. That it effectively lights fires — at least in some learners — is clear enough. Nudged onward by the promise of leveling up or of earning a virtual ribbon, we do sometimes perform more diligently and enthusiastically with these dangling carrots in sight. And so I created a badge icon for students who improve their quiz scores, one that automatically pops up on these users’ home pages. I plan to add consistency and perseverance badges as I seek more ways to exploit these easily implemented gamification strategies.
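
For the curious, here is a minimal sketch of what a badge “release condition” like mine amounts to in logic, written in Python. It is purely illustrative: the function name and the data shape are my own assumptions, since real learning platforms configure badge rules through their settings pages rather than through code like this.

```python
# Illustrative sketch of an "improvement badge" release condition:
# award the badge when a student's most recent quiz score beats the
# previous one. All names here are hypothetical, not any LMS's API.

def earns_improvement_badge(quiz_scores: list[float]) -> bool:
    """Return True if the latest quiz score improves on the one before it."""
    if len(quiz_scores) < 2:
        return False  # nothing to compare against yet
    return quiz_scores[-1] > quiz_scores[-2]

# Example: a student who scored 72 and then 81 would see the badge pop up.
print(earns_improvement_badge([72.0, 81.0]))  # True
print(earns_improvement_badge([90.0, 85.0]))  # False
```

The consistency and perseverance badges I have in mind would presumably follow the same pattern, checking, say, that every quiz to date has been attempted rather than that scores have risen.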


I’ve become willing to experiment with such cheap tactics partly because of my own recent experience as an online student; I was surprised by the tiny thrills of satisfaction I came to anticipate as my badges appeared. And I suspect that gamification has a similarly primal effect, not only on millennial video gamers, but on many of us who earned prizes as children: for the number of books read, a class spelling bee, or a math club competition. But I also know that some experts caution against linking worthwhile activities to crass rewards, noting that, for example, children may no longer color for sheer enjoyment when prizes become part of the mix. While this consequence might not be so worrisome for straightforwardly “outcome-based” courses, it would be anathema for teachers intent on cultivating joyfully authentic life-practices such as close reading and thoughtful discussion.

So, even as I create the release conditions for my virtual badges, imagining my students’ pleasure at receiving them, I’m a little sheepish. Is this all just a tawdry gimmick? Am I trying to bribe these precious human companions with trivial ego boosts, coaxing them to learn material that, as it happens, actually has both intrinsic value and relevance to their lives? Am I reinforcing a consumerist, credentialist view of learning as merely extrinsically valuable, with grades and prizes to be collected in exchange for a diploma and job? They are urgent questions for me because I’ve never meant for my students merely, or even primarily, to learn “information” or discrete “skill sets” associated with my “content area.”

As I continue to explore using badges and other rewards, I remind myself that what I’m up to — leveraging behaviorist elements of learning without sacrificing the ethos of learning for its own sake — is a very old pedagogical conundrum. It certainly didn’t arise with online teaching, even if online modalities have made us more self-conscious about the perils and promises of gamification. In online classes, the affinity of gamification to electronic gaming becomes obvious. And, of course, we all know, or imagine we do, how addictive and empty that activity can be. But, again, some of my most enduring memories as an elementary school student in the ’70s, long before Super Mario or Minecraft, also involved “gamification.” And they are memories that, for better and worse, still bring me vibrations of shame and satisfaction.

As a child, I was motivated by the promise and fear of prizes awarded and withheld, but this probably also compromised my ability to take learning risks because I did not want to be a loser. Gamification, then, is complicated and fraught, and it occurs to me that I should use it more thoughtfully. What if, for example, I invited students to explicitly reflect upon their own perceived susceptibility or aversion to gold stars and pats on the back? Could gamification then become a tool for deeper self-reflection and whole-person development? After all, much of life occurs against a competitive backdrop, a humming swirl of conditional, often arbitrary, ego affirmations and insults. A little more awareness of what’s driving the quest for that promotion, that house, or that anti-wrinkle cream is probably not such a bad idea.