Can we learn something from our excuses for not meditating?

Partly because I sometimes write and teach about Buddhism and mindfulness, people are inclined to tell me about their experiments with meditation. And it almost always begins with "I'm really bad at it," or, "My mind just won't stop," or, "I tried but I just can't sit still." Almost always they volunteer rationalizations that feature guilt, and that also imply that they themselves are almost uniquely unsuited to the practice because they are so freakishly impatient and busy-headed.

And while they may be claiming to be especially bad at meditation, it’s still an assertion of specialness, and one that may have special appeal for academics. Many professors, after all, adore thinking, and so being bad at meditation can become a kind of boast, proof of one’s insatiable tendency to critically assess. It’s a rationalization, then, that can help shore up one’s mundane, ego-based identity story — a self-understanding that includes personality and profession — the very tale that a consistent meditation practice might eventually lead one to scrutinize.

To be fair, we Western academics also operate in a broader societal context that encourages and prizes constant busyness and endless mental chatter. It will probably surprise no one, then, that Buddhist meditation was long described by Western critics as a form of escapism for lazy quietists. In a capitalist, rationalist milieu that places a premium on constant mental and physical “productivity,” what can it mean to be a faithful meditator except that one is content to sit on one’s ass and zone out? To supply reasons why one doesn’t meditate, then, may function both as a quintessentially intellectualist badge of honor and an implicit endorsement of American capitalist virtues.

Although I disagree (of course) with the tired, colonialist caricatures of Buddhism, I'm not here to sell meditation either. In fact, outside of classrooms explicitly featuring the topic, it's something I hardly ever discuss. I find that sitting meditation supports my own sense of peace, efficacy, and well-being. But partly as a result of meditation, I've become unwilling to assert that this is true for others. I notice, though, that many non-meditators themselves describe meditation as something they should be doing, which makes their excuses for avoiding it stand out in sharper relief. What does it mean to offer rationalizations for not doing something that no one is monitoring and that one has no obligation to do? Our relationship to meditation, perhaps especially when we put energy into describing how we avoid it, turns out to be kind of interesting.

Could it be that the real action lies less in meditation itself than in learning to hear the stories we volunteer about why we do or don’t do this or that? After all, if there is a point to meditation, it is probably the promise of increased awareness that leads to greater peace, equanimity and self-knowledge. On this score, it is perhaps more important to become cognizant of the rationalizations we use to fortify our habitual identities — including that of being a “non-meditator” — than to meditate for the sake of being a good meditator. Paradoxically, though, meditation may well be the most efficient path for learning to actually hear the endless verbal storms that ravage our minds and often pour unbidden from our mouths, including, perhaps, the excuses we make for why we don’t meditate.

Super Mario in a one-room schoolhouse: The myth of a singular college experience

I have mastered my shield and sword and become familiar with the labyrinth. More confident than ever, I sneak up behind an ogre, weapon drawn. But in the split second before I strike, the creature steps backward, knocking me into a chasm I'd taken great care to sidestep. The fizzling "game over" music that accompanies my death mocks me. I have been hacked, zapped, and crushed to death, and, each time, I have tried again, determined to complete this sequence. This time, though, I save and quit, eager to play something easier. But five minutes into the "relaxing" tedium of a new game in which I scoop up gems while summarily dispatching lethargic foes, I have had it. I have gone from feeling demoralized by the challenges of the first game to annoyed by the childish ease of the second.

My fickle petulance in the face of such shifting levels of challenge invites me to think about the critical role that "appropriate difficulty" has in creating satisfyingly rich learning experiences in general. Of course, successful video game designers have mastered the nuances of manipulating obstacles, rewards, and pacing to create engaging challenges. They know how to offer guidance that does not devolve into handholding, and to provide small, consistent rewards along the way, such as new weapons or abilities. In short, they create a world in which patient hard work will be rewarded. Though they may sometimes be very difficult, these challenges still feel ultimately fair. Because conscientious video game designers must so closely consider individual user engagement, they can provide key insights for instructors and students of all sorts. How many of us have stewed in the frustration of classes that felt rudimentary and plodding? And haven't we also been left floundering in our own stupidity by courses pitched too far over our heads?

As a professor at an increasingly open access, mid-tier public university, calibrating difficulty is a task I find more daunting each year. While my strongest students’ level of preparation seems to be about the same as always, the college-readiness of everyone else is more and more of a mixed bag. My introductory classes are a motley blend of motivated readers, writers, and problem solvers combined with folks who lack basic skills, resources, and persistence. In recent years I have even begun thinking of myself as a plucky teacher in a one-room rural schoolhouse, charged with simultaneously facilitating grades K-12. I must stoke the fire and help the young’uns learn their letters while still ensuring that the older kids are pushing through their geometry problems. In short, I must be sensitive to individual ability and opportunity but in a fairly uniform environment.

It's a principle that seems to underlie successful video game design as well, in that games are typically aimed at cultivating individual interests and abilities, focusing on self-paced success and exploration. Games with mass appeal create a single world in which noobs can progress in their dawdling way while hardcore gamers leap along, experiencing facets of play of which novices might never even become aware. In short, it is the layers of possibilities for individuals — of both reward and frustration — that allow one and the same gaming experience to be appropriately challenging and satisfying to a wide range of players. Such game design is possible only because no one is pretending that players will, should, or could leave with the same "results" or rewards; certainly, the success of the game does not depend on all players gleaning the same "benefits."

By contrast, the notion persists that college classrooms can and should aim for the same reproducible outcome for each student, though this goal has perhaps never been more elusive at non-selective publics. And, though, of course, it has always been the case that individual learners' outcomes vary wildly, universities have continued to prioritize assessment methods that treat our classes functionally and our students as interchangeable variables. The professor's success continues, by and large, to be measured by the degree to which she impacts students across a narrow set of uniform assessment goals and outcomes, despite the fact that professors at open-access publics are increasingly being called upon to facilitate one-room schoolhouses.

Instead of continuing to pretend that there is one definition of college-readiness and a singular college experience, we would be better off acknowledging that, by and large, many of our college classes are, at best, like Super Mario Odyssey, a game that attracts and entertains a remarkable gamut of players, from small children, to bored subway commuters, to deadly serious gamers. A casual player with sluggish reflexes might while away many satisfying hours, exploring here, butt stomping there, but unlocking only a tiny fraction of the game’s secrets and leaving many of its rewards unclaimed. In a way, it may not even make sense to say that the noob and the skilled gamer are playing the “same game” though they are operating in the same facilitated virtual space.

To be sure, I am appalled that our public education system has been so stratified along economic class lines for so long that it is a simple fact that lots of students arrive at college not at all what we like to call "college ready." But even as we fight for saner, more egalitarian K-12 public education policies, we must deal with the astonishing mix of abilities, motivations, and resources streaming into our college classrooms. After all, our universities have a pretty good idea what these students' capabilities are and have accepted their tuition payments, invited them in, and made lots of promises. Rather than wringing our hands over the impossibility of teaching across such a broad range of ability, maybe we can imagine new ways for Mario to progress, whether he bounds, rolls, or crawls. The reality is that, whether I like it or not, I have been charged with lighting the wood stove, clapping the erasers, and preparing to die again and again and again.

Maybe it’s healthy to be ambivalent about online education

As I grow older, I’m better able to accept that living well requires making choices between imperfect alternatives. This more pragmatic orientation also feels more mature — think of a toddler who refuses any treat that falls short of ideal — and it also helps me appreciate how I’ve misused ambivalence in the past. As valuable and unavoidable as some ambivalence is, I now see that some of what I’d attributed to admirable, intellectually honest uncertainty probably had more to do with fear.

Of course there are different kinds of ambivalence and some matter more than others. For example, because I’m merely a coffee addict and not a connoisseur, when offered the choice between light or dark roast, I usually say “whichever’s freshest.” I’ve learned to say this rather than admit I don’t care because a bald expression of ambivalence can paralyze the cafe staff. Because they know and care about coffee, such naked ambivalence must seem irresponsible or disingenuous. “How can you not care?” they must be thinking.


Ambivalence like this is pretty trivial unless the choice is thought to be expressive or constitutive of one’s identity, i.e., “I’m the kind of person who only wears black.” This is a kind of lifestyle identity politics that’s based on allying oneself with this kind of music, or clothing style, or football team rather than that one. When identity is, implicitly or explicitly, thought to be at issue then too much ambivalence can seem like a wishy-washy abdication of one’s very self.

Before I uneasily embraced online education, I was swirling in ambivalence that I couldn’t fully articulate. I was, in fact, more likely to voice my really substantive (ethical, political, social) misgivings about it than my more mundane concerns. In retrospect, though, I see that my merely practical worries drove my aversion to online teaching at least as much as my deeper misgivings: Would I be overwhelmed by the amount of work? Was I too set in my ways to master the technology? How would I meaningfully connect with students without the crutch of my charismatic schtick?


My ambivalence about the substantive issues hasn't really changed: I am still deeply troubled by how online education enables an increasingly corporatist higher ed, even as it provides invaluable access for some students. I still hate that I am contributing to a more impersonal, interchangeably modular version of education, even as I am proud of my new efforts to engage with students in this flexible, open-ended virtual space.

My ambivalence is genuine and important, and I live with the tension of it as I more or less happily go about my online work. It is a low-grade discomfort that informs my choices and practices but does not disable me. Clearly, I did not need to wait until I had moved past my ambivalence to embrace online teaching, but nor did I need to pretend that those mixed feelings had been resolved. In fact, I think my ethical discomfort is healthy and points to problems within higher ed, a system with failings that, though I am implicated in them, also need to be reckoned with. It would be a disservice to my integrity and to my vocation if I were to paint my criticisms pink and become a mere cheerleader for online education.

On the other hand, I wonder where I would be headed had I remained aloof from online ed out of respect for my supposedly noble ambivalence. I am reminded of a former senior colleague who, in the early days of email, proudly refused to use it. He had all sorts of important, and probably legitimate, gripes: It was too impersonal, too ambiguous, too informal, and so on. But it was evident that his aversion was also rooted in his fear of being unable to master this new game, and being an anti-email crank came to define him. I’ve always hoped that his righteous confidence turned out to be warm company, because as email continued its inexorable march, he became increasingly isolated from his students and colleagues.

Gamification: Seductive gold stars and pats on the back

In the third grade, I was rewarded for being the fastest to complete a series of long division problems on the blackboard. My prize, a Flintstones eraser, wasn't even a good likeness of Dino, but I carried it with me for weeks. These days the reward I crave is the happy jingle from my iPad when I've completed the daily New York Times crossword. My awareness that I'm only sort of joking when I admit it's my favorite song helps explain my ambivalence at incorporating similarly trivial rewards into my own classes. Frankly, it's a little embarrassing to be so eager for such superficial affirmations.

Gamification, using elements of reward and friendly competition to encourage effort and engagement, is both simple and intuitively appealing. That it effectively lights fires — at least in some learners — is clear enough. Nudged onward by the promise of leveling up or of earning a virtual ribbon, we do sometimes perform more diligently and enthusiastically with these dangling carrots in sight. And so I created a badge icon for students who improve their quiz scores, one that automatically pops up on these users’ home pages. I plan to add consistency and perseverance badges as I seek more ways to exploit these easily implemented gamification strategies.


I’ve become willing to experiment with such cheap tactics partly because of my own recent experience as an online student; I was surprised by the tiny thrills of satisfaction I came to anticipate as my badges appeared. And I suspect that gamification has a similarly primal effect, not only on millennial video gamers, but on many of us who earned prizes as children: for the number of books read, a class spelling bee, or a math club competition. But I also know that some experts caution against linking worthwhile activities to crass rewards, noting that, for example, children may no longer color for sheer enjoyment when prizes become part of the mix. While this consequence might not be so worrisome for straightforwardly “outcome-based” courses, it would be anathema for teachers intent on cultivating joyfully authentic life-practices such as close reading and thoughtful discussion.

So, even as I create the release conditions for my virtual badges, imagining my students’ pleasure at receiving them, I’m a little sheepish. Is this all just a tawdry gimmick? Am I trying to bribe these precious human companions with trivial ego boosts, coaxing them to learn material that, as it happens, actually has both intrinsic value and relevance to their lives? Am I reinforcing a consumerist, credentialist view of learning as merely extrinsically valuable, with grades and prizes to be collected in exchange for a diploma and job? They are urgent questions for me because I’ve never meant for my students merely, or even primarily, to learn “information” or discrete “skill sets” associated with my “content area.”

As I continue to explore using badges and other rewards, I remind myself that what I'm up to — leveraging behaviorist elements of learning without sacrificing the ethos of learning for its own sake — is a very old pedagogical conundrum. It certainly didn't arise with online teaching, even if online modalities have made us more self-conscious about the perils and promises of gamification. In online classes, the affinity of gamification to electronic gaming becomes obvious. And, of course, we all know, or imagine we do, how addictive and empty that activity can be. But, again, some of my most enduring memories as an elementary school student in the '70s, long before Super Mario or Minecraft, also involved "gamification." And they are memories that, for better and worse, still bring me vibrations of shame and satisfaction.

As a child, I was motivated by the promise and fear of prizes awarded and withheld, but this probably also compromised my ability to take learning risks because I did not want to be a loser. Gamification, then, is complicated and fraught, and it occurs to me that I should use it more thoughtfully. What if, for example, I invited students to explicitly reflect upon their own perceived susceptibility or aversion to gold stars and pats on the back? Could gamification then become a tool for deeper self-reflection and whole-person development? After all, much of life occurs against a competitive backdrop, a humming swirl of conditional, often arbitrary, ego affirmations and insults. A little more awareness of what’s driving the quest for that promotion, that house, or that anti-wrinkle cream is probably not such a bad idea.

Claiming the right to make beauty: Inspiration, motivation, and basic worthiness

Like lots of the kids around me in my humble Midwestern elementary school, I started playing a band instrument just because. Because the instruments were shiny and mysterious and because it meant being singled out as special three days a week to converge in the lunchroom for a cacophonous 45 minutes. I chose the trumpet because it seemed a magnificent luxury, like something from Cinderella, and because my brother had started playing one a few years before, so I figured my parents had to say yes to me too.

Just to be perfectly clear, I chose neither band nor this particular instrument because I loved music or the sound of brass. In fact, all the way through high school, I continued to plug diffidently away at the trumpet as if it were any other task, like making my bed or mowing the lawn. At no point — neither in practice at home nor in public concerts — do I ever recall being moved by the actual experience of making music. Instead, I played out of habit and because it was something I'd agreed to do, giving it just enough time and energy to avoid totally embarrassing myself.


I ponder this now, because here in the throes of middle age, I have picked up the trumpet once again. It’s a used student model, very much like the one I had decades ago, cold heavy brass that is both strange and familiar in my adult hands. The scent of valve oil and the chill circle of the mouthpiece against my (still) slightly crooked front teeth propel me backwards in time, reminding me that I am both the same and different from the kid who once ran the chromatic scale with such habitual mediocrity.

Shockingly, after just a few months, I find that, in one important sense, I’m already playing better than I ever did as a distracted kid. Adult-me, it seems, is motivated by an actual desire to make actual music. Though I rarely have an audience, I find myself making an effort to play with heart, drawn to the promise of making beauty with my mouth, breath and hands. The irony is that, having fully embraced the low stakes amateurism of playing the trumpet late in life, I am actually getting good at it, at least by my admittedly low standards. And I know this is because playing has become more about creating meaning than about merely mastering a skill set in order to operate a shiny machine.

My childhood failure to connect to the music-making aspect of playing the trumpet was, no doubt, due partly to a relative lack of cultural or artistic appreciation in my working class home. Like most of the kids around me, I grew up almost completely incapable of taking my creative potential seriously. It pretty much never occurred to me that I might be able to make beautiful music or art, because I simply could not fathom being special or worthy enough to approach these rarified realms. Journalism? Maybe. Poetry? Never. Why open myself to ridicule, then, by exerting steady and sincere effort to achieve something so impossibly far out of reach?


I am left now with an incisive pedagogical lesson that I suspect most everyone else already knew: In many subject areas, especially those associated however obliquely with high culture, U.S. working class kids may never make it out of the starting gate. After all, the admission price for even the bare possibility of genuine learning is a basic sense of one's own belonging in the grand humanistic scheme of things. And how can those who cannot take themselves seriously as potential cultural creators ever embrace the requisite vulnerability? We must feel sure enough that we belong to throw ourselves into it again and again, failing spectacularly, without being overwhelmed by imposter syndrome or falling into what Tara Brach calls the "trance of unworthiness."

In short, it's pretty clear that great pedagogical potential is unleashed when we plug into our own sense of cultural self-worth. Though the energy that flows from such cultivated aesthetic self-regard may be no more magical or mysterious than electricity, it can be just as transformative. It can mean the difference between a lifetime of stepping self-consciously and disjointedly from one note to another and one spent making bona fide music. Permission to take oneself seriously as a human creator, then, can nudge the sidelined outsider into the heart of the ballroom, into the chaotic dance with the muses that has long nourished the human soul.

Moving past shame: When regret becomes an ally in the classroom and in life

Admitting that we wish we’d done things differently has come to be seen as a mark of spiritual immaturity. Perhaps as a reaction to the guilt-inducing traditional religions of childhood, many have adopted a policy of embracing whatever has occurred as a way of celebrating the present moment. While banishing regret may be fine as an absolute orientation towards the deepest meaning of life — on this view, what IS is good precisely because it is — on a mundane level, I think regret can be a useful ally.

Regret is especially relevant to me as a professor in the twixt time between the fall and spring terms. I look back on Fall with one eye as I look ahead to Spring with the other. The invitation to ruthlessly inspect my courses, to locate both the gems and dross, the tangled thickets and the open clearings, is too loud to be ignored. But still so close to the beauty and the wreckage of classes I’m just now completing, my vision is both sharpened and distorted. Learning to take a critical perspective on a past that is only just barely past demands that I move quickly away from defensive self-justification and make friends with regret.

Specifically, constructive regret requires that I be:

  • secure enough in my identity as a competent teacher that I can afford to have been mistaken about this or that; insecurity about my basic ability will lead me to defend and justify rather than honestly scrutinize;
  • invested not just in improving this or that particular skill or product, but in growing as a whole human being. Then, the motive towards general excellence can become habitual and irresistible; if I am satisfied with coasting dumbly along, either as a teacher, or as a moral, intellectual animal, then I won’t be motivated enough to make deep, lasting changes in any part of my life, including my teaching.

If I can make room for constructive regret in my teaching life — if I can see that that one assignment, the one I really loved, turned out to be a flop — then maybe I can also have a freer, more responsible relationship with the people and events that make up my whole life. If I can see failures — large and small — as messengers, and avoid identifying with them, then I can take better advantage of regret. Seeking and finding my own missteps and shortcomings — like consulting a map at a rest stop — can increasingly become a neutral habit rather than a shaming interlude that I avoid at all costs.

The pitfall of regret, then, is that it can so quickly become an implement for ruthless self-flagellation. One’s personal history and insecurities rise up so powerfully that the prospect of being vulnerable to self-examination becomes intolerable and so, instead, one moves fluidly into self-justification and rationalization. “I had to do it that way, because…” we tell ourselves, instead of authentically reflecting on the details of our motives or the consequences we set into motion. Rationalization becomes as automatic as a gag reflex, neutralizing the natural curiosity that would have us inspect and learn from our past.

There isn't much that we do, whether in our classrooms or our larger lives, that absolutely had to be precisely the way it was. In most cases, we had viable alternative routes. Whether it's about permitting a student to make up a quiz or speaking harshly to the person we love most, we usually could have done otherwise. And though we cannot, of course, know absolutely what the future would have been, our limited capacity to anticipate the consequences of our actions should, I think, sometimes lead us toward regret. How can we, I wonder, become more at home in the lively, tense knowledge that we could have, and perhaps should have, done it differently?


“Just be thankful you’ve got a faculty position”: the abuse of gratitude in the academy

We know we’re supposed to be grateful. It’s a year-round pressure that culminates on Thanksgiving and New Year’s Eve: to count our blessings, look on the positive side, and remember how very fortunate we are. It’s even become a sort of medical prescription, with mental health professionals claiming that gratitude is the key to happiness, long life, and success. I don’t doubt it, but I also recall Karl Marx’s warnings about apparently anodyne feel-good ideologies that function like opium to help keep workers, including professors, cowed and complacent.

Even before the puddle of cranberry sauce dries on my plate, then, I think about how injunctions to be grateful, including those that come from oneself, can become fodder for quietism and bland self-satisfaction. When I consider, for example, the salary hit I will take as the result of huge increases to my insurance, I vacillate between relief — my situation is still much better than that of most people in the U.S. — and anger. How long am I supposed to suck it up and smile as my standard of living is eroded so that fat cats can get even fatter? Am I to compare myself only to those worse off than I am to avoid feeling, and being perceived as, elitist?


This gratitude double-bind is familiar, including to those of us in higher ed. On the one hand, we are aware enough of how tough times are to be grateful for full time faculty jobs. After all, this is an environment in which endangered faculty positions are being hunted down and casually ground into cheap instructor labor. And we mid-career professors watch with horror and sadness as newly minted PhDs continue to roll off the academic assembly line with little prospect of finding jobs half as secure as those we enjoy. We watch as the dignity of our profession is stripped away and, unless we are utterly obtuse, we can’t help but feel gratitude for our own good fortune.

But we are rightfully critical too, and aware of the distance between where public higher education is and where, in a prosperous, enlightened society, it might be. We wince and gnash our teeth at polls reporting that Republicans blame higher ed for the nation’s woes, and we see the writing on the wall. Whatever the future of public higher education holds in store, it is hard to believe it will survive in a form most academics would recognize or prefer.


Gratitude, then, like so many spiritually tinged notions, is double-edged. On the one hand, it is a vitally necessary and beautifully human impulse. Surely there is no one more miserable or pathetic than one who constantly complains, the perennial victim who is unable to access any sense of appreciation or agency. But in the quest to be that optimistic, spiritual person, it can be tempting to settle permanently into the narcotizing arms of gratitude, especially when others are urging us to “lighten up” and “count your blessings.” We desperately need, though, the sort of vigorous social protest that often emerges from visceral, contagious dissatisfaction.

If I am to be grateful, then, let me be fiercely, and not complacently, so. Let my gratitude for my own good fortune galvanize me into fighting for the same benefits for others that I now enjoy. Let me freely express my discontent and desire for a better world, impelled by appreciation for what is beautiful and good in my life, not shamed into silence by fear that I will be seen as just another whining, overindulged academic.

Selling the university to student customers: The elusive fantasy of bespoke education

At lots of struggling universities these days, part of the new game plan for attracting students revolves around the values of flexibility, choice, and individual preference. The notion that each student is unique, with gifts, challenges, and whims that ought to be accommodated, is consistent with what a contemporary, consumer-oriented social ethos seems to require. And I will not raise my voice against such a noble ideal.

Too often, and for far too long, young people have been slotted into institutions, majors, jobs, and even sexual identities, in ways that discourage exploration and experimentation, and ultimately stifle growth. Reshaping the university so that it encourages greater individual variation, including self-designed majors and flexible graduation goals, seems all to the good. And if students and their families arrive with fistfuls of tuition dollars to purchase the newly promised freedom and individual attention, fine.


Except, of course, that this bargain isn’t entirely on the up-and-up. The student-centered, personalized education model remains elusive at large, bureaucratic institutions. Student advising and counseling offices are understaffed. Online classes may accommodate students’ pajamas and busy work schedules, but most have assignment deadlines more or less like their face-to-face counterparts. Further, at universities like mine, core education requirements continue to push students along fairly scripted paths. The Titanic simply wasn’t built to allow a few passengers to take side trips to explore uncharted coves or deep sea chasms.

University marketing materials may feed the fantasy of one-on-one student engagement with research-active faculty, but students are more likely to encounter overtaxed adjunct instructors. Many of these folks barely have time to floss their teeth, let alone lie around on the quad debating Plato's Republic. Tenure-track and tenured faculty, too, face increasing demands as our numbers dwindle and bureaucratic service obligations grow. The pressure of responding to an endless stream of accommodation requests, student mental health crises, and proliferating academic progress reports can alone become overwhelming. In short, campus "look books" sell the image of unharried, student-centered professors even as long-term disinvestment in the professoriate guarantees that we exist as an increasingly endangered species.
Students quickly discover that the individual treatment and focused attention promised by the campus sales team don’t mean that their instructor will know their name or find time to meet with them after their shift ends at Target. The slick promises don’t save students from form letters, inscrutable across-the-board fees and charges, waiting in lines like “everyone else,” or being accountable to all sorts of deadlines. Small liberal arts colleges may be able to pull off the boutique educational experience, but larger institutions seem to have conformity and batch processing baked into their bones. And, evidently, many universities simply have no intention of putting their money where their mouths are when it comes to making good on those feel-good promises.

And so the mad rush to attract an ostensibly shrinking body of students threatens to devolve into a bait-and-switch scheme worthy of fly-by-night used car lots. When students actually arrive on campus demanding the accommodations and special consideration they have been promised, overtaxed professors, advisors, counselors, staff, and adjunct instructors will be left to tell them the truth: This mid-tier directional university you have selected is far less like a bespoke tailor shop than a Nordstrom Rack or TJ Maxx. There is quality to be found, to be sure, but don’t expect it to be hand selected, gift wrapped, and placed lovingly in your lap.

Those dazzling students who affirm our professorial egos

Like lots of academics who, for one reason or another, operate at the margins of professorial respectability, I have long been suspicious of flash-in-the-pan brilliance. You know, the supernova variety of intelligence that bursts forth and then disappears, bored by the prospect of showing up every day like a regular, reliable old sun. We are, of course, trained by popular culture to adore and admire the hot-dogging hero — so often a young white man — the one for whom it all seems to come so easily, and to eschew quiet competence.

It wouldn’t matter as much if it were only students who fell prey to such tropes, but we professors sometimes do too, leaning toward charismatic, clever, dramatically incisive students and colleagues like sunflowers toward daylight. Almost everyone is attracted to the glitzy show pony, it seems, the flashy lead singer, the dramatic outlier. We shouldn’t wonder, then, that so many of us internalize the message that it is better to be divinely gifted and prone to sporadic displays of brilliance than to be a responsible team player, ever prepared and willing to do the work and make space for others to do theirs.

I suspect that most of us know at least one professor who is susceptible to the rakish senior, the young man who breezes into class, only sometimes having actually done the reading, but so glibly literate that his insights shine like the lights on a Ferris wheel. And maybe we can’t entirely help it, especially at universities that attract so many students who are underprepared, undermotivated, and exhausted from full-time jobs at Target or Chili’s. It would actually make sense if we professors were even more likely than anyone else to fall for the charms of the carelessly agile intellectual acrobat.

After all, don’t many professors imagine themselves to have been so casually brilliant and precocious? The trope of the effortlessly talented intellectual superstar may fit into the personal mythology of some professors even better than it does for students. And don’t these same professors often delight in vaguely claiming credit for such amazing students even if they arrived at the door already poised, confident, and capable, even if they were unwilling to approach one’s class with anything beyond low-level diligence? At the same time, of course, professors may be just as likely to blame the university for the disappointments and failures of other, far less impressive, students.

It probably goes without saying that an ongoing attachment to the raced and gendered Hollywood trope of the lone genius ultimately serves no one, especially if we remain the slightest bit ignorant of our own susceptibility to it. We probably consciously know that we’re supposed to prize the workaday heroes, the invisible majority who show up with no expectation of applause or adulation. We are mature and enlightened enough to have figured out that the most productive, creative people in any field really are not those who burst forth like an occasional geyser when the mood strikes them.

But like crows, we can still be distracted by shiny objects, especially when they might serve to decorate our own egos — that is, to reassure us that, despite appearances, we really are making a dramatic intellectual and pedagogical impact and not merely treading mediocre water at a second-tier directional state university. We may even fall in love with the superstars, OUR superstars, at least a little bit, not because of the contributions we imagine they will make to the world but because of the affirmation they provide to our own identities. At universities where professors mostly complain about underprepared and undercommitted students, what better proof can I offer of my own exceptionalism than a dazzling acolyte?

Let’s take those anti-college Republicans at their word

Maverick educator though he was, Plato’s Socrates fretted about a newfangled technology known as writing. Relying on reed pen and papyrus, he worried, could wreck men’s memories and send his beloved Athens into a spiral of dull-witted decline. His concerns seem quaint, even silly, until we consider the recent Pew Research Center report suggesting that most Republicans now think that college is bad for society. Certainly, it captures something about the red-blue divide since, at the same time, 70ish percent of “liberals” still think higher ed is pretty nifty.

On the one hand, there’s nothing to see here. Conservatives, especially religious fundamentalists, have long made a hobby of vilifying education, aware enough of its radicalizing potential to pursue radical means to control it. After all, Socrates died for his supposed heresies, to say nothing of poor Giordano Bruno, the long house arrest of Galileo, and the beatings inflicted on enslaved Africans learning to read. There are, unfortunately, endless examples of outraged conservatives silencing intellectuals and creatives in the name of God and country. The current anti-intellectualism in the U.S., too, is grounded in a values divide with unbearably high stakes, including attitudes and policies about climate change, the rights of people of color, women, and immigrants, and what it means to be a free citizen.

If this weighs especially heavily on my mind, it is partly because I am a professor from a red town in a red state in a flamingly red region. I am, ostensibly, a veritable case study of the kid who went off to college and emerged unrepentantly and permanently dangerous to society. For anyone who thought I should have married a local boy, become a P.E. teacher (my mother’s early vision for me), and raised a few blond kids, college did, in fact, wreck me. From the moment I arrived on campus — supported and encouraged by my father and stepmother — worlds opened, intellectually, creatively, and socially. Although I avoided the freshman weight gain, college helped me expand in every other respect. New paths led to new roads of experience and perspective that made me and my hometown ever stranger to one another. There was never much chance I would return to it or that it would welcome me if I did.
The narrative of the Republican far right — with much help from the kajillionaire Koch brothers and their ilk — is that colleges are left-wing cults, inculcating young people into extreme political liberalism and libertine lifestyles. And I guess the supposed divide between the values of small town America and the dangerous “college type” is perfectly realized in me, a lesbian in a Subaru who eats organic, reads a ton — I am a philosopher — and hasn’t set foot in a church since MC Hammer rocked those iconic pants. As a professor who teaches such “politically charged” courses as LGBT Studies and Queer Theory, I am the poster child of what conservatives object to about higher ed today, a threat to their very way of life.

Except that, as an independent-minded critic of unearned social and economic privilege, my hard working father helped radicalize me long before I went off to college. And my uneducated mother’s eclectic and open-minded approach to friends, food and books set me up to embrace the ideological and aesthetic challenges I encountered on campus. Anyone who blames college for ruining me has no idea how annoyingly philosophical and incipiently political I already was before college had its way with me. It’s probably just as fair to say that college made me a more mature version of myself than that it fundamentally changed me. I suspect this is true for most college students though, of course, I can’t say. But it does seem that those extreme, anti-college Republicans both underestimate and overestimate the influence that the experience has on actual young people.

Anyone truly surprised by this “new” anti-college stance underestimates the power and tenacity of America’s grand tradition of anti-intellectualism, its ties to religious fundamentalism, and the impact of economic disparity and public disinvestment in higher ed (which is, of course, partly a product of anti-intellectualism). When one adds in the concerted anti-college media campaigns of college-educated fat cats, it is a miracle that all of red America is not disgusted by professors like me. And, no surprise, it turns out that it’s mostly the non-college-educated Republicans who are so vehemently against it, like those home-bound Americans who insist with great authority that Europe is overrated. The way to build popular support for college, as for most worthwhile experiences, is almost certainly to make it more available, which is, perhaps, partly why so many Republican fundamentalists fight to make it inaccessible.
At any rate, this current flare of anti-intellectualist religious fundamentalism does us professors and society great harm. It can result in our being harassed, fired, and much, much worse. But what it cannot do is compel us to reason with it, or, in hand-wringing fashion, to psychologize it in some pseudo-compassionate attempt to understand those benighted red-staters. We need not debase ourselves or our critics by second-guessing them or imputing deeper motives to such proud ignorance. There is nothing shameful about ignorance, of course, but I can say with perfect ease that the proudly ignorant should damn well be ashamed of themselves. Though I can strive to understand the climate deniers, conspiracy theorists, and the new crop of flat-earthers as a sociologist or anthropologist might, that is not the same as one citizen respectfully engaging with another in healthy, authentic dialogue.

The fundamentalists burned Giordano Bruno at the stake, but they could not compel him to make apologies for their murderous behavior. If the Republican fundamentalists wish to scapegoat higher ed, then let us college types respect them enough to take them at their word. We do them no favors by talking about them or to them as if they were children or fools to be placated. Such pious “understanding,” of course, is the very bleeding-heart liberal strategy that they despise. Instead of trying to argue with them about how awesome we are, we should continue to do our jobs well and focus on higher ed accessibility. Those who go to college may not fall in love with the ivory towers and ivy-covered walls, but very few will leave concluding that it is professors, or knowledge itself, that is responsible for the rising tide of greed, nastiness, and national insecurity.