Archive for ‘academic ax-grinding’


“I’m so cool and calculated, alone in the modern world…”

It’s becoming a genre unto itself: the call by scholars of the Middle Ages to invigorate their fields by reaching out to new audiences. In the latest example at The Chronicle of Higher Education, medievalist and English professor Christine Schott asks an evergreen question—“[h]ow can literary scholarship make a claim for its value when its product reaches only the other members of its own narrow field?”—and writes with candor about her work:

Of course I have an interpretive argument about the marginalia I study, and I do not wish to abandon that side of the field either. I am reasonably capable of dressing up my theories about material culture, genre, and self-writing in fancy vocabulary, but I maintain that they are no smarter for being decked out in academic regalia. And when it comes down to it, I don’t want to write scholarship that my friends and nonacademic peers cannot understand.

Schott plucks a painfully abstruse passage from a 1993 book about literary theory and boils it down to a lovely, clear, informative sentence—a rare skill. I’ve considered the rebuttals by humanities scholars who claim that specialized fields need their own patois, and since my career isn’t at stake, I can say that I find those defenses bunk; you can dazzle your colleagues with rarefied terms without writing in a style that makes the rest of us laugh out loud. Schott is wise to be sensitive to outside perceptions:

When I talk to fellow scholars, I might frame my work as “the study of paratextual material in late medieval vernacular scribal culture.” Even I hate the sound of that sentence. Let me offer, instead, the version I gave my Aunt Bea, who once ventured to ask me what I work on. I told her, “I study the things that people wrote in the margins of books in medieval Iceland.” When I said that, Aunt Bea wasn’t exactly impressed, but she did understand exactly what I meant.

Actually, what she said was, “They give Ph.D.s for that sort of thing, huh?” A familiar response from anyone who, like my aunt, works in a nice, practical field like nursing. And yet I get excited by a reaction like hers, because that is a teaching moment.

Schott’s solution is “to write even our scholarly work for a popular audience.” That’s a great idea—but why be so conservative? After all, professionalism hasn’t smothered her joy:

I always launch into a litany of the wonderful things one finds in the margins of Icelandic manuscripts: poetry, proverbs, complaints (my pen is dull, I didn’t get enough fish to eat, my wife is mad at me and it’s not my fault — all real examples). Part of the value of my work as I see it, then, is simple translation: “nu kolnar mér á fingrunum” means nothing to most people. But “my fingers are getting cold” is both transparent and so delightfully human that people often comment on how un-foreign these complaints sound. I don’t think you should have to get an advanced degree to enjoy these little glimpses into long-forgotten lives.

Look at that: the enthusiasm that makes non-scholars light up, the humanism they crave but can rarely describe, and the simple eloquence of someone who is uniquely suited to give them both.

“When I suggest changing our target audience,” Schott writes, “what I’m really talking about is marketing, and we are rightly suspicious of treating intellectual pursuit as a commodity.” Those of us who’ve migrated from academia to writing and the arts understand those concerns. I get tired of hearing that we can’t be only writers anymore, that we need to become experts at marketing and branding. Call it advocacy, then; no one else is standing by to champion us, and clearly there are ways to do it that don’t cheapen your work. Heck, more than two million American teenagers have had a blast with poetry because a former Kool-Aid marketing executive knew when to stop talking and how to start doing.

And so my humble advice to medievalists is this: stop talking about hypothetical outreach and do something. Write a book for a trade press. Spin your scholarly insights into poems. Produce a podcast. Start a blog. Make YouTube videos or Vines or a novelty Twitter account. Stage a play. Lecture at your local Osher center. Pitch articles to trendy media outlets like NPR or The Atlantic. Translate texts for non-scholars. Give the good work of strangers the attention you wish your own were receiving. You decide where to draw your own line. After you stare down a few frowning peers, the way is less fraught than you think: You won’t make enough money to fret about your soul, and you’ll compromise your scholarship only if you pander to your audience or fail to beguile them with the promise of much larger worlds.

I’ve written before that if the circles of scholars, writers, and artists overlapped more than they do, we’d all benefit. Professor Schott sees that we’re in danger of entombment in our own narrow niches:

What is literary scholarship for if not to aid readers in appreciating, understanding, interpreting, and questioning the literature that they encounter? In writing for a tiny coterie of specialists, we may achieve great heights of intellectual pursuit, but we are generally preaching to the choir. If we are not content with our society turning into a post-literary world, then we have some proselytizing to do, to people like my Aunt Bea. That is not marketing, that is teaching.

Indeed it is, and I hope Schott will share her enthusiasm wherever she can. The right blend of scholarship and passion can hearten the rest of us with all the thrilling alchemy of art.

“Ah, you are in your prime, you’ve come of age…”

“Outreach” is the kale of academia: everyone agrees it’s healthy, but they’re not always eager to make it a part of their lives. My hat is off, then, to Richard Utz, a scholar of medievalism at Georgia Tech, for his willingness to ride out to the market square and kick around big questions about the state of his field. A few weeks ago, The Chronicle of Higher Education published part of the plenary speech Utz delivered in May at the International Congress on Medieval Studies. I’ve been moving truckloads of books to a new home in the country, so this is my first chance to dig into the piece. Despite the stupid title the editors gave it—“Don’t Be Snobs, Medievalists”—it’s a worthy start, even if I found myself cuisse-deep in the questions it raises.

Utz writes:

It is clearly time to lower the drawbridge from the ivory tower and reconnect with the public.

One way to do this is to intervene aggressively in the media when the French National Front appropriates Jeanne d’Arc, New Hampshire legislators feel textually beholden to the Magna Carta, British politicians combat contemporary jihadism with a late medieval treason law, or Prince Philip is appointed to a knighthood of the Order of Australia, a title the illustrious heritage of which dates back to ye olde 1975.

What does it mean to “intervene aggressively”: stand on the drawbridge and denounce sinful readings of history? By what criteria? It’s not a question of accuracy: Utz links to stories about European nationalists, British Conservatives, American Republicans, and cranky Prince Philip (as if they’re all the same) but later he praises the Society for Creative Anachronism. The SCA and its members have done tremendous work in material culture, folklore, and martial arts, but as an organization whose mission is often informally characterized as creating “the Middle Ages as it should have been,” it also has a fantasy wish-fulfillment faction, and it redacts a vital force in medieval culture: religion. Is that not at least potentially a problem for academia? Does the group get a pass from the Medievalist Police because they’re nicer or generally more liberal? I don’t want (or trust the proponents of) a medievalism that seeks to justify every facet of liberalism any more than one that serves as a conservative catechism or nationalist blueprint.

Even so, Utz sees promise in meeting at least certain elements of the public on their own turf:

Add these efforts together, and we medievalists might extricate ourselves from the isolationist confines of 19th- and 20th-century medieval studies and embrace a broader and more egalitarian mélange of academic and popular medievalisms. If we join ranks with the so-called amateurs, we will ensure a continued critical as well as affective engagement with medieval culture. In the process, we might revivify our discipline and contribute to the health of the humanities.

I respect Utz’s aims, but I’m skeptical of his plan. In the past eight years, I’ve written more than 160 blog posts about medievalism, a few of which have gone, if not viral, at least naggingly bacterial, including one about a Charlemagne quote from an Indiana Jones movie that’s drawn tens of thousands of readers. I’ve written both a middle-school textbook and a moderately successful midlist pop-history book about Charlemagne. I’ve given talks about Charlemagne at libraries, museums, and book festivals. I’ve promoted a book of medievalist poetry inspired by a Gothic cathedral. I’ve translated a Middle Scots romance and published shorter translations here on the blog and in scholarly and literary journals. I’ve even dabbled in applied paleobromatology and shared my clunky efforts at retro, medieval-themed instant photography. I did these things not to advance an academic career but because the Middle Ages provided a rich matière for the creative work that occupies my spare time—but if I had done these things as a scholar engaged in public outreach, or if academia had paid more attention to me, would it matter?

Utz writes as if the scholarly world is not just doomed, but scarcely deserving of survival:

The Society for Creative Anachronism has added more to our knowledge of medieval culture by practicing blacksmithing, re-enacting the Battle of Hastings, and performing historical dance than D.W. Robertson’s decision, albeit substantiated by learned footnotes, that all medieval art was created and needs to be read according to the principles of patristic exegesis. Similarly, Michel Guyot’s megaproject of rebuilding a medieval castle, Guédelon, from scratch over a 30-year period, based on 13th-century building plans and without modern technology, yields infinitely more information than another 50 essays obsessing about the authorship of the anonymous Nibelungenlied or Cantar de Mio Cid. Moreover, sites like medievalists.net and publicmedievalist.com communicate valuable information more effectively to academic and nonacademic audiences than dozens of academic journals accessible at subscribers-only sources like JSTOR or Project Muse.

Scholars have indeed failed to bushwhack through old-growth clichés to reach the public; the late Norman Cantor identified the problem more than 20 years ago. But Utz points out an important and underappreciated supply-and-demand clash:

[T]here is now a manifest discrepancy between the large number of students who request that we address their love of Harry Potter, Lord of the Rings, Game of Thrones, and medieval-themed video and computer games on the one hand, and the decreasing number of medievalists hired to replace retiring colleagues on the other.

When I was an adjunct, the director of the English department started me off with one medieval lit course and laughed at my hope that there’d ever be more. In the decade that followed, student demand let me revive the other three medieval courses in the catalog. Now that I’m outside Utz’s drawbridge, I wonder if there shouldn’t be less talk about impressing the public and more effort to win over university bureaucrats, especially lapsed humanities scholars who act like they’re managing a Walmart distribution hub.

I also wish Utz had clarified what he means when he says that students “request that we address their love” of the popular media of the moment. Do they want professors to pontificate about their favorite TV shows? That strikes me as a disheartening waste of brainpower and money—but my hope is that they want something more. Speaking as a kid whose medieval interests were partly rooted in childhood enthusiasm for fantasy games, I’d urge Utz and his colleagues to promise wonderful new realms to their students: history that illuminates human nature, the keys to unlocking eldritch languages, artistic and theological glimpses into the medieval mind—uncool things that endure deep within us long after entertainment companies neglect their latest love-child.

Utz alludes only briefly to “the health of the humanities.” I wish these discussions weren’t always so polar, with academia on one end and TV and video games on the other. What about other eclectic, unaffiliated souls? I’ve met or discovered the work of several such people: Lex Fajardo, author of Kid Beowulf, a series of all-ages graphic novels inspired by his love of world epics; Nancy Marie Brown, the admirably prolific author of mass-market books about the Lewis Chessmen, the Vikings, the Eddas, and Pope Sylvester II; remarkable medieval-inspired poets like Maryann Corbett and Becky Gould Gibson; or novelists like Tod Wodicka. I wonder: What would they do if they came to a scholarly conference? Would it still be a scholarly conference? Would scholars support them right back? Just as those retiring medievalists aren’t being replaced, writers and artists are watching their audiences fragment and shrink. The larger culture doesn’t care, but those of us who have never felt entirely at home on either side of the drawbridge would welcome new allies in seeking the true and the real. Sometimes it’s nice not to lurk in the moat.

“But Lorca’s corpse, as he had prophesied, just walked away…”

How did art become irrelevant? Michael J. Lewis’s answer to that question in Commentary magazine made the rounds of social media last week. It’s an exhaustive overview, and a political one, edged with fine anger, a reminder that the arts used to be merely elitist, not ruthlessly hermetic.

So I was startled to open the latest issue of the literary magazine The Dark Horse and find “Poetry as Enchantment,” an essay by former NEA chairman Dana Gioia that makes many of the same points Lewis makes, but solely about poetry, and with a far more subdued tone. Defending poetry as a universal human art with roots in music, charms, and incantations, Gioia recalls that not long ago, it was ubiquitous and widely enjoyed. I remember that too: my grandfather was a machinist with a grade-school education, but he could rattle off snippets of verse that I now know were the work of Longfellow, Joyce Kilmer, and the (utterly forgotten) Sam Walter Foss.

What happened? Gioia argues that poetry was too well taught. The New Critics imposed reason, objectivity, and coherence on it. “Learning” poetry was reduced to dissection and analysis, then demonstrating your fluency in each new school of critical theory:

For most students, writing a critical paper does not inspire the same lifelong affection for poetry that memorization and recitation foster. When analytical instruction replaces the physicality, subjectivity, and emotionality of performance, most students fail to make a meaningful connection with poetry. So abstracted and intellectualized, poetry becomes disembodied into poetics—a noble subject but never a popular one. As the audience for poetry continues to contract, there will come a tipping point—perhaps it has already arrived—when the majority of adult readers are academic professionals or graduate students training for those professions. What is the future of an art when the majority of its audience must be paid to participate?

No one intended the decimation of poetry’s audience or the alienation of the common reader. Like most environmental messes, those things happened as accidental by-products of an otherwise positive project.

Gioia literally marketed Kool-Aid as an executive at General Foods—who can ever forget his award-winning “fear in a handful of dust” campaign?—so when he became NEA chairman in 2003, he wanted measurable results:

We decided to start with a program that could be executed quickly on a large scale without a huge investment. What we conceived was a national poetry recitation contest for high school students that would begin at class level, and then move on to school, local, state, and national competitions. We successfully tested the idea in Chicago and Washington, D.C., but when the agency tried to expand it, the arts education officials in the 50 states initially refused to adopt it.

The state arts education experts had four major objections to the program. First, they believed that students hated poetry. (Am I wrong to suspect that this assumption suggests that the experts themselves disliked poetry?) Second, they maintained that memorization was repressive and stifled creativity. Some of them added that memorization victimized minority students since standard English was not spoken in their homes. Third, they unanimously felt that competition had no place in the arts. There should be no winners or losers. In arts education, everyone should win. Finally, there was a general feeling among the educators that poetry was too intellectual for the average student. It was not an accessible art.

Just how wrong were those “state arts education experts”? Gioia found that kids raised with hip-hop took to poetry when it became about hearing and reciting rather than reading and analyzing; they loved competing; “problem kids” turned out to be great at it; and immigrant kids have made up around half of all winners each year.

Gioia is too gracious to gloat. About his detractors, he says only this: “The administrators and arts consultants were openly astonished by the program’s popularity.” I wonder why they doubted him: His longtime championing of old-fashioned formalism? His corporate background? His presumed political affiliation? It’s a dreary state of affairs when ignorance is the most charitable explanation.

Someone recently quipped on Facebook that it’s actually a great time to be a poet, because when an art has zero social cachet, the people who do it out of sheer love don’t have to wonder if others are into it for the wrong reasons. That may not be forever true: Gioia’s Poetry Out Loud program has already engaged 2.5 million high-school kids, and books like the Disney Channel poetry anthology are bracing their younger siblings. What if channeling rap fandom into national recitation contests actually entices the corpse of poetry to sprout and bloom some year? Relevance would uproot academia; it wouldn’t be kind; it would set poets slogging through swamps of conflict and commerce, not noticing how many more people had finally learned that they’re meant to talk about what Gioia calls “mysteries that lie beyond paraphrase,” their inheritance as human beings.

“Yeah, proof is the bottom line for everyone…”

In 1994, Norman Cantor was gearing up for his fourth year of besiegement after the release of Inventing the Middle Ages, a mass-market book in which he sought to show how the formative experiences of certain twentieth-century medievalists explained the ways they interpreted history. Fellow historians didn’t like his blunt biographical approach—and so in “Medievalism and the Middle Ages,” a lively but little-read article in The Year’s Work in Medievalism, Cantor hammered back at “establishment dust-grinders” by holding up the movie Robin Hood: Prince of Thieves as “a highly significant core defeat” the academy hadn’t even known it had suffered:

It shows how little the academic medievalists have made an impact on popular culture and its view of the medieval world. Costner’s Robin Hood signifies social failure for the Ivy League, Oxbridge, and the Medieval Academy of America. But I expect the august personalities in those exalted precincts never gave a moment’s thought to this connection.

I recalled Cantor’s smart, spirited (and, in retrospect, debatable) rant when I read last week’s Chronicle of Higher Education piece by Paul Dicken, a philosopher of science who’s keen to write for popular audiences despite the sneering of colleagues and peers:

Yet as I struggle on with my apparently misguided endeavors, I sometimes think that maybe the search committee had a point. It is difficult pitching academic material in a way that is suitable for a popular audience. I don’t pretend to be an unparalleled communicator of ideas, nor do I kid myself about my ability to produce pithy and engaging prose. After many years of writing for peer review, I have developed a nasty habit of overusing the passive voice — not to mention the usual reliance upon jargon, excessive footnotes, and the death by a thousand qualifications that undermines any attempt to state a clear, precise thesis. It is definitely a learning process. But no matter how dull the final product, I was at least confident that I could express my ideas clearly. That’s what we’re trained for, right?

I’ve known plenty of scholars who write lucid books and blogs; I doubt the academy nurtured the requisite skills.

When I decided to start writing in earnest, I drove wildly around England and Wales collecting material for travel stories. The Washington Post published two of them, but only after an editor nudged me with notes like this one from 1999:

I don’t think this lede works; it’s too slow and diffuse for our reader—imagine a bagel-eating Sunday morning householder, an occasional traveler seeking a weekly fix of travel infotainment—but surrounded by a pile of other sections tugging at his time, and household things about to start tugging too…this is different from someone who settles in for a long night with a New Yorker and a hot toddy.

A good editor knows how to improve and refine our writing without shearing off all of the frills and frippery we vainly adore. Thanks to that guy and a couple others like him, I sloughed off three-and-a-half years of bad grad-school style and (eventually, arguably) learned how to write. Paul Dicken, stick to your plan: keeping readers engrossed in weighty matters without overusing the passive voice or condemning them to “death by a thousand qualifications” doesn’t require “an unparalleled communicator of ideas.” Just know your audience, then decide that what you’re doing is, among other things, art.

* * *

We’re overdue for great shifts in our obsolete cultural coalitions; the creaking we hear as they seize up and fail is also the venting of truths. In another Chronicle of Higher Education piece last week, philosopher and science historian Lee McIntyre decries the recent “attack on truth” that he believes has us ambling into “an age of willful ignorance”:

It is sad that the modern attack on truth started in the academy — in the humanities, where the stakes may have initially seemed low in holding that there are multiple ways to read a text or that one cannot understand a book without taking account of the political beliefs of its author.

That disrespect, however, has metastasized into outrageous claims about the natural sciences.

Anyone who has been paying attention to the fault lines of academic debate for the past 20 years already knows that the “science wars” were fought by natural scientists (and their defenders in the philosophy of science) on the one side and literary critics and cultural-studies folks on the other. The latter argued that even in the natural realm, truth is relative, and there is no such thing as objectivity. The skirmishes blew up in the well-known “Sokal affair” in 1996, in which a prominent physicist created a scientifically absurd postmodernist paper and was able to get it published in a leading cultural-studies journal. The ridicule that followed may have seemed to settle the matter once and for all.

But then a funny thing happened: While many natural scientists declared the battle won and headed back to their labs, some left-wing postmodernist criticisms of truth began to be picked up by right-wing ideologues who were looking for respectable cover for their denial of climate change, evolution, and other scientifically accepted conclusions. Alan Sokal said he had hoped to shake up academic progressives, but suddenly one found hard-right conservatives sounding like Continental intellectuals. And that caused discombobulation on the left.

“Was I wrong to participate in the invention of this field known as science studies?” Bruno Latour, one of the founders of the field that contextualizes science, famously asked. “Is it enough to say that we did not really mean what we said? Why does it burn my tongue to say that global warming is a fact whether you like it or not? Why can’t I simply say that the argument is closed for good?”

“But now the climate-change deniers and the young-Earth creationists are coming after the natural scientists,” the literary critic Michael Bérubé noted, “… and they’re using some of the very arguments developed by an academic left that thought it was speaking only to people of like mind.”

Having noticed, as Norman Cantor did, how rare it is for new discoveries about the Middle Ages to prosper off-campus unless they’re being exploited for linkbait, I was startled by this whole line of thought. I’ll have to read McIntyre’s book to see if it’s true that postmodernist humanities scholars influenced “hard-right conservatives” or “climate-change deniers and the young-Earth creationists.” I doubt it, although I suspect that the latter have at least heckled the former to live up to the credos implied by their critical approaches, but what a remarkable admission: that a fair amount of recent work in the humanities is baloney that was never meant to be consumed, sold, or even sniffed by outsiders.

Humanities theorists have insisted for years that when we set our work loose, it’s no longer our own. They’ll find in the end that intentions still matter: there’s more pleasure and solace in writing and art when you believe what you’re doing is true.

“Unsheathe the blade within the voice…”

Is polysemy now unseemly? Two weeks ago, when historian Steve Muhlberger traveled to that great North American ent-moot, the International Congress on Medieval Studies, he found himself in the midst of “a lot of griping and grouching about the misuse and ambiguity of the word medieval.” In a lucid and laudably concise blog post, he calls out the problem behind the problem:

You would think that a bunch of scholars who by their very nature of their discipline are experts in the evolution of the meaning of words would by now have gotten over the fact that though it doesn’t make a lot of sense to call “the Middle Ages” by that term, and that coming up with a really good, chronological definition of those ages is impossible, we are stuck with the words medieval and Middle Ages anyway. But no . . .

Steve is a scholar of chivalric tournaments and an experienced combat reenactor, so he knows how to land a disarming blow:

This can be intensely irritating for people who know that certain phrases and analyses lost their cogency back in 1927 and want to talk about what their friends are doing in the field now. Nevertheless people whose business is words should really accept the fact that words like “medieval” have a number of popular meanings, and when one of them shows up in current discussion (when, for instance, a Game of Thrones shows up and is widely labelled as medieval, even though the world of Game of Thrones is not our earth at all), the fact can be dealt with a good-humored way. It certainly would reflect credit on any field where a good-humored approach was the norm.

It would indeed. Off campus, the world blissfully resists more than a century of scholarship—pop culture still depicts Vikings in huge horned helmets, for heaven’s sake—and I respectfully suggest that more scholars contemplate why this is so.

As the rare soul who’s read every volume of Studies in Medievalism, I’ve marveled at the field’s mania for nomenclature. Since at least 2009, contributors to the journal—and its sister publication The Year’s Work in Medievalism, and its annual conference, and a pricey new handbook of critical terms—have kicked around the meaning of “medievalism” and “neo-medievalism” until every syllable simpers for mercy. Because I write about medievalism not as a professional scholar but as a footloose amateur, I miss the many years of meaty articles explaining, say, how boys’ chivalric clubs helped inspire the American scouting movement or why we’re perpetually tempted to make Dante a mouthpiece for generational angst. Forged from an accidental alloy of romanticism, nostalgia, politics, religion, and wishful thinking, medievalism can’t help but have jagged edges. It’s tiring to hone terms of art so finely that they cease to exist in three dimensions; we may as well flaunt the imperfection.

When it comes to the matter of the merely medieval, here’s Steve Muhlberger again:

David Parry made the most sensible remark of the entire week when he pointed out that an imprecise word like medieval has a lot of cultural value for people who make their living interpreting that era. Indeed there is a financial payoff being associated with it.

What’s the worth of a timeworn coinage? Steve’s full blog post answers that question, with the suggestion that settling on terms can pay other, less measurable dividends too.

“The story is old, I know, but it goes on…”

With its mix of sunshine and harmless bluster, September brings back-to-school nostalgia—ivy-covered professors, that first fall riot, scoldings for being insufficiently euphoric over sports—and perhaps that’s why the past two weeks have swirled with stories about the woes of humanities types in academia. I’ve watched would-be scholars expire en route to the ferne hawle of full professorhood for 20 years, so I’m guessing that many grad students and adjuncts have newly discerned, with the sort of creeping, pitiless dread otherwise confined to Robert E. Howard stories, that they won’t find long-term employment.

First, at The Atlantic, Jordan Weissmann asked why the number of grad students in the humanities is growing. Then, Slate ran a piece about the awkwardness that still hangs about people with doctorates in the humanities who land “alt-ac” careers—that is, jobs where they don’t teach college. Apparently, though, there aren’t enough such lucky people, because a few days later, Salon covered adjunct professors on food stamps.

With all the attention this subject now gets in the press, I can only hope that fewer souls will fling themselves into the hellmouth—but maybe academia shouldn’t have undone quite so many in the first place. While reading about medievalism in recent days, I found two historians who sensed where things were headed long ago.

The first was Karl F. Morrison, who wrote “Fragmentation and Unity in ‘American Medievalism,'” a chapter in The Past Before Us, a 1980 report commissioned by the American Historical Association to explain the work of American historians to their colleagues in other countries. Morrison writes candidly about his field, but he also makes an especially prescient extrapolation, which I’ve bolded:

There was also an expectation in the “guild” that investment in professional training would, in due course, fetch a return in professional opportunity.

By 1970, these benefits could no longer be taken for granted. By 1974, even the president of Harvard University was constrained to deliver a budget of marked austerity, reducing “the number of Assistant Professors substantially while cutting the size of the graduate student body below the minimum desirable levels.” The aggregate result of many such budgets across the country was a sharp reduction in the number of professional openings for medievalists, and an impairment of library acquisitions and other facilities in aid of research. Awareness of this changed climate impelled a large number of advanced students to complete their doctoral dissertations quickly, producing a bulge that is noticeable around 1972-1974 in our tables. For many reasons, including the deliberate reduction or suspension of programs in some universities, it also resulted in a decline in the number of graduate students proceeding to the doctorate.

In effect, the historians who became qualified during this period without being able to secure professional employment constitute a generation of scholars that may be in the process of being lost, casualties of abrupt transition. There is no reason to expect that the demographic and economic trends that so sharply reversed their professional expectations will alter before the end of the century, and this projection raises certain quite obvious possibilities regarding the diversity and renewal of the profession.

Fast forward to 1994. Norman Cantor was gearing up for his fourth year of professional besiegement after the release of Inventing the Middle Ages, a book for non-academic readers in which he sought to show how the formative experiences of certain 20th-century medievalists explained the ways they interpreted history. Fellow historians didn’t like his blunt biographical approach—and so in “Medievalism and the Middle Ages,” a little-read article in The Year’s Work in Medievalism, Cantor hammered back at “establishment dust-grinders” and noted, in passing, the crummy academic job market and the prevalence of certain “alt-ac” career paths even then:

Within academia a fearful conservative conformity prevails. The marginal employment situation has a twofold negative impact. First, it discourages innovative minds and rebellious personalities from entering doctoral programs in the humanities. People in their late twenties and thirties today with the highest potential to be great medievalists and bridge academic medieval studies and popular medievalism are a phantom army, a lost generation. Instead, for the most part, of climbing the ladder at leading universities they are pursuing careers (often regretfully and unhappily if well-paid) in major law firms.

Second, even if imaginative people take Ph.D.’s in medieval disciplines, they face the job market and particularly once they get a prized tenure track post they encounter a chilling intellectual conservatism that frustrates expressions of their best thoughts and deepest feelings.

I like Cantor’s claim that academia is literally conservative. After all, people are still fretting over problems that he and Morrison noticed decades ago. It’s September 2014, yet Rebecca Schuman at Slate can still write: “The academic job market works on a fixed cycle, and according to a set of conventions so rigid that you’d think these people were applying for top-secret security clearances, not to teach Physics 101 to some pimply bros in Sheboygan.”

The early blogosphere was rife with humanities grad students and adjuncts wavering between disgruntlement and despair; the much-praised Invisible Adjunct rose up to unite them in discussions so civil that I can scarcely believe I saw them on the Internet.

As someone who writes about people who use the imagined past to carve out identities, argue from authority, resist mainstream culture, or seek respite from the real world, I think I understand why the number of new students in arts and humanities doctoral programs grew by 7.7 percent in 2012, but I can’t claim a moment’s nostalgia for the geeky excitement they surely must feel. Morrison and Cantor both imagined a lost generation, but their jobless contemporaries were merely wandering. For this next generation, that luxury is long gone—as is the prospect of claiming that nobody warned them.

“Empty-handed on the cold wind to Valhalla…”

For all the violence the Vikings unleashed, their enemies and victims might find cold comfort in the torments Americans now inflict on them. We’ve twisted them into beloved ancestors, corny mascots, symbolic immigrants, religious touchstones, comic relief—and, this week, proponents of gender equity on the battlefield. The medieval past is grotesque, uninviting, and indifferent to our hopes. We wish so badly that it weren’t.

“Shieldmaidens are not a myth!” trumpeted a Tor.com blog post on Tuesday, sharing tidings of endless Éowyns in the EZ-Pass lane to the Bifröst:

“By studying osteological signs of gender within the bones themselves, researchers discovered that approximately half of the remains were actually female warriors, given a proper burial with their weapons . . . It’s been so difficult for people to envision women’s historical contributions as solely getting married and dying in childbirth, but you can’t argue with numbers—and fifty/fifty is pretty damn good.”

Great Odin’s ophthalmologist! Holy hopping Hávamál! Half of all Viking warriors were women?

Alas, no. “Researchers discovered” nothing of the sort—but that didn’t stop wishful linkers from sharing the “news” hundreds of times via Twitter and countless times on Facebook.

So what’s going on here? Besides conflating “Viking” with “Norse,” the pseudonymous author of the Tor.com blog post misread a two-year-old USA Today summary of a 2011 article by scholar Shane McLeod, who most definitely has not delivered forsaken warrior maidens from their long-neglected graves. No, McLeod simply did the un-newsworthy work of reassessing burial evidence for the settlement of Norse women in eastern England in the late 800s, with nary a Brunhilde or Éowyn in sight.

You can find “Warriors and women: Norse migrants to eastern England up to 900 AD” in the August 2011 issue of the journal Early Medieval Europe. If you don’t have institutional access to scholarly databases, the article is imprisoned behind a $35 paywall, which is a shame, because although McLeod’s piece requires a slow, patient read, you don’t need expertise in ninth-century English history or modern osteology to understand it—just the ability to follow an argument about a couple dozen skeletons in a tiny corner of England at a very specific time in history, plus an openness to the possibility that McLeod hasn’t brought your “Game of Thrones” fantasies to life.

Here’s the gist of McLeod’s article, as concisely as I can retell it:

Focusing only on the area of eastern England occupied by the Norse in the 800s, he looks at one sample of six or seven burials from five locations dating from 865 to 878 A.D. where scholars had made assumptions about the sex of the dead based on the stuff buried with them. He compares them to a second sample: 14 burials from five sites (dating from 873 to the early 10th century) where osteologists determined the sex of the dead by examining their bones.

In the first group, only one person was tagged as female. In the second group, between four and six of the dead, perhaps half of the sample, were found to be female, even though based on grave goods, at least one of them might previously have been assumed to be male, because one of those women was buried with a sword. (Ah, but that woman was also interred with a child of indeterminate sex. What if the sword belonged to her young son? And look: someone in the first group who might have been a woman was buried with a sword, too…)

McLeod’s assessment is this: If we scientifically determine the sex of the dead based on their bones rather than assume their sex based on grave goods, we find more evidence (to pile atop existing evidence from jewelry finds) that Norse women came to England with Norse armies, earlier and in greater numbers than previously thought, rather than in a later wave of migration and settlement. Perhaps the men weren’t “a demobbed Norse army seeking Anglo-Saxon wives,” but intermarried with local women in smaller numbers than historians previously believed.

For the lay reader, that’s a disheartening hoard of unsexy conclusions—and a far cry from the Tor.com blogger’s claim, mindlessly brayed across social media, that “Half of the Warriors were Female.” It’s fantasy, not scholarship, and certainly not science, to interpret one woman buried with a sword, maybe two, as evidence for Norse women in combat.

Shane McLeod deserves better. Working with limited data pried out of ninth-century crevices, he recognizes that his sample size is tiny, that it’s tough to identify burials as “Norse” for sure, and that his findings are only “highly suggestive.” He’s precise, tentative, and conscious of counter-arguments, and he seems willing to go wherever the evidence takes him. His biggest accomplishment, however, is highlighting a major scholarly error. Experts who made assumptions about male versus female grave goods failed to reassess the biases they project backwards onto the Middle Ages—even though doing so is one of the traits that even the most pop-minded academic medievalists often claim distinguishes them from the duct-tape-sword-wielding masses.

Likewise, science-fiction fans are forever congratulating themselves for holding the right opinions on such subjects as evolution, but this time they lazily succumbed to fannish fantasies, failing to question a claim that deserved to be pummeled by doubt. I’ve done tons of social-media copywriting, so I understand why that blogger just wanted to throw something out there after a holiday to beguile weekend-weary eyeballs—but come on.

Science doesn’t always tell us what we want to hear. Truth demands nuanced consideration of evidence, and reason demands skepticism, neither of which flourish on social media—so if you shared or re-tweeted the Tor article, congratulations! This week, in the name of medievalism, you made the world stupider.