Archive for ‘teaching’


“Carving deep blue ripples in the tissues of your mind…”

I’m glad I went to college when I did; I get the sense that campuses have become less hospitable to eccentrics who seldom publish but thrive in the classroom. Perhaps the glut of job-seekers is to blame, or the dependence on adjuncts, or management priorities right out of the home office of Walmart. But I once knew a professor who hoped we would see that education could be bigger than all of that, and I was saddened to learn that he has, as Thomas Malory wrote of King Arthur, chaunged his lyf.

The right kind of student loved his classes. He urged us to rip our massive anthologies in half to make the world’s great literature that much more portable. He had us draw maps of mythical places, and he bombarded us with comic strips, song lyrics, modern poems, anything to convince us that knowing this stuff—and he did call it “this stuff”—let us form profound connections with our fellow humans, living and dead. When we read the Aeneid, he pumped us up by blasting Cream’s “Tales of Brave Ulysses” from a boombox and banging his head in psychedelic bliss—but then the frivolity ended as he passed around a tiny vellum manuscript in Greek and quietly asked us to consider both its fragility and its durability.

The last of the fanatic generalists, he taught ancient and medieval lit, the Bible, the Romantic poets, Shakespeare, and the Arthurian legend, but he had a special fondness for the Beats. He also loved Samuel Johnson, and I’m sure that when he went to London every few years, he roamed the alleys and streets with an 18th-century mental map. I don’t know if students see his like anymore: an outspoken liberal who defended the worth of the Western canon. He did so devoutly but without chauvinism: he also studied Japanese and joined his wife in an Indonesian gamelan ensemble.

In 1992 I was mulling over two improbable careers: cartoonist and medievalist. When I popped by his office to talk about graduate school, my prospects hung in the air for ages.

“If that’s really what you want to do, then of course I’ll write you a letter,” he said at last, “but I would be just as pleased to know that I helped to create a very literate cartoonist rather than another academic scrounging around wondering where the next pittance of grant money is going to come from.”

I was stunned to hear a professor suggest that campus life was anything other than a bower of bliss. I don’t know if he accurately perceived my eccentricities or was giving voice to his own disenchantment, but he was right to make me suspicious of the whole business. Decades later, I still make up my career as I go along. With no clear path to follow, life has been harder, and maybe I worry more, but I’ve also traveled more, written more, known more kinds of people, and stumbled more often onto unforeseen luck. More wide-eyed students should hear what I heard; it takes years to sink in.

That same year, I answered his call for a research assistant, only to be rebuffed. “It’s mindless work,” he grumbled, instead sending me home for the summer with an Arthurian tote bag: Malory in Middle English, Layamon’s Brut, and hundreds of pages of secondary sources ranging from credible archaeological studies to wackadoodle theories about the “real” King Arthur. Lacking any guidance or goal, I worked out my own mental outline of medieval Britain. I later built a ten-year teaching job on that.

When he organized a major conference on medievalism, he told me to check it out. The invitation itself was a compliment, but I was too callow to realize that such an event on my own campus was something I ought to attend. A few years later, he sent me the published proceedings, which started me thinking. I wander, I stall, but I do tend to get where I’m going. Did he know?

He could be frustrating. The forms I needed signed and the letters I needed written couldn’t compare to the brilliant conversations with Cavafy and Boswell that seemed always alive in his mind. More than once, he got into deep trouble with fussy little bureaucrats. I like to think he angered them by taking seriously the proposition that a university was a place to explore, to experiment, to gain perspective that makes you free in ways that the world can’t suppress.

We didn’t know each other well, but we shared stories about growing up in tight neighborhoods with large extended families. I hadn’t seen him in 24 years, but now and then a package would surprise me: boxes of books, a cache of poems, letters that rang with good cheer even in the face of failing health.

Good teachers leave you gifts long before you understand their value. Shortly before I graduated, he read my paper on an ephemeral modern author and congratulated me on work that was well-written and cohesive. Then he looked me in the eye and said, by way of farewell: “Study something lasting.” And so I have.


(Polaroid Land Camera photo of a grotesque near the University of Delaware campus)

“Ah, you are in your prime, you’ve come of age…”

“Outreach” is the kale of academia: everyone agrees it’s healthy, but they’re not always eager to make it a part of their lives. My hat is off, then, to Richard Utz, a scholar of medievalism at Georgia Tech, for his willingness to ride out to the market square and kick around big questions about the state of his field. A few weeks ago, the Chronicle of Higher Education published part of the plenary speech Utz delivered in May at the International Congress on Medieval Studies. I’ve been moving truckloads of books to a new home in the country, so this is my first chance to dig into the piece. Despite the stupid title the editors gave it—“Don’t Be Snobs, Medievalists”—it’s a worthy start, even if I found myself cuisse-deep in the questions it raises.

Utz writes:

It is clearly time to lower the drawbridge from the ivory tower and reconnect with the public.

One way to do this is to intervene aggressively in the media when the French National Front appropriates Jeanne d’Arc, New Hampshire legislators feel textually beholden to the Magna Carta, British politicians combat contemporary jihadism with a late medieval treason law, or Prince Philip is appointed to a knighthood of the Order of Australia, a title the illustrious heritage of which dates back to ye olde 1975.

What does it mean to “intervene aggressively”: stand on the drawbridge and denounce sinful readings of history? By what criteria? It’s not a question of accuracy: Utz links to stories about European nationalists, British Conservatives, American Republicans, and cranky Prince Philip (as if they’re all the same), but he later praises the Society for Creative Anachronism. The SCA and its members have done tremendous work in material culture, folklore, and martial arts, but as an organization whose mission is often informally characterized as creating “the Middle Ages as it should have been,” it also has a fantasy wish-fulfillment faction, and it edits out a vital force in medieval culture: religion. Is that not at least potentially a problem for academia? Does the group get a pass from the Medievalist Police because its members are nicer or generally more liberal? I don’t want (or trust the proponents of) a medievalism that seeks to justify every facet of liberalism any more than one that serves as a conservative catechism or nationalist blueprint.

Even so, Utz sees promise in meeting at least certain elements of the public on their own turf:

Add these efforts together, and we medievalists might extricate ourselves from the isolationist confines of 19th- and 20th-century medieval studies and embrace a broader and more egalitarian mélange of academic and popular medievalisms. If we join ranks with the so-called amateurs, we will ensure a continued critical as well as affective engagement with medieval culture. In the process, we might revivify our discipline and contribute to the health of the humanities.

I respect Utz’s aims, but I’m skeptical of his plan. In the past eight years, I’ve written more than 160 blog posts about medievalism, a few of which have gone, if not viral, at least naggingly bacterial, including one about a Charlemagne quote from an Indiana Jones movie that’s drawn tens of thousands of readers. I’ve written both a middle-school textbook and a moderately successful midlist pop-history book about Charlemagne. I’ve given talks about Charlemagne at libraries, museums, and book festivals. I’ve promoted a book of medievalist poetry inspired by a Gothic cathedral. I’ve translated a Middle Scots romance and published shorter translations here on the blog and in scholarly and literary journals. I’ve even dabbled in applied paleobromatology and shared my clunky efforts at retro, medieval-themed instant photography. I did these things not to advance an academic career but because the Middle Ages provided a rich matière for the creative work that occupies my spare time—but if I had done these things as a scholar engaged in public outreach, or if academia had paid more attention to me, would it matter?

Utz writes as if the scholarly world is not just doomed, but scarcely deserving of survival:

The Society for Creative Anachronism has added more to our knowledge of medieval culture by practicing blacksmithing, re-enacting the Battle of Hastings, and performing historical dance than D.W. Robertson’s decision, albeit substantiated by learned footnotes, that all medieval art was created and needs to be read according to the principles of patristic exegesis. Similarly, Michel Guyot’s megaproject of rebuilding a medieval castle, Guédelon, from scratch over a 30-year period, based on 13th-century building plans and without modern technology, yields infinitely more information than another 50 essays obsessing about the authorship of the anonymous Nibelungenlied or Cantar de Mio Cid. Moreover, sites like medievalists.net and publicmedievalist.com communicate valuable information more effectively to academic and nonacademic audiences than dozens of academic journals accessible at subscribers-only sources like JSTOR or Project Muse.

Scholars have indeed failed to bushwhack through old-growth clichés to reach the public; the late Norman Cantor identified the problem more than 20 years ago. But Utz points out an important and underappreciated supply-and-demand clash:

[T]here is now a manifest discrepancy between the large number of students who request that we address their love of Harry Potter, Lord of the Rings, Game of Thrones, and medieval-themed video and computer games on the one hand, and the decreasing number of medievalists hired to replace retiring colleagues on the other.

When I was an adjunct, the director of the English department started me off with one medieval lit course and laughed at my hope that there’d ever be more. In the decade that followed, student demand let me revive the other three medieval courses in the catalog. Now that I’m outside Utz’s drawbridge, I wonder if there shouldn’t be less talk about impressing the public and more effort to win over university bureaucrats, especially lapsed humanities scholars who act like they’re managing a Walmart distribution hub.

I also wish Utz had clarified what he means when he says that students “request that we address their love” of the popular media of the moment. Do they want professors to pontificate about their favorite TV shows? That strikes me as a disheartening waste of brainpower and money—but my hope is that they want something more. Speaking as a kid whose medieval interests were partly rooted in childhood enthusiasm for fantasy games, I’d urge Utz and his colleagues to promise wonderful new realms to their students: history that illuminates human nature, the keys to unlocking eldritch languages, artistic and theological glimpses into the medieval mind—uncool things that endure deep within us long after entertainment companies neglect their latest love-child.

Utz alludes only briefly to “the health of the humanities.” I wish these discussions weren’t always so polar, with academia on one end and TV and video games on the other. What about other eclectic, unaffiliated souls? I’ve met or discovered the work of several such people: Lex Fajardo, author of Kid Beowulf, a series of all-ages graphic novels inspired by his love of world epics; Nancy Marie Brown, the admirably prolific author of mass-market books about the Lewis Chessmen, the Vikings, the Eddas, and Pope Sylvester II; remarkable medieval-inspired poets like Maryann Corbett and Becky Gould Gibson; and novelists like Tod Wodicka. I wonder: What would they do if they came to a scholarly conference? Would it still be a scholarly conference? Would scholars support them right back? Just as those retiring medievalists aren’t being replaced, writers and artists are watching their audiences fragment and shrink. The larger culture doesn’t care, but those of us who have never felt entirely at home on either side of the drawbridge would welcome new allies in seeking the true and the real. Sometimes it’s nice not to lurk in the moat.

“But Lorca’s corpse, as he had prophesied, just walked away…”

How did art become irrelevant? Michael J. Lewis’s answer to that question in Commentary magazine made the rounds of social media last week. It’s an exhaustive overview, and a political one, edged with fine anger, a reminder that the arts used to be merely elitist, not ruthlessly hermetic.

So I was startled to open the latest issue of the literary magazine The Dark Horse and find “Poetry as Enchantment,” an essay by former NEA chairman Dana Gioia that makes many of the same points Lewis makes, but solely about poetry, and with a far more subdued tone. Defending poetry as a universal human art with roots in music, charms, and incantations, Gioia recalls that not long ago, it was ubiquitous and widely enjoyed. I remember that too: my grandfather was a machinist with a grade-school education, but he could rattle off snippets of verse that I now know were the work of Longfellow, Joyce Kilmer, and the (utterly forgotten) Sam Walter Foss.

What happened? Gioia argues that poetry was too well taught. The New Critics imposed reason, objectivity, and coherence on it. “Learning” poetry was reduced to dissection and analysis, and then to demonstrating fluency in each new school of critical theory:

For most students, writing a critical paper does not inspire the same lifelong affection for poetry that memorization and recitation foster. When analytical instruction replaces the physicality, subjectivity, and emotionality of performance, most students fail to make a meaningful connection with poetry. So abstracted and intellectualized, poetry becomes disembodied into poetics—a noble subject but never a popular one. As the audience for poetry continues to contract, there will come a tipping point—perhaps it has already arrived—when the majority of adult readers are academic professionals or graduate students training for those professions. What is the future of an art when the majority of its audience must be paid to participate?

No one intended the decimation of poetry’s audience or the alienation of the common reader. Like most environmental messes, those things happened as accidental by-products of an otherwise positive project.

Gioia literally marketed Kool-Aid as an executive at General Foods—who can ever forget his award-winning “fear in a handful of dust” campaign?—so when he became NEA chairman in 2003, he wanted measurable results:

We decided to start with a program that could be executed quickly on a large scale without a huge investment. What we conceived was a national poetry recitation contest for high school students that would begin at class level, and then move on to school, local, state, and national competitions. We successfully tested the idea in Chicago and Washington, D.C., but when the agency tried to expand it, the arts education officials in the 50 states initially refused to adopt it.

The state arts education experts had four major objections to the program. First, they believed that students hated poetry. (Am I wrong to suspect that this assumption suggests that the experts themselves disliked poetry?) Second, they maintained that memorization was repressive and stifled creativity. Some of them added that memorization victimized minority students since standard English was not spoken in their homes. Third, they unanimously felt that competition had no place in the arts. There should be no winners or losers. In arts education, everyone should win. Finally, there was a general feeling among the educators that poetry was too intellectual for the average student. It was not an accessible art.

Just how wrong were those “state arts education experts”? Gioia found that kids raised with hip-hop took to poetry when it became about hearing and reciting rather than reading and analyzing; they loved competing; “problem kids” turned out to be great at it; and immigrant kids have turned out to be around half of all winners each year.

Gioia is too gracious to gloat. About his detractors, he says only this: “The administrators and arts consultants were openly astonished by the program’s popularity.” I wonder why they doubted him: His longtime championing of old-fashioned formalism? His corporate background? His presumed political affiliation? It’s a dreary state of affairs when ignorance is the most charitable explanation.

Someone recently quipped on Facebook that it’s actually a great time to be a poet, because when an art has zero social cachet, the people who do it out of sheer love don’t have to wonder if others are into it for the wrong reasons. That may not be forever true: Gioia’s Poetry Out Loud program has already engaged 2.5 million high-school kids, and books like the Disney Channel poetry anthology are bracing their younger siblings. What if channeling rap fandom into national recitation contests actually entices the corpse of poetry to sprout and bloom some year? Relevance would uproot academia; it wouldn’t be kind; it would set poets slogging through swamps of conflict and commerce, not noticing how many more people had finally learned that they’re meant to talk about what Gioia calls “mysteries that lie beyond paraphrase,” their inheritance as human beings.

“Yeah, proof is the bottom line for everyone…”

In 1994, Norman Cantor was gearing up for his fourth year of besiegement after the release of Inventing the Middle Ages, a mass-market book in which he sought to show how the formative experiences of certain twentieth-century medievalists explained the ways they interpreted history. Fellow historians didn’t like his blunt biographical approach—and so in “Medievalism and the Middle Ages,” a lively but little-read article in The Year’s Work in Medievalism, Cantor hammered back at “establishment dust-grinders” by holding up the movie Robin Hood: Prince of Thieves as “a highly significant core defeat” the academy hadn’t even known it had suffered:

It shows how little the academic medievalists have made an impact on popular culture and its view of the medieval world. Costner’s Robin Hood signifies social failure for the Ivy League, Oxbridge, and the Medieval Academy of America. But I expect the august personalities in those exalted precincts never gave a moment’s thought to this connection.

I recalled Cantor’s smart, spirited (and, in retrospect, debatable) rant when I read last week’s Chronicle of Higher Education piece by Paul Dicken, a philosopher of science who’s keen to write for popular audiences despite the sneering of colleagues and peers:

Yet as I struggle on with my apparently misguided endeavors, I sometimes think that maybe the search committee had a point. It is difficult pitching academic material in a way that is suitable for a popular audience. I don’t pretend to be an unparalleled communicator of ideas, nor do I kid myself about my ability to produce pithy and engaging prose. After many years of writing for peer review, I have developed a nasty habit of overusing the passive voice — not to mention the usual reliance upon jargon, excessive footnotes, and the death by a thousand qualifications that undermines any attempt to state a clear, precise thesis. It is definitely a learning process. But no matter how dull the final product, I was at least confident that I could express my ideas clearly. That’s what we’re trained for, right?

I’ve known plenty of scholars who write lucid books and blogs; I doubt the academy nurtured the requisite skills.

When I decided to start writing in earnest, I drove wildly around England and Wales collecting material for travel stories. The Washington Post published two of them, but only after an editor nudged me with notes like this one from 1999:

I don’t think this lede works; it’s too slow and diffuse for our reader—imagine a bagel-eating Sunday morning householder, an occasional traveler seeking a weekly fix of travel infotainment—but surrounded by a pile of other sections tugging at his time, and household things about to start tugging too…this is different from someone who settles in for a long night with a New Yorker and a hot toddy.

A good editor knows how to improve and refine our writing without shearing off all of the frills and frippery we vainly adore. Thanks to that guy and a couple of others like him, I sloughed off three-and-a-half years of bad grad-school style and (eventually, arguably) learned how to write. Paul Dicken, stick to your plan: keeping readers engrossed in weighty matters without overusing the passive voice or condemning them to “death by a thousand qualifications” doesn’t require “an unparalleled communicator of ideas.” Just know your audience, then decide that what you’re doing is, among other things, art.

* * *

We’re overdue for great shifts in our obsolete cultural coalitions; the creaking we hear as they seize up and fail is also the venting of truths. In another Chronicle of Higher Education piece last week, philosopher and science historian Lee McIntyre decries the recent “attack on truth” that he believes has us ambling into “an age of willful ignorance”:

It is sad that the modern attack on truth started in the academy — in the humanities, where the stakes may have initially seemed low in holding that there are multiple ways to read a text or that one cannot understand a book without taking account of the political beliefs of its author.

That disrespect, however, has metastasized into outrageous claims about the natural sciences.

Anyone who has been paying attention to the fault lines of academic debate for the past 20 years already knows that the “science wars” were fought by natural scientists (and their defenders in the philosophy of science) on the one side and literary critics and cultural-studies folks on the other. The latter argued that even in the natural realm, truth is relative, and there is no such thing as objectivity. The skirmishes blew up in the well-known “Sokal affair” in 1996, in which a prominent physicist created a scientifically absurd postmodernist paper and was able to get it published in a leading cultural-studies journal. The ridicule that followed may have seemed to settle the matter once and for all.

But then a funny thing happened: While many natural scientists declared the battle won and headed back to their labs, some left-wing postmodernist criticisms of truth began to be picked up by right-wing ideologues who were looking for respectable cover for their denial of climate change, evolution, and other scientifically accepted conclusions. Alan Sokal said he had hoped to shake up academic progressives, but suddenly one found hard-right conservatives sounding like Continental intellectuals. And that caused discombobulation on the left.

“Was I wrong to participate in the invention of this field known as science studies?,” Bruno Latour, one of the founders of the field that contextualizes science, famously asked. “Is it enough to say that we did not really mean what we said? Why does it burn my tongue to say that global warming is a fact whether you like it or not? Why can’t I simply say that the argument is closed for good?”

“But now the climate-change deniers and the young-Earth creationists are coming after the natural scientists,” the literary critic Michael Bérubé noted, “… and they’re using some of the very arguments developed by an academic left that thought it was speaking only to people of like mind.”

Having noticed, as Norman Cantor did, how rare it is for new discoveries about the Middle Ages to prosper off-campus unless they’re being exploited for linkbait, I was startled by this whole line of thought. I’ll have to read McIntyre’s book to see if it’s true that postmodernist humanities scholars influenced “hard-right conservatives” or “climate-change deniers and the young-Earth creationists.” I doubt it, although I suspect that the latter have at least heckled the former into living up to the credos implied by their critical approaches. But what a remarkable admission: that a fair amount of recent work in the humanities is baloney that was never meant to be consumed, sold, or even sniffed by outsiders.

Humanities theorists have insisted for years that when we set our work loose, it’s no longer our own. They’ll find in the end that intentions still matter: there’s more pleasure and solace in writing and art when you believe what you’re doing is true.

“Unsheathe the blade within the voice…”

Is polysemy now unseemly? Two weeks ago, when historian Steve Muhlberger traveled to that great North American ent-moot, the International Congress on Medieval Studies, he found himself in the midst of “a lot of griping and grouching about the misuse and ambiguity of the word medieval.” In a lucid and laudably concise blog post, he calls out the problem behind the problem:

You would think that a bunch of scholars who by their very nature of their discipline are experts in the evolution of the meaning of words would by now have gotten over the fact that though it doesn’t make a lot of sense to call “the Middle Ages” by that term, and that coming up with a really good, chronological definition of those ages is impossible, we are stuck with the words medieval and Middle Ages anyway. But no . . .

Steve is a scholar of chivalric tournaments and an experienced combat reenactor, so he knows how to land a disarming blow:

This can be intensely irritating for people who know that certain phrases and analyses lost their cogency back in 1927 and want to talk about what their friends are doing in the field now. Nevertheless people whose business is words should really accept the fact that words like “medieval” have a number of popular meanings, and when one of them shows up in current discussion (when, for instance, a Game of Thrones shows up and is widely labelled as medieval, even though the world of Game of Thrones is not our earth at all), the fact can be dealt with a good-humored way. It certainly would reflect credit on any field where a good-humored approach was the norm.

It would indeed. Off campus, the world blissfully resists more than a century of scholarship—pop culture still depicts Vikings in huge horned helmets, for heaven’s sake—and I respectfully suggest that more scholars contemplate why this is so.

As the rare soul who’s read every volume of Studies in Medievalism, I’ve marveled at the field’s mania for nomenclature. Since at least 2009, contributors to the journal—and its sister publication The Year’s Work in Medievalism, and its annual conference, and a pricey new handbook of critical terms—have kicked around the meaning of “medievalism” and “neo-medievalism” until every syllable simpers for mercy. Because I write about medievalism not as a professional scholar but as a footloose amateur, I miss the many years of meaty articles explaining, say, how boys’ chivalric clubs helped inspire the American scouting movement or why we’re perpetually tempted to make Dante a mouthpiece for generational angst. Forged from an accidental alloy of romanticism, nostalgia, politics, religion, and wishful thinking, medievalism can’t help but have jagged edges. It’s tiring to hone terms of art so finely that they cease to exist in three dimensions; we may as well flaunt the imperfection.

When it comes to the matter of the merely medieval, here’s Steve Muhlberger again:

David Parry made the most sensible remark of the entire week when he pointed out that an imprecise word like medieval has a lot of cultural value for people who make their living interpreting that era. Indeed there is a financial payoff being associated with it.

What’s the worth of a timeworn coinage? Steve’s full blog post answers that question, with the suggestion that settling on terms can pay other, less measurable dividends too.

“Well, it seemed to be a song for you…”

Two years ago, I was half-watching the Disney Channel with my nephew and niece when a commercial startled me—not because a fleeting tween sensation had finally done something funny, but because I couldn’t believe they were airing a two-minute promo for poetry. Backstage at a children’s poetry slam, Caroline Kennedy was chatting about her new Disney-backed anthology, Poems to Learn by Heart, without naming a single poem in the book. Naturally, I wondered: What sort of anthology do we get from a network that exalts dancing and singing above all other human endeavors?

As it turns out, a pretty conservative one. Poems to Learn by Heart isn’t the slam-tastic book the commercial makes it out to be; instead, it’s full of traditional, anthology-friendly names: Shakespeare, Byron, Elizabeth Barrett Browning, Stephen Crane, Wallace Stevens, Langston Hughes, Rita Dove, Richard Wilbur—around a hundred poets in all. Adults who want poetry to be “edgy” will find the selection cautious—the wildest poet here is Amiri Baraka, whose “Ballad of the Morning Streets” won’t shock grandma—but Kennedy has less seasoned readers in mind. To her credit, she knows that while most English majors have read poems like “We Real Cool” by Gwendolyn Brooks, most American children (and their parents) have not. She also gets that this book’s 183 pages contain more poetry than most kids will encounter in twelve years of school, so it’s a rare chance to show them what the English language has to offer, from Lewis Carroll to Nikki Giovanni.

Even though Kennedy arranges her selections by subject (“the self,” “family,” “friendship and love,” “faeries, ogres, witches,” “nonsense poems,” “school,” “sports and games,” “war,” and “nature”), Poems to Learn by Heart doesn’t feel guided by a clear editorial point of view. Of course, that’s an adult concern; young readers who don’t yet know their own tastes may enjoy discovering Ovid, Countee Cullen, and Robert Louis Stevenson alongside a Navajo prayer, the Gettysburg Address, the St. Crispin’s Day speech from Henry V, selections from the First Letter of Paul to the Corinthians, and Martin Niemöller’s “First they came for the Socialists” speech. I appreciate breadth, and even the inclusion of lyrical prose, but is it here to foster inclusiveness, or to deflect criticism? One could easily use the table of contents to reconstruct the minutes of Disney’s fretful editorial meetings: Something for the religious? Cultural-literacy conservatives? Social-justice liberals? Native Americans? Check, check, check, and check.

Despite these thoughtful, wide-ranging selections, this book doesn’t always fulfill the promise of its title. Kennedy may be gung-ho for memorization, but I didn’t always see the mnemonic value of her selections: Is “Peace” the one Gerard Manley Hopkins poem to remember? Why learn Shakespeare’s sonnet 94 instead of one of the others? Kennedy asked a six-member poetry slam team at a Bronx high school to help pick these poems, and she devoted four pages to their own passionate free-verse poem about racism, consumerism, child abuse, and mass media. While I hope the publication credit gave their lives a hearty boost, I do wonder, perhaps heartlessly, if their work belongs here. For whom other than the teens who wrote and performed it is it a “poem to learn by heart”?

I was also baffled by the selections in a final “extra credit” section: “Young Lochinvar” by Walter Scott, “Paul Revere’s Ride” by Longfellow, “Kubla Khan” by Coleridge, Robert Service’s crowd-pleasing “The Cremation of Sam McGee,” and the first 18 lines of the General Prologue of The Canterbury Tales. “Mostly they are old chestnuts that have fallen out of favor,” the scion of a privileged political dynasty warns us, lest she come off as a square, “but the feats of memory required to master them will impress even the most modern audiences.” Why can’t the editor of a poetry anthology write as if she actually believes that old things have value beyond their potential for self-exploration and showing off? (And who the heck drops Chaucer on kids without a pronunciation guide?)

That final section highlights this book’s major flaw: a lack of wild, wham-bang narrative. Jon J. Muth’s illustrations are beautiful, but his cover captures the overall mood: gentle, contemplative, dreamy. That’s fine for some kids, but what about action for the more rambunctious? It’s not my style to call for a book to be less intellectual (or for things Disney to be less introspective), but cripes, what about a good, gory chunk of Beowulf or Homer, or an Asian or African epic? Where are the pirates, cavemen, and ghouls of Robert E. Howard? Except in passing in its introduction, Poems to Learn by Heart forgets to teach kids that some of humanity’s best stories are told in verse—and that people proudly carry them around in their heads.

I hate to be hard on this book. For many kids, it will be their only introduction to poetry, and some, I hope, will adore it. Decades from now, if those readers fondly remember this book as adults, the Disney Channel will deserve praise for marshaling its legions of wolf-mounted marketing goblins in support of something more sophisticated than terrible sitcoms—nothing less than Octavio Paz, Seamus Heaney, Paul Laurence Dunbar, Elizabeth Bishop, and Ovid. Only then will I know if Poems to Learn by Heart has served children well or if it’s the century’s first great, unread gift book, a smart, well-intentioned effort to elevate young readers that’s (maybe) too pensive, too mousey, too nice.

“The story is old, I know, but it goes on…”

With its mix of sunshine and harmless bluster, September brings back-to-school nostalgia—ivy-covered professors, that first fall riot, scoldings for being insufficiently euphoric over sports—and perhaps that’s why the past two weeks have swirled with stories about the woes of humanities types in academia. For 20 years I’ve watched would-be scholars expire en route to the ferne hawle of full professorhood, so I’m guessing that many grad students and adjuncts have newly discerned, with the sort of creeping, pitiless dread otherwise confined to Robert E. Howard stories, that they won’t find long-term employment.

First, at the Atlantic, Jordan Weissmann asked why the number of grad students in the humanities is growing. Then, Slate ran a piece about the awkwardness that still hangs about people with doctorates in the humanities who land “alt-ac” careers—that is, jobs where they don’t teach college. Apparently, though, there aren’t enough such lucky people, because a few days later, Salon covered adjunct professors on food stamps.

With all the attention this subject now gets in the press, I can only hope that fewer souls will fling themselves into the hellmouth—but maybe academia shouldn’t have undone quite so many in the first place. While reading about medievalism in recent days, I found two historians who sensed where things were headed long ago.

The first was Karl F. Morrison, who wrote “Fragmentation and Unity in ‘American Medievalism,’” a chapter in The Past Before Us, a 1980 report commissioned by the American Historical Association to explain the work of American historians to their colleagues in other countries. Morrison writes candidly about his field, but he also makes an especially prescient extrapolation, which comes in the final paragraph of the excerpt below:

There was also an expectation in the “guild” that investment in professional training would, in due course, fetch a return in professional opportunity.

By 1970, these benefits could no longer be taken for granted. By 1974, even the president of Harvard University was constrained to deliver a budget of marked austerity, reducing “the number of Assistant Professors substantially while cutting the size of the graduate student body below the minimum desirable levels.” The aggregate result of many such budgets across the country was a sharp reduction in the number of professional openings for medievalists, and an impairment of library acquisitions and other facilities in aid of research. Awareness of this changed climate impelled a large number of advanced students to complete their doctoral dissertations quickly, producing a bulge that is noticeable around 1972-1974 in our tables. For many reasons, including the deliberate reduction or suspension of programs in some universities, it also resulted in a decline in the number of graduate students proceeding to the doctorate.

In effect, the historians who became qualified during this period without being able to secure professional employment constitute a generation of scholars that may be in the process of being lost, casualties of abrupt transition. There is no reason to expect that the demographic and economic trends that so sharply reversed their professional expectations will alter before the end of the century, and this projection raises certain quite obvious possibilities regarding the diversity and renewal of the profession.

Fast forward to 1994. Norman Cantor was gearing up for his fourth year of professional besiegement after the release of Inventing the Middle Ages, a book for non-academic readers in which he sought to show how the formative experiences of certain 20th-century medievalists explained the ways they interpreted history. Fellow historians didn’t like his blunt biographical approach—and so in “Medievalism and the Middle Ages,” a little-read article in The Year’s Work in Medievalism, Cantor hammered back at “establishment dust-grinders” and noted, in passing, the crummy academic job market and the prevalence of certain “alt-ac” career paths even then:

Within academia a fearful conservative conformity prevails. The marginal employment situation has a twofold negative impact. First, it discourages innovative minds and rebellious personalities from entering doctoral programs in the humanities. People in their late twenties and thirties today with the highest potential to be great medievalists and bridge academic medieval studies and popular medievalism are a phantom army, a lost generation. Instead, for the most part, of climbing the ladder at leading universities they are pursuing careers (often regretfully and unhappily if well-paid) in major law firms.

Second, even if imaginative people take Ph.D.’s in medieval disciplines, they face the job market and particularly once they get a prized tenure track post they encounter a chilling intellectual conservatism that frustrates expressions of their best thoughts and deepest feelings.

I like Cantor’s claim that academia is literally conservative. After all, people are still fretting over problems that he and Morrison noticed decades ago. It’s September 2014, yet Rebecca Schuman at Slate can still write: “The academic job market works on a fixed cycle, and according to a set of conventions so rigid that you’d think these people were applying for top-secret security clearances, not to teach Physics 101 to some pimply bros in Sheboygan.”

The early blogosphere was rife with humanities grad students and adjuncts wavering between disgruntlement and despair; the much-praised Invisible Adjunct rose up to unite them in discussions so civil that I can scarcely believe I saw them on the Internet.

As someone who writes about people who use the imagined past to carve out identities, argue from authority, resist mainstream culture, or seek respite from the real world, I think I understand why the number of new students in arts and humanities doctoral programs grew by 7.7 percent in 2012, but I can’t claim a moment’s nostalgia for the geeky excitement they surely must feel. Morrison and Cantor both imagined a lost generation, but their jobless contemporaries were merely wandering. For this next generation, that luxury is long gone—as is the prospect of claiming that nobody warned them.

“As we get older, and stop making sense…”

English teachers make great idols. Rich kids who can’t pursue their dreams should kill themselves. Such are the awful lessons of Dead Poets Society, a movie I love to hate—not only because real-life English teachers are dubious exemplars, but also because the movie takes too much glee in damning “Dr. J. Evans Pritchard, Ph.D,” the textbook author who supposedly reduces the evaluation of poems to a simple trick of geometry. Not even my worst English teachers would have endorsed the idea, so I assumed such a book didn’t and couldn’t exist—until I discovered the real Dr. Pritchard and found that he’s hardly as bad as he seems.

When the Dead Poets Society teacher, played by Robin Williams, asks a student to read aloud from a textbook by “Dr. J. Evans Pritchard, Ph.D,” this is what we hear:

 To fully understand poetry, we must first be fluent with its meter, rhyme, and figures of speech, then ask two questions: (1) How artfully have the objectives of the poem been rendered; and (2) how important is that objective? Question one rates the poem’s perfection; question two rates its importance; and once these questions have been answered, determining the poem’s greatness becomes a relatively simple matter. If the poem’s score for perfection is plotted on the horizontal of the graph and its importance is plotted on the vertical, then calculating the total area of the poem yields the measure of its greatness. A sonnet by Byron might score high on the vertical but only average on the horizontal. A Shakespearean sonnet, on the other hand, would score high both horizontally and vertically, yielding a massive total area, thereby revealing the poem to be truly great.

As you proceed through the poetry in this book, practice this rating method. As your ability to evaluate poems in this manner grows, so will your enjoyment and understanding of poetry.

In 1956, Southern Methodist University lit professor Laurence Perrine published the first edition of Sound and Sense: An Introduction to Poetry, which he apparently developed for use in his own classroom. Flip through the book, and there it is, in similar wording, the notion that anthropomorphized a thousand bales of straw:

In judging a poem, as in judging any work of art, we need to ask three basic questions: (1) What is its central purpose? (2) How fully has this purpose been accomplished? (3) How important is this purpose? The first question we need to answer in order to understand the poem. The last two questions are those by which we evaluate it. The first of these measures the poem on a scale of perfection. The second measures it on a scale of significance. And, just as the area of a rectangle is determined by multiplying its measurements on two scales, breadth and height, so the greatness of a poem is determined by multiplying its measurements on two scales, perfection and significance. If the poem measures well on the first of these scales, we call it a good poem, at least of its kind. If it measures well on both scales, we call it a great poem.

Boo! Hiss! Down twinkles! Blockin’ out the scenery, breakin’ my mind!

Dead Poets Society imagines this infamous passage occurring on “page 21 of the introduction,” but you won’t find it in Perrine’s introduction. Sound and Sense doesn’t have an introduction; Perrine’s shaky effort to quantify taste occurs way in the back of the book—on page 198, in the penultimate chapter, which focuses on learning to spot obviously bad poetry. In real life, Perrine chases this passage with a near-retraction:

The measurement of a poem is a much more complex process, of course, than is the measurement of a rectangle. It cannot be done as exactly. Agreement on the measurements will never be complete. Yet over a period of time, the judgments of qualified readers tend to coalesce: there comes to be more agreement than disagreement . . .

[…]

For answering the first of our evaluative questions, How fully has the poem’s purpose been accomplished? there are no easy yardsticks we can apply. We cannot ask, Is the poem melodious? Does it have smooth meter? Does it use good grammar? Does it contain figures of speech? Are the rimes perfect? Excellent poems exist without any of these attributes. We can judge any element in a poem only as it contributes or fails to contribute to the achievement of the central purpose; and we can judge the total poem only as these elements work together to form an integrated whole. But we can at least attempt a few generalizations.

Of course, all this comes not on the first day of school, but near the end of the course, after an absolute beginner has learned about figurative language, imagery, allusion, tone, rhythm, meter, sound, and pattern—subjects I daresay many English majors can’t discuss competently now.

Still, Perrine/Pritchard is a bit dry, isn’t he? Hasn’t his soul been smothered by tweed? Aren’t his whimsies constrained by the iron cage of reason?

Here’s what “Pritchard,” in his real first chapter, actually says.

Poetry is spiritually vital:

Poetry in all ages has been regarded as important, not simply as one of several alternate forms of amusement, as one man might choose bowling, another chess, and another poetry. Rather, it has been regarded as something central to each man’s existence, something having unique value to the fully realized life, something which he is better off having and which he is spiritually impoverished without.

Poetry lets us live deeply:

Indeed, the two approaches to experience—the scientific and the literary—may be said to complement each other. And it may be contended that the kind of understanding one gets from the second is at least as valuable as the kind he gets from the first.

Literature, then, exists to communicate significant experience—significant because concentrated and organized. Its function is not to tell us about experience, but to allow us imaginatively to participate in it. It is a means of allowing us, through the imagination, to live more fully, more deeply, more richly, and with greater awareness.

Poetry helps us live triumphantly:

We find some value in all intense living. To be intensely alive is the opposite of being dead. To be dull, to be bored, to be imperceptive is in one sense to be dead. Poetry comes to us bringing life, and therefore pleasure. Moreover, art focuses and so organizes experience as to give us a better understanding of it. And to understand life is partly to be master of it.

Poetry is rich:

If it is to communicate experience, it must be directed at the whole man, not just at his understanding. It must involve not only his intelligence but also his senses, his emotions, and his imagination. Poetry, to the intellectual dimension, adds a sensuous dimension, an emotional dimension, and an imaginative dimension.

Poetry can’t be quantified:

You may have been taught to believe that poetry can be recognized by the arrangement of its lines on the page or by its use of rime and meter. Such superficial tests are almost worthless. The Book of Job in the Bible and Melville’s Moby Dick are highly poetical, but a versified theorem in physics is not. The difference between poetry and other literature is one only of degree. Poetry is the most condensed and concentrated form of literature, saying most in the fewest number of words. It is language whose individual lines, either because of their own brilliance or because they focus so powerfully on what has gone before, have a higher voltage than most language has. It is language which grows frequently incandescent, giving off both light and heat.

And that’s just the first chapter! Despite that one iniquitous passage at the end of Sound and Sense, Perrine spends much of the book arguing against the quantification of poetry. At one point, he contrasts the chemical equation for sulfurous acid with the limitless connotations of the word “sulfurous” in a poem. “The poet, we may say, plays on a many-stringed instrument,” he writes. “And he sounds more than one note at a time.” If Perrine often mentions science and psychology, particularly in the chapter on imagery, he does so because he assumes his students already speak those languages. He’s not diminishing poetry; he’s offering novices a way in.

Perrine frequently sounds just as you’d imagine someone who got a Ph.D from Yale in 1948 ought to sound, but I find his old-fashionedness refreshing. “The difference between your figures of speech and the poet’s is that yours are worn and trite, his fresh and original,” he tells his readers, making clear that he’s not some fretful “facilitator,” but the expert in the room. A man of his times, he urges the cultivation of taste through study, scrutiny, and thought—and in his own genteel way, he advocates zeal:

Undoubtedly, so far in this chapter, we have spoken too categorically, have made our distinctions too sharp and definite. All poetic excellence is a matter of degree . . . But a primary distinction between the educated man and the ignorant man is the ability to make value judgments.

A final caution to students. In making judgments on literature, always be honest. Do not pretend to like what you really do not like. Do not be afraid to admit a liking for what you do like. A genuine enthusiasm for the second-rate is much better than false enthusiasm or no enthusiasm at all. Be neither hasty nor timorous in making your judgements. . . . Honesty, courage, and humility are the necessary moral foundations for all genuine literary judgment.

Yes, Perrine can be stuffy. The 1956 debut edition of Sound and Sense contains more than 200 poems, but there aren’t many by women, and as far as I can tell, only one is the work of a non-white poet, Countee Cullen. (Not even Paul Laurence Dunbar? Oh, professor.) Perrine’s idea of a wild, loosen-the-spats, extra-credit challenge? “The Love Song of J. Alfred Prufrock.”

Still, the poets in Sound and Sense would make fine desert-island companions—Noyes, Tennyson, Millay, E.A. Robinson, Wilfred Owen, Emily Dickinson, Richard Wilbur, Robert Frost, Carl Sandburg, even James Joyce—and anyway, isn’t academia usually behind the times? My professors in the 1980s and 1990s taught the poets of the 1950s and 1960s as if they were the consummation of poetry itself. They didn’t clue us in to the New Formalism occurring off-campus. Perhaps they weren’t aware of it.

(Sound and Sense is still in print in an overpriced 14th edition. Two editors have updated and broadened the selection of poems—but amazingly, as recently as the 13th edition, the first half of the paragraph that bred “Dr. J. Evans Pritchard, Ph.D” was still stinking up chapter 15! At what point in the past 50 years did the rest of that passage get pruned: before or after Dead Poets Society came out in 1989?)

It’s a shame Perrine has been vilified in fiction, because in chapter 15, the chapter with the dreaded Dead Poets Society passage, he begs students to think for themselves, with no histrionic page-ripping or standing on desks. He flings out examples of trite popular verse so students won’t be suckered by sentimentality, rhetoric, didacticism, and cheap appeals to emotion, patriotism, and religion. A truly excellent poem, he says, will be complex and fresh; it “will not be merely imitative of previous literature, nor appeal to stock, pre-established ways of thinking and feeling which in some readers are automatically stimulated by words like mother, baby, home, country, faith, or God, as a coin put into a slot always gets an expected reaction.” He would have liked Roger Ebert’s dismissal of Dead Poets Society as “a collection of pious platitudes masquerading as a courageous stand,” a movie that “pays lip service to qualities and values that, on the evidence of the screenplay itself, it is cheerfully willing to abandon.”

Four years ago, I sat across a conference table from an assistant dean with a Ph.D in the humanities who, with no evident trace of self-loathing, asked me to write bullet points summarizing the “workplace relevance” of medieval literature. (That day I confirmed that the soul really does exist, because I felt mine howling to leave my body.) More recently, I rolled my eyes at the news that a “professor emeritus and former chair of the department of recreation and leisure studies at Southern Connecticut State University” has developed the “Collegiality Assessment Matrix and Self-Assessment Matrix,” which are “designed to clearly assess the level of collegiality of a faculty member.”

This sort of dehumanizing Taylorism thrives in education these days, but you won’t find an endorsement of it even in the final subdued paragraph of Sound and Sense:

Yet, after all, we have provided no easy yardsticks or rule-of-thumb measures for literary judgment. There are no mechanical tests. The final measuring rod can only be the responsiveness, the maturity, the taste and discernment of the cultivated reader. Such taste and discernment are partly a native endowment, partly the product of maturity and experience, partly the achievement of conscious study, training, and intellectual effort. They cannot be achieved suddenly or quickly; they can never be achieved in perfection. The pull is a long pull and a hard pull. But success, even relative success, brings enormous rewards in enrichment and command of life.

Perrine’s conclusion is tepid, but his purpose is profound: He wants you to use poetry to think harder, live better, and feel more deeply. There’s more depth, pleasure, and (every committee’s Questing Beast) “critical thinking” in the stuffiest chapters of Sound and Sense than you’ll find in the latest platitude-sodden government report on the humanities. If you live to defend the value of literature, history, and the arts, turn your Dead Poets Society DVD into a drink coaster and take heart. In real life, “Dr. J. Evans Pritchard” is an ally after all.

“You’ve been in the pipeline, filling in time…”

No medievalism this week. Just some links and comments about the humanities, all of them hanging by a common thread.

* * *

From Do Androids Dream of Electric Sheep? by Philip K. Dick, 1968:

“You androids,” Rick said, “don’t exactly cover for each other in times of stress.”

Garland snapped. “I think you’re right: it would seem we lack a specific talent you humans possess. I believe it’s called empathy.”

* * *

From a Chronicle of Higher Education story about Google Glass:

[Assistant professor of journalism and communication] Mr. Littau said he hoped to see further application of Glass in the classroom, although he could not say for certain what else it could be used for.

“It’s a device made for the liberal arts,” he said. “The whole device is about putting you in the shoes of the wearer to experience the world through their eyes. An auto-ethnography in history could be an interesting thing to experience.”

Only in a visually obsessed age would we believe that literally seeing someone else’s point of view qualifies as an experience. If that’s true, We Are All Cops Cameramen Now.

What’s it like to view a work of art through filters other than your own? How does someone with a trained ear experience classical music? How does someone feel, from his forehead to his gut, when his daughter is born, his candidate loses an election, or his childhood home is torn down? God help liberal-arts faculty who need Google Glass to develop empathy. To make that imaginative leap, just find time for reading and thinking—which are analog, and not recent inventions.

* * *

Here’s a more delightful melding of tech and the humanities: Last week, I found a pocket universe of clever people composing poetry in programming languages.

Experiments with computer-generated poetry aren’t new, but for creative works wrought from the human mind, Perl has apparently been the language of choice. You’ll find poems written about Perl, poetry generators for Perl, Perl poems as April Fool’s jokes, and translations such as “Jabberwocky” rendered in (non-functional) Perl. The go-to text in the field is writer and software tester Sharon Hopkins’ 1992 conference paper and mini-anthology “Camels and Needles: Computer Poetry Meets the Perl Programming Language.”

A Spanish engineer and software developer also put out a call in 2012 for contributors to code {poems}, an anthology of verse in such languages as C++, Python, DOS, Ruby, and HTML. The poems couldn’t just be goofs, though; they had to run or compile. An April 2013 Wired story showcases one of the entries: “Creation?”, a poem in Python by Kenny Brown.
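I haven’t quoted “Creation?” here, but the constraint is easy to illustrate. The sketch below is my own toy example, not one of the published entries, and every name in it is invented; it just shows the sort of thing that satisfies the “it has to run” rule in Python:

# A toy code-poem, invented for illustration only; it runs under Python 3.
tide = ["rises", "turns", "falls"]

# The loop is the meter: one printed line per motion of the tide.
for motion in tide:
    print("the sea", motion)

# The conditional is the closing couplet.
if tide:
    print("and the shore remembers")
else:
    print("and the shore forgets")

Run it and you get four short lines; read it and it scans, more or less. The charm of the genre is that both readings have to work at once.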

I love this. There’s great creativity here—and a reminder that computers speak only the languages we give them.

* * *

In “Cryptogams and the NSA,” which I’m assuming is not fiction, John Sifton of Human Rights Watch recounts how he was indicted in 2011 after he tweaked the NSA by emailing himself snippets of James Joyce and Gerard Manley Hopkins from a proxy server in Peshawar:

“There are a lot of references to mushrooms and yeast in Joyce,” I said. My attorney touched my arm lightly, but I ran on.

“Look—” I took the book up, “There’s a part late in the book. . . Here, page 613. Halfway down the page.” I pushed it across to Fitzgerald:

A spathe of calyptrous glume involucrumines the perinanthean Amenta: fungoalgaceous muscafilicial graminopalmular planteon; of increasing, livivorous, feelful thinkamalinks; luxuriotiating everywhencewhithersoever among skullhullows and charnelcysts of a weedwastewoldwevild. . . 

“See? Fungoalgaceous muscafilicial,” I said. “It’s a portmanteau of different types of cryptogams.”

The stenographer interrupted here, her north Baltimore accent like a knitting needle stuck in my ear. “Are those words in that book?” she asked, “Because – otherwise you’re going to have to spell them.”

She was waved off by one of the US attorneys.

Fitzgerald read the text, or looked at the letters anyway, and then he looked at me again. A kind, blank, innocent look. Unaware of the fear he was instilling in me, not knowing what he was doing, he suddenly twisted the knife.

“And why would someone write like this?”

My silence now. “Why?” I repeated, meekly. I was devastated.

“Just your opinion. A short explanation.” Absolute innocence in asking the question.

My hands began trembling. One of his assistants looked at the clock.

“I don’t know, sir – honestly I don’t.”

“And why would someone write like this?” Because it’s fun; because it’s artful; because government exists not to perpetuate itself, but to protect these odd, wonderful flourishes of civilization. And because it helps us know who the androids are.

“Cover my eyes and ears, ’til it all disappears…”

“I feel like I spent the day scooping out portions of Mondoville’s memory—lobotomizing an educational institution,” writes Prof Mondo, lamenting a book-cull at his small college library:

We’re getting rid of some 25,000 volumes, somewhere between a quarter and a third of our overall holdings. To be fair, something had to be done. Our building is simply inadequate for our collection, many of the books are obsolescent, and many others hadn’t been opened in years — indeed, a colleague of mine found a set of Thomas Hardy’s works, many of which had unopened pages. The library has been held together with spit and baling wire, thanks to an overworked, underpaid, and insanely dedicated staff.

Furthermore, our students are ever less likely to venture into the stacks. They do their research online, relying on the library’s online databases to find articles and such.

The good prof finds the cull troubling for many reasons, but he ends on this desolate note:

Finally, there was the sense that I was engaged in a kind of intellectual Black Mass, inverting the sacrament that I was meant to perform. I love my students, but I also love the worlds of literature and ideas; indeed, I show my love to my students by offering them these other things I value so much. These books, these ideas in them, matter so much to me that I’m devoting my life to the business of letting those stories and ideas survive another generation. But instead, I spent today making it that much less likely that a Mondovillian might encounter someone’s story or idea, even through a confluence of idleness and serendipity. Education is meant to help the mind grow, and I see libraries as symbols of the growth that has gone before us. Instead, I spent today making our symbol shrink. I couldn’t shake the feeling that this was the opposite of what I do.

Also today, at The Atlantic, Megan McArdle makes a not-unrelated observation:

Today, according to Amazon, eBooks have surpassed print books entirely; they are selling more Kindle editions than they are selling from all of their print formats combined. Since April 1st, they’ve sold 105 Kindle books for every 100 print editions.

The speed is remarkable, but the outcome doesn’t surprise me.  I buy almost everything for Kindle now, unless it doesn’t have a Kindle edition, or it has lots of pictures that I want to examine in detail.  Which is to say, not many.  Frequently, if it doesn’t have a Kindle edition, I don’t order it at all.

McArdle is generalizing about trends in reading solely from her own experience, but I don’t mind countering with anecdotes of my own.

* * *

For example, if a pundit needed to research the background of the Icelandic financial crisis, the 2010 book Wasteland with Words: A Social History of Iceland might be a boon. Unfortunately, it’s not available as an e-book. Neither is The Islander: A Biography of Halldór Laxness, the first English-language bio of the author who brought Icelandic culture to the notice of the world. A clever pundit might know to allude to his novels.

If you’re dabbling in verse, The New Princeton Encyclopedia of Poetry and Poetics is indispensable (and addictively browseable). Many of its entries contain better, more, or just different information than you’ll find online. This 1,383-page tome has been in print for nearly 20 years, and apparently it still sells well, but there’s no Kindle edition.

For several years, I’ve wanted my students to read Brian Stone’s translation of the Alliterative Morte Arthure. I don’t know why Penguin Classics let it fall out of print. Fortunately, you can buy it used for two bucks or read it for free in hundreds of North American libraries. There’s no Kindle edition.

Last Thanksgiving, I made jawārish, a carrot jam from a 13th-century Islamic cookbook. Published in 2009, Medieval Cuisine of the Islamic World is packed with neat recipes and commentary. There’s no Kindle edition.

* * *

“But wait,” I hear yon straw man cry, “who cares about Icelandic social history? Who but you wants to read an encyclopedia entry about the Ultraism movement in Spanish poetry? And seriously, dude, medieval Islamic carrot jam?”

The digital age is supposed to help all of us pursue our passions and explore our intellectual interests. Ostensibly smart people—journalists, especially—shouldn’t endorse only what’s mainstream or popular, or shut out sources of information just because they don’t feel novel.

It’s troubling for a pundit at The Atlantic to say, essentially, “If it doesn’t exist for my cool new e-reader, then as far as I’m concerned, it doesn’t exist.” That’s an admission of willful ignorance—and we already have problems with journalists who can’t see beyond their own worlds.

Besides, medieval Islamic carrot jam is tasty.

* * *

“You must be a Luddite!” Guess again, scarecrow. I share my home with thousands of books, but I’m increasingly unsentimental about them. Becoming Charlemagne is doing well on the Kindle, I’ve self-published an e-book of a translation of a medieval romance, I’m reading Ulysses on my smartphone, and I’m in the market for a 10-inch Android tablet for reading and storing academic PDFs. Liking technology doesn’t make you anti-print. You can be pro-both.

* * *

Another rustle from the straw: “Eventually, everything will be online!”

Verily, I say unto you: Are you so positive that we’ll have several more decades of the stability and prosperity required to digitize “everything” that you’ll bet centuries of accumulated knowledge on it?

I fled grad school 13 years ago, but I’d love to be a budding medievalist now, when I can access online dictionaries for Latin, Old English, and Old Icelandic and browse the Monumenta Germaniae Historica without schlepping over to campus. I’m keenly aware of how much progress universities, government agencies, corporations, and museums have made in digitizing material that many dismiss as obscure.

And yet, two years ago, at the National Park Service archive, I glimpsed just how far we have to go. Around 2,000 of the best photos in their historic image collection are online, but their physical archive holds millions of objects, including posters, newsletters, snapshots, and un-photographed doodads like vintage ranger uniforms. The entire collection was overseen by just two employees. When they weren’t scrambling to fulfill never-ending requests from commercial publishers and calendar makers, they occasionally found a moment to scan some old slides. At this rate, unless a legislator takes up their cause, most of their collection will languish forever in file drawers.

So if you’re a pundit, a historian, or a photo editor and you’re relying on digitized stuff to tell a story, you’re likely spinning the same yarn as everyone else. To tell a bigger story, to show or say something new, you’ll need to push away from the computer and patiently seek out an archive.

* * *

Megan McArdle concludes:

What will happen to the pleasures of pulling a random book from the shelves of a home where you are a weekend guest?

They’ll be replaced by other pleasures, like instant gratification.  And it’s probably more gain than loss.  But I’m just a little bit sad, all the same.

It’s not just about “pleasures.” What about the brainy kid whose parents are too poor, too disdainful of education, or just too ignorant to give him a Kindle or an iPad? Yes, nearly anyone who wants Internet access can get it, and inquisitive kids are resourceful kids, and the Internet offers brilliant opportunities for intellectual exploration—but there’s no reason to diminish or destroy one convenient, low-tech, time-tested way to feed the brain.

“But you know,” croaks yon straw man, flailing his arms, “it’s expensive to store books in a big building and pay for a staff to maintain them.” Of course it is—but preserving and propagating knowledge is a core function of a college or university. Most American campuses have dozens of costlier programs and facilities that would wither if anyone demanded they justify their educational merit.

Harvard isn’t trashing a quarter to one-third of the books in its libraries or turning them into glorified Internet cafes. If the college your kid attends is, you may want to ask a dean why they assume their graduates will never compete against kids with big-name degrees. (You might also ask them: “Would you send your child here?”)

* * *

But then why would most people associate libraries with learning anymore? Ads in D.C. Metro stations tout public libraries as places to take yoga classes and hold meetings, and the library system’s website assures the aliterate that a new library “offers more than just books.” (Whew! No one will think you’re a nerd!)

My own neighborhood branch is extremely popular, and the staff is terrific, but when lawyers in million-dollar homes use their library cards to check out government-subsidized Backyardigans DVDs for their kids, we aren’t exactly living the Carnegie dream.

* * *

Maybe there’s hope. In November, I sat in a bayou and beguiled my seven-year-old nephew with the exploits of Beowulf. Last week, by phone, he told me that during a recent visit to the local library, his quest for a sufficiently gory version of Beowulf led him to books about Theseus and the minotaur, the labors of Hercules, and Odin and Loki.

These books may change the course of his life; they may be a fad. Either way, a first-grader in rural Louisiana senses what pundits and college administrators forget: Random access to analog information is a freedom all its own. The Internet is wondrous, and e-readers are great, but if you let technology circumscribe and define your intellectual world, you literally won’t ever know what you’ve missed.