About Scott Eric Kaufman
Scott Eric Kaufman is an English graduate student at the University of California, Irvine. He earned his B.A. from Louisiana State University and hopes to earn his Ph.D. sometime before his funding evaporates. He had one fancy title: Senior Instructor of Literary Journalism. He is currently working tirelessly on his dissertation. His scholarly interests include everything—and he means everything—pertaining to American literature and appropriations of evolutionary theory c. 1890-1910. His blather can also be read on The Valve.
Posts by Scott Eric Kaufman
Tuesday, February 10, 2009
Medievalists can now be as lazy as Americanists.
“Searching for medieval manuscripts gets you millions of hits, most of which have nothing to do with manuscripts, and when they do, they usually feature only images of a single page rather than the entire book,” said Matthew Fisher, an assistant professor of English at UCLA.
Fisher set out two years ago to remedy the situation. With the assistance of two graduate students in English, a computer developer from UCLA’s Center for Digital Humanities and Christopher Baswell, a former UCLA professor of English, Fisher decided to collect links to every manuscript from the eighth to the 15th century that had been fully digitized by any library, archive, institute or private owner anywhere in the world.
Medievalists with a paleographical or codicological approach now have their own Google Book Search. (And Carl Pyrdum now has more grist for his award-winning mill.) When I started the dissertation, Google Book Search was in its infancy and practically useless; by the time I filed, I could check variant editions of the novels I discussed in my first two chapters. But even before the final rewrites, I’d found myself forever indebted to some anonymous lackey at the University of Michigan: my third chapter turned on the sudden availability of every word Silas Weir Mitchell wrote via Google instead of ILL, and my fifth wouldn’t have been completed had I not been able to skip from one edition of Pudd’nhead Wilson to the next. (The differences aren’t significant. I’m simply paranoid.)
(Full disclosure: I’m married to someone who works with someone involved in this.)
Sunday, February 08, 2009
The rubble of a prolonged catharsis.
Checking to confirm that “proud cock” is two words, not one, so that I might strut it out, I came across the entry for “beast language” in the Dictionary of Poetic Terms. The diction of the entry is either unconscionably vague or deliciously derisive:
Theoretically, it seems the language of the poem moves backward from conceptual words to an intelligence of pure sound utterance that calls attention to the form, meaning, and derivation of the word. The form is probably meant to be experienced more than intellectually understood, an experience that presumably links the reader with his own senses and the animal origins of language.
Since my knowledge of post-1940 American literature amounts to what I’ve read for pleasure and the Lord set Sunday aside for whimsical researches, I decided to find out whether the entry ought to be applauded or condemned. First up, according to Google Book Search, was a work I inadvertently committed to the memory hole: William Gass’s The World Within the Word. In “Food and Beast Language,” Gass talks about something else entirely; namely, the forced materiality of Henry Miller’s prose. Of its effect on Miller’s signature subject, Gass concludes:
Continue reading "The rubble of a prolonged catharsis."
Thursday, February 05, 2009
I’m fairly certain they meant that as a compliment.
Via io9, I learn that one of our authors “deserves more literary fame as a post-modern trickster.” I wonder who that could be? (I also learned that I’m about three novels behind. I blame the dissertation.)
Sunday, February 01, 2009
On the pitfalls of stylistic uniformity, Part I
I should begin by thanking my drive-by insult-smith for reminding me what I’d written about Gene Wolfe four years back, because it should’ve been the foundation of the Updike post. In that earlier post, I claimed Wolfe suffered from
something that needs a better name (or an agglutinative one) than “brilliant-one-trick-pony syndrome.” What I mean is a stylist who employs the same breathtaking style in every single thing he or she writes. Some would accuse David Foster Wallace of being one such stylist. But his novels, shorts and essays are focalized through a variety of characters. Because each of his characters speaks with a unique voice, his overall style remains heterogeneous despite his penchant for footnotes and sentences of Faulknerian length and complexity. (Brief Interviews With Hideous Men works as a perfect litmus test: no one “interviewee” sounds like any of the others or, for that matter, Hal from Infinite Jest.)
Gene Wolfe, however, suffers mightily from brilliant-one-trick-pony syndrome. It doesn’t influence my impression of any one, two or three of his novels, but once some critical point has been passed the cumulative effect of his prose stylings falters before the law of diminishing returns. [Consider] The Fifth Head of Cerberus. It begins when Severian—I’m kidding. Severian isn’t in this collection. But he could be if you judged by narrative voice alone.
Here’s our own Adam Roberts, responding to that point in 2005:
I was very struck by [Scott’s] original point about stylistic monotony. It’s not that Wolfe is a bad writer, but that he is a writer incapable of changing (or perhaps disinclined to change) his writing style. There is a flatness to reading long stretches of Wolfe; and I don’t just mean late Wolfe, the Long Sun and Short Sun books where he falls lazily back on endless elaboration couched in the form of dialogue. The whole corpus: it’s all so stylistically monologic.
Some writers develop a laziness born of talent: once they’ve mastered their idiom, they elaborate on their strengths instead of confronting their weaknesses. Here’s Adam responding to a very similar stylist today:
That’s where his genius was—his extraordinary, fluent, particularised style; the way he evoked the specificity of detail. But one of the things that follows from this is that his larger artistic project stands or falls on whether we consider the details adequate to the business of representing experience. Updike’s whole corpus is a way of answering this question with: they are; indeed, there’s really nothing more than the details. His stuff is overwritten, but in the way the Ode: to Autumn is overwritten. Of course you may feel that a writer needs something more than the details; that s/he needs a panoramic ability, or at least a larger vision. But I’m not so sure. Which is to say; I wonder if, when we look back on the second half of the twentieth century, we won’t find ourselves saying: that was the age in which people became queerly obsessed with details and minutiae, and the larger patterns faded from public consciousness. If so, then Updike captured the spirit of the age better than almost anyone.
I think Adam’s correct: Updike’s precision limited the kinds of narratives he could tell to the kind of narratives he told. If an author aims to catalog the lives of a particular class of people without sounding like An American Tragedy-era Dreiser, aping Updike is the way to go.
Before this becomes a general tussle about whether form governs content or content dictates form, I’ll say that self-conscious stylists like Updike don’t make for good generalizations. The argument here should be whether Updike, consciously or otherwise, selected his subjects because they complemented his style; or whether his style dictated who he could write about.
The almost unbearable flatness of his female characters leads me to believe the latter. His precision depends upon an intimacy a person can only have with him- or herself. Had he been more self-conscious about the phrasing of the words he put in other people’s mouths, his narrators wouldn’t have spoken so similarly. Had he been more self-conscious about the tenor of the thoughts he stuffed in other people’s heads, his narrators wouldn’t have been dogged by the same fears to the same ends. Put differently: Updike worked to refine the voice he knew he had, and why not?
It was a magnificent tool for telling the stories it led him to believe he wanted to tell.
(Part II to follow.)
Thursday, January 29, 2009
Must we still pretend to like John Updike?
For the moment, I’m going to pretend I’ve never read an entire novel by John Updike and judge his literary legacy on the basis of one paragraph singled out as representative of the awfulness of his prose. The passage, we are told, typifies his habit of “vacillat[ing] from the tedious to the atrocious,” scoring “somewhere between Thomas Hardy and Kate Chopin on the soporific scale,” and reads thus:
Men emerge pale from the little printing plant at four sharp, ghosts for an instant, blinking, until the outdoor light overcomes the look of constant indoor light clinging to them. In winter, Pine Street at this hour is dark, darkness presses down early from the mountain that hangs above the stagnant city of Brewer; but now in summer the granite curbs starred with mica and the row houses differentiated by speckled bastard sidings and the hopeful small porches with their jigsaw brackets and gray milk-bottle boxes and the sooty ginkgo trees and the banking curbside cars wince beneath a brilliance like a frozen explosion.
There’s much Updike wrote I won’t defend—Toward the End of Time deserved the slagging it received—but for Young Master Shapiro to choose, from a hefty body of work, the opening paragraph of Rabbit Redux to bury Updike beneath should stand as the object lesson in why movement conservatives whose tastes range from Forsyth to Uris ought not be writing about literature. I’m loath to even defend it, as it needs no defense, but here goes:
Men emerge pale from the little printing plant at four sharp, ghosts for an instant, blinking, until the outdoor light overcomes the look of constant indoor light clinging to them.
Heavy alliteration on the “p” plays to the plodding of the pale people who emerge from the printing plant. The sentence turns on a dime, dropping the alliteration and transforming the men into “ghosts for an instant.” That instant lasts the space of the following comma—the blink—and the blinking strips them of their ghostliness. Needless to say, “ghostliness” describes a thing one is, not a quality one has, but Updike’s inverting the effect here—the men appear ghostly to each other as their eyes adjust to the light, but Updike would have us believe they become ghostly, only to rematerialize as daylight strips the indoor light from their bodies.
In winter, Pine Street at this hour is dark, darkness presses down early from the mountain that hangs above the stagnant city of Brewer; but now in summer the granite curbs starred with mica and the row houses differentiated by speckled bastard sidings and the hopeful small porches with their jigsaw brackets and gray milk-bottle boxes and the sooty ginkgo trees and the banking curbside cars wince beneath a brilliance like a frozen explosion.
More inversion: Updike opens with the dark wintry mood in a clause that hangs above everything after the semi-colon the way the mountain “hangs above the stagnant city of Brewer.” The sentence then shifts into a higher gear. We know Updike can set off dependent clauses with a comma—he did it with “in winter”—so when he lets “but now in summer” fly, we feel the acceleration as he speeds through those conjunctive clauses right into a “frozen explosion.” Not that I want to sound like a student—“the way the author uses diction”—but look at the way the author uses diction here: the stolidity of the “granite curbs” is undermined by mica starring it; the aspirations of the small porches dashed by a pervasive grayness; &c. Only, not &c., if you follow Shapiro’s logic:
I am sorry, but reading books is what I do, and I have read literaly [sic] thousands of them. That first paragraph of Updike’s on this post is absolute garbage. It is unbelievably pretentious, it is riddled with ridiculous adjectives, and it is as though he is a bad poet trying to sound avant garde and choosing words indiscriminately out of a Thesaurus.
The misspelling, comma-splicing, German-nouning man could not be any more wrong. There may be one too many words up there, but I doubt they came from a thesaurus. (Because believe you me, I know from thesaurus.)
Tuesday, January 27, 2009
Lesson Planning 101: How to teach film responsibly in a composition class
Lest you think I’m publishing a long introduction-to-film-studies-type scene analysis for no reason, I had a few English people ask me how I taught film after I posted my syllabus last quarter. So I thought as long as I’m prepping for Tuesday anyway, I might could help a few folk out. I’m not an expert in film theory, so if you’re looking for something along those lines, I suggest you head over to yonder blog and consult its illustrious roll. But if you want a workmanlike approach to teaching basic film vocabulary in a composition class, you could do worse. (Albeit not much.)
Because I’m one of those cultural studies loons who believe that popular means culturally significant, the film I’m teaching is The Dark Knight. The scene I’ve tasked my students to analyze begins 1 hour and 24 minutes into the film. (The link points off-site because I couldn’t coax The Valve into posting those pictures correctly.)
Monday, January 12, 2009
Who dare not call this literature?
Monday, January 05, 2009
Teaching the Overdetermined Image
As anyone who teaches funny books or films knows, the task of convincing students that the scene before them is anything other than incidental would try Job’s patience. You show them a panel from the surprisingly awful Superman and Batman vs. Aliens and Predator like, say, this
Continue reading "Teaching the Overdetermined Image"
Sunday, January 04, 2009
It’s always already been the end of epic film.
Whether he knows it or not—and “he” being Adam Kotsko, I’ll bet he knows it—this Weblog post is less about the formal fit between epic and the television serial than the relation of film to the episodic form. I know that sounds backwards—what with MOVIES! being PRESENTED! on SCREENS! the SIZE! of WYOMING!—but the compounded facts of run time and the modern American attention span necessitate we consider film the proper realm of the self-contained episode. Even films which promise sequels announce their completion in terms of whatever -ology they embrace.
Films should be about something in the original, locative sense of the word. They should surround some subject matter, be “on every side” “wholly or partially,” as per the OED. They should be self-contained. Not that they shouldn’t be sweeping—you can frame Guernica or a sublimely panoramic view of the Hudson River and slap it on a gallery wall without robbing either of its sweep—but they should recognize their formal limitations. Films can only intimate narrative epicness. They can’t achieve it.
“But But But!”
Try me. Start listing epic films and I’ll start listing films with grandiose tableaux. The Lord of the Rings? Shot in that sewer of New Zealand. Blade Runner? The Lord himself envies Ridley Scott’s matte painters. With film we confuse the formal qualities of narrative epic for the GIANT! SCALE! presented by the movie screen. Cases in point: Iron Man and The Dark Knight.
Both were hailed as epic upon release, and yet both are far superior films on the small screen. Before you ask: I do remember what I wrote about The Dark Knight on IMAX, and inasmuch as it relates the experience of watching an obscenely high-quality image projected on the side of an eight-story building, I stand by it. Watching the film on a small screen—one on which a bug of a Batman glides between five-inch tall skyscrapers while Heath Ledger’s Joker licks human-sized lips and establishes human-sized eye-contact—it’s impossible to deny that this supposedly epic performance is better suited to the televisual medium. (This goes doubly for Iron Man, which barely passes for “good” on the big screen but shines when we connect with Robert Downey Jr. as a human actor in a corporate world.)
Not that I think we should deny that the serial drama is also better served on the small screen. A solidly written, solidly acted television show can be a better film than most films. To wit: having finished the first four episodes of the blogosphere’s own Leverage, I can’t help but wonder what went so terribly wrong with Ocean’s Twelve and Thirteen.
Thursday, January 01, 2009
Congratulations, Mr. Bady
Aaron nabbed the 2008 Cliopatria Award for “Best Writer.” I say “nabbed,” but in truth, he earned it—I wouldn’t have asked him to contribute if I didn’t think he’d land us swag. Congratulations to Aaron and all the winners:
Best Group Blog: The Edge of the American West
Best Individual Blog: Northwest History
Best New Blog: Wynken de Worde
Best Post: Claire Potter, Tenured Radical, “What Would Natalie Zemon Davis Do?”
Best Series of Posts: Tim Abbott on Trumbull’s The Death of General Montgomery, Jan. 12, Jan. 13, Jan. 14, Jan. 17, Jan. 18.
Wednesday, December 31, 2008
Happy Year of the Depend Adult Undergarment!
I pray neither you nor yours hear the squeak before having the opportunity to wear one for a few decades.
Friday, December 26, 2008
Harold Pinter, RIP
Harold Pinter—Undeserving Laureate of a Prize that Doesn’t Matter Anymore Because Who Still Reads Literature Anyway?—died yesterday after a long struggle with esophageal cancer. He will be missed.
Monday, December 22, 2008
A Pre-MLA Preview of the Annual Post-MLA Article
Every year more than 10,000 literature scholars gather at the end of December for the convention of the Modern Language Association, the 124th of which begins next week in San Francisco.
Past conventions have yielded papers with titles that were rife with bad puns, cute pop-culture references and an adolescent preoccupation with sex, from “Neo-Victorian Buggery” to “Bambi as a Bottom” and the tragically hip “I Never Got Tenure (but I Owe My Job to Jay-Z): Capital-T Theory, Hip-Hop Culture, and Some Thoughts About the Role of Literature in Contemporary Literary Studies.”
Founded in 1883, the Modern Language Association barely registered on the public consciousness for its first century. Professors attended to doze through papers about Chaucer and Emerson, schmooze one another and lobby for posts at more prestigious campuses. But in the 1980’s the conference became the site of annual skirmishes between old-school traditionalists and the increasingly powerful new breed of postmodernists, multiculturalists, feminists and queer-theory advocates.
Basking in this unaccustomed level of public notice, Modern Language Association scholars brought increasingly attention-grabbing papers to the convention through the 1990’s, “queering” the “canon,” some said, and championing the “postcolonial,” proposing wild theories about everything from comic books to hip-hop to television and movies. Last year, perhaps hoping to put a stop to the trend, the Chronicle of Higher Education announced its first Annual Awards for Self-Consciously Provocative M.L.A. Paper Titles (a k a the Provokies) but in 2004 the Chronicle decided to drop the awards. Scott McLemee, a senior writer at Inside Higher Ed, explained that “crafting titles to get them written about and attacked in the press used to be exciting.
“Now it’s become a reflex, and their hearts aren’t really in it anymore.”
Not only are titles no longer intended to amuse, from the looks of this year’s several thousand entries, absolutely nothing of any importance is studied by scholars who present at the MLA. From “Nabokov’s Self-Translations” to an entire panel devoted to African literature, these scholars embrace topics no right-thinking person cares about. Would Joe the Plumber attend a talk on “William Faulkner’s Rural Modernism”? Would Tito the Builder enjoy a twenty-minute talk on “History and Memory in [James Joyce’s] ‘The Dead’”? Does Joe Sixpack even know what PMLA is, much less want to be published in it? Why then would he attend the roundtable discussion “How to Get Published in PMLA”? While most Americans never bothered to acquaint themselves with old readings of Renaissance texts, the eggheads at the MLA insist on producing “New Readings of Renaissance Texts.”
And there’s much, much more. But all of it is about unimportant nonsense.
Sunday, December 21, 2008
Combobulated: Being a Play in Which We Laugh at Arrogant Undergraduates
(In a small classroom, a young professor is discussing an R.P. Blackmur essay on Shakespeare’s sonnets with a group of twelve or so students.)
TEACHER: Blackmur claims “the hues attract, draw, steal men’s eyes, but penetrate, discombobulate, amaze the souls or psyches of women.” What does he mean by that?
TEACHER: Break his sentence down. What does “discombobulate” mean?
STUDENT #1: Bored?
TEACHER: So Shakespeare’s language penetrates the souls of women by boring them? (two engineering majors giggle) How do you amaze someone by boring them?
STUDENT #2: (confidently) It’s a technical term from Switzerland. Watchmakers call the tiny gears inside a watch “bobulates” (beaming) and what a watchmaker does is he brings the bobulates together, and “com” is the Latin for “together.” So the proper technical term for this watch here (points to his wrist), or any working watch, is to say it’s “combobulated.” But over the life of a watch, it gets knocked around, and the gears get unaligned, and when that happens the watch becomes “discombobulated.”
TEACHER: Not “disbobulated”?
STUDENT #2: That’s what I said, but he told me--
TEACHER: He who?
STUDENT #2: My rabbi.
TEACHER: I see.
STUDENT #2: He said the Swiss wouldn’t be taken seriously if they didn’t keep the Latin in there, because “bobulate” sounds silly enough without the Latin prefix.
TEACHER: Isn’t “dis” a Latin prefix?
STUDENT #2: I didn’t know that then.
TEACHER: So what do you think Blackmur meant?
STUDENT #2: ...?
I still don’t know what Blackmur meant--nor why my rabbi conspired with The Future to punk me--but as the MLA approaches, I’m increasingly convinced that the first time I ever spoke up in class foreshadowed some ominous end to my academic career.* So while I’m not exactly sure what end this start augurs, I take comfort in the fact that Dickens didn’t know what he’d foreshadowed for Pip when he wrote the first installment of Great Expectations.** (Or he wouldn’t have written two endings.)
*The other lesson? Never trust the Jews.
**Not that scholars have written much about this. The only exception I can think of is about Buffy--but that might be because I only dipped my toe in Dickensian waters. (Work on Wharton’s serialized novels focuses on how she altered the plot or how she mimicked James, so even though I should’ve encountered something about it researching my Wharton chapter, I didn’t.)
Tuesday, December 09, 2008
“If you liked Annie, you’ll love Rags to Riches.”*
Today I learned that as often as I throw around the phrase “Horatio Alger novels,” I’d be hard-pressed to list many works that fit the bill. There are, of course, novels written by Horatio Alger, but even they only qualify on a technicality. (Plus, not all of Alger’s novels rely on pluck and luck to drive the narrative.) Not that his novels weren’t enormously popular, as over 17,000,000 copies were sold from the 1860s through the 1880s. Nor was their uplifting ideology incidental, as the popularity of C. B. Seymour’s Self-Made Man (1858) and Freeman Hunt’s Lives of American Merchants (1858) attest.
But as pervasive as Alger’s rags-to-riches ethos is assumed to be, I can’t think of many novels which present—much less endorse—it. Literary scholars prefer to toss off references to humble bootstrappers as if thousands upon thousands of novels described their ascent up the social ladder. Maybe there are, but most encounters with the phrase “Horatio Alger novels” are purely contrary. We have the first fifteen chapters of every Jack London novel. We have Robert Penn Warren calling Theodore Dreiser’s The Financier an attempt to modernize the Horatio Alger myth (Homage 56). We have Richard Wright identifying it as the locus classicus of capitalist mystification in Black Boy:
I dreamed of going north and writing books, novels. The North symbolized to me all that I had not felt and seen; it had no relation whatever to what actually existed. Yet, by imagining a place where everything was possible, I kept hope alive in me. But where had I got this notion of doing something in the future, of going away from home and accomplishing something that would be recognized by others? I had, of course, read my Horatio Alger stories, my pulp stories, and I knew my Get-Rich-Quick Wallingford series from cover to cover, though I had sense enough not to hope to get rich; even to my naive imagination that possibility was too remote. (161)
What we have isn’t a robust literary tradition of Horatio Alger-type novels so much as a steady stream of anti-Horatio Alger-type novels beginning with the last chapter of every London novel and Dreiser’s Trilogy of Desire (about Frank Algernon Cowperwood) and continuing through naturalist eviscerations of pat moral didacticism in Wright and Saul Bellow’s The Adventures of Augie March. Whither the tradition they counter? Wright’s autobiography and Bellow’s autobiographical novel clearly respond to the narrative form of Alger’s novels, but that brings us back to Alger as the only author proper to his category.
I’m tempted to say that this is yet another case of savvy critics trying to account for the popularity of atrocious novels by appealing to a grand ideology that predisposed a given audience to swallow shallow tripe—but then my student, the one who humbled me this afternoon, wouldn’t know the names of any earnest iterations of the Horatio Alger narrative. Any suggestions?
*Being the tagline of NBC’s ill-fated attempt to hijack Annie’s bandwagon.