About Scott Eric Kaufman
Scott Eric Kaufman is an English graduate student at the University of California, Irvine. He earned his B.A. from Louisiana State University and hopes to earn his Ph.D. sometime before his funding evaporates. He had one fancy title: Senior Instructor of Literary Journalism. He is currently working tirelessly on his dissertation. His scholarly interests include everything—and he means everything—pertaining to American literature and appropriations of evolutionary theory c. 1890-1910. His blather can also be read on The Valve.
Posts by Scott Eric Kaufman
Monday, January 18, 2010
Don Draper is, of course, never himself.
Let me open with a quick clarification about the previous Mad Men post. As to the purview of self-fashioning, we all do it. In blog terms, you know me as this guy, i.e. the one who caught those students, made that other one extremely uncomfortable, is frequently victimized by the library, hid his cancer from his wife, etc. Those are the stories I tell about myself to explain myself to myself. To quote Gertrude Stein from Everybody’s Autobiography:
Identity is funny being yourself is funny as you are never yourself to yourself except as you remember yourself and then of course you do not believe yourself. That is really the trouble with an autobiography you do not of course really believe yourself why should you, you know so well so very well that it is not yourself, it could not be yourself because you cannot remember right and if you do remember right it does not sound right and of course it does not sound right because it is not right. You are of course never yourself.
The phrase “of course” captures the central irony of all self-fashioning: we know, of course, that we are more than the sum total of the stories we tell ourselves about ourselves, and yet we only understand ourselves, and can only be understood by others, through those stories. In case you ever wanted to know why narrative diversity is important, there you have it: the more narrative modes available, the more possible understandings of themselves the people who encounter them can have. This is self-fashioning at its most mundane, and in terms of Mad Men, this is why Peggy Olson becomes more modern: once she understands herself in terms of the upwardly mobile career-oriented woman, the audience understands her frustrations in terms of the conflict between that meritocratic fantasy and the realities of being a woman in a male-dominated working environment. She becomes more recognizably modern not because the world she inhabits does, but because the way she responds to that changing world elicits a chorus of “of courses.”
Continue reading "Don Draper is, of course, never himself."
Friday, January 15, 2010
Don Draper as an unraptured Emma Bovary
(x-posted twice-over because this is the closest I’ve come to actual literary analysis, as opposed to comp-rhet material, in ages. As you might can tell, I’ve had some difficulties thinking of myself as a proper literary scholar of late, and have absented myself from these parts because of it. What can I say—other than when the profession refuses to treat you like what it trained you to be, you stop thinking of yourself in its terms. But enough of that. Here, have a post!)
As I noted in the comments to this post, it was only a matter of time before I started Mad Men; however, as I’ve studiously avoided reading about the show for the better part of two years now, I’m not sure my insights into it will be all that insightful. Still, I’ll soldier on, with the caveat that I’m about to watch the eighth episode of the most recent season and would rather not have it spoiled. Not, mind you, that I think it could be, as one of the defining features of the show is the thundering predictability of its characters. That’s not an indictment of Matt Weiner or his writing staff, merely an acknowledgment of the show’s central conceit: these are people who want to be left behind when the rest of the world is raptured by history—at least at first.
Continue reading "Don Draper as an unraptured Emma Bovary"
Monday, December 21, 2009
‘Avatar’ and the War of Genres
(The following is a guest post by my friend Gerry Canavan, who had the insane notion that a literary scholar might want to tackle a cultural artifact on literary terms instead of, say, overtly political ones.)
I saw Avatar last night and I thought at first I didn’t have all that much to say about it. I was prepared to shamelessly steal a friend’s thesis that this is really all about video games, but I see Kotaku already did that. In the face of column after column centered on a nominally leftist reading of the film as Dances with Wolves… in Space!, SEK has already provided a more nuanced consideration of the film’s racial—and racist—dimensions. Posts about the backlash of the backlash and the backlash of the backlash of the backlash have already been taken care of.
I’m not all that interested in the special effects, which, perhaps due to some projection issues in our theater, didn’t seem to be quite as spellbinding as advertised. The language stuff interests me more, but seems ultimately somewhat empty. “And congratulations to Cameron for taking us from a figuratively to a literally inhuman standard of slenderness for women” seems to cover it with regard to feminist critique of the Barbie-doll-shaped Na’vi.
The religious element, while not especially original, is, from a materialist standpoint, pretty deeply problematic, and badly damages the film’s ecological politics, which frankly are not all that well thought-out in the first place.
But in the theater and as I sat down to write this post I mostly found myself preoccupied with the genre question. I don’t want to recapitulate the genre post I wrote for Infinite Summer, but in brief this is how Darko Suvin approaches SF:
Continue reading "‘Avatar’ and the War of Genres"
Monday, November 30, 2009
Some novel called Yellow Blue Tibia or somesuch.
Let me begin by agreeing with Kim Stanley Robinson:
[T]his year the [Booker] prize should probably go to a science fiction comedy called Yellow Blue Tibia, by Adam Roberts.
I say this not because Adam’s a personal friend (although he is), and not because I’ve edited some of his other novels (although I have), but because it actually is the most intriguing novel I’ve read this year. Admittedly, I can’t say whether it’s the best novel published in 2009, because I only read three novels published this year (The City and the City, Inherent Vice, and Asterios Polyp), so I’m limited to saying that Yellow Blue Tibia merely outpaces the latest by Mieville and Pynchon, as well as David Mazzucchelli’s decade-in-the-making masterpiece. A quick plot summary before moving on to what makes the book sing.
In 1946, Josef Stalin ordered Konstantin Skvorecky, Ivan Frenkel, and a few other Russian science fiction writers to create a new threat against which the Soviet people could unite (as they had against Germany). They concoct a plot in which invisible radiation aliens invade the U.S.S.R., but it opens when “The Americans launch a rocket to explore space [and the] aliens destroy it with a beam of focused destructive radiation ... Then the aliens blow up a portion of the Ukraine, and poison the ground with radiation” (25). Before they can sketch the invasion out in greater detail, Stalin disbands the group. Years pass. Frenkel accidentally reconnects with Skvorecky shortly before the Challenger disaster. The plan they concocted for Stalin seems to be coming true. Skvorecky, a translator, meets two American scientologists and a Muscovite taxi driver named Ivan Saltykov. There is a murder. Someone or something threatens Chernobyl. Love happens.
That is not, I grant, the most conventional summary of the novel—if they’re more your bag you can try here, here or here, or if you’re feeling more adventurous, here—but for me to say more would not despoil the novel so much as ruin the pleasure afforded by Adam’s narrative gamesmanship. I’m more than happy to spoil a simple plot point, but I would prefer to avoid ruining the interpretive tension created by the contradictory accounts of those simple plot points. Were I to concretize any one of them, I would not only be usurping the role of a character within the novel, I would be reproducing the book’s ingenious structural conceit.
Unlike A Scanner Darkly, in which conflicting realities are focalized through the muddle of drug-induced paranoia, the narrator of Yellow Blue Tibia is fully aware that he lives in a world structured by other people’s understanding of reality. From the obsessive-compulsive taxi driver, Ivan Saltykov, who returns to the scene of the crime because he must retrace his path exactly, to the UFO enthusiasts who mistake Skvorecky’s denial of the existence of extraterrestrials for an exercise in dialectical thinking, the characters in the novel influence the narrative less through their actions than their rationale for engaging in them. Dramatic irony is both deployed and undermined, resulting in a comedy of ideological errors that ranges from the subversively slapstick (Krapp’s Last Tape as performed by an inept Moscow detective) to the deeply structural (the evisceration of Scientology’s theoretical and psychological underpinnings).
But, because I’m a blogger and bloggers are narcissists, I want to call attention to the rude portrait Adam drew of me in the novel. In a comment attached to a post from 2005 that has since been rescinded—it was a little too revealing about someone in my department and thus fell into the category of material I wrote as “A. Cephalous” that’s not suitable for publication under my own name—Adam posted a link to what he referred to as “a portrait of Mr. Non-Capo.” Five years later, he included in Yellow Blue Tibia the following:
The lift door creaked open, and a fantastically shrunken and wrinkled old woman shuffled out, carrying a string bag bulging with provisions. Her head was located in the space directly in front of her torso, as if her neck fitted into the centre of her sternum rather than between her shoulders[.]
Make of that what you will. I choose to be offended. Offended!
Friday, November 06, 2009
In that moment, I knew I’d be accused of sexual harassment again.
Two of the “acquire an alternative skill set with real world application” assignments that we teach in composition are 1) how to build and manage a wiki and 2) how to compose a PowerPoint presentation that doesn’t cause your audience to slit your throat or their wrists. I combine them into a single assignment in which students choose the text they found most compelling, develop a wiki based upon its rhetorical situation (author/auteur, historical context, themes, signature features, symbols and motifs, etc.) and then share their results with the class. On Tuesday, I stressed that their presentation must not consist of reading their wiki aloud (by virtue of emphasizing the difference in media, e.g. “How do you speak a link?”) and we discussed strategies they can employ to prevent us from mass-suiciding on Thursday.
Yesterday, midway through an already engaging presentation, one of my students paused during her discussion of the contextual allusions present in her text.
“Also,” she said as she made to forward her presentation, “I think there’s an allusion to tentacle porn.”
The class gasped.
Her mouse clicked.
I sat mute. Horrified into silence.
Time dilated as we approached the horizon of this career-ending event. I held that diphthong in “Wait!” so long it slid into a schwa.
Her mouse clicked again.
The screen brightened and . . .
Wednesday, September 23, 2009
A New Literary History of the United States in Literature
The publication of A New Literary History of the United States will likely strike a few chords familiar to the participants in the debate that followed Rohan’s latest post. Written neither in the Emory Elliott mode—a history of items both literary and American—nor that of Sacvan Bercovitch—a history of all items written by Americans that can be yours for the low, low price of $299.29—editors Greil Marcus and Werner Sollors instead decided to write a cultural history of the United States in a self-consciously literary voice. As Laura Miller at Salon writes, the two
have pitched the biggest tent conceivable, pegging each of the chronologically arranged essays in the book to “points in time and imagination where something changed: when a new idea or a new form came into being, when new questions were raised, when what before seemed impossible came to seem necessary or inevitable.” With this in mind, they’ve produced a compendium that is neither reference nor criticism, neither history nor treatise, but a genre-defying, transcendent fusion of them all. It sounds impossible, but the result seems both inevitable and necessary and profoundly welcome, too.
This is, then, an anthology seemingly written to drive J.C. Hallman to drink, because it doggedly focuses on cultural significance over the literariness of the literary. However soul-deadening he might consider its subject matter, the manner in which most of it is written would likely meet with approval. Though idea-driven, the prose in Jonathan Lethem’s entry on Thomas Edison—in which he exclusively discusses the inventor’s place in film history—still sings:
Continue reading "A New Literary History of the United States in Literature"
Monday, September 21, 2009
On occasion, it’s an honor not even to be nominated:
[Kim Stanley Robinson] believes this year’s [Man Booker] prize should go to Adam Roberts’s science fiction comedy, Yellow Blue Tibia, which didn’t even make the longlist.
This comes not two weeks after an appearance on Neil Gaiman’s bookshelf. I’d like to say I’m not jealous—but I’d be lying.
Friday, September 04, 2009
Historical novels, underrated or no, are only ever incidentally historical.
In the comments to Eric Rauchway’s post about underrated historical novels, I pointed out that there is a problem with talking about the “historical novel” as a self-evident genre. I did not, however, go into much detail as to why, because I covered the topic on my qualifying exams and the less said about that experience the better. But since Eric asked so nicely, I will oblige and show you why this discussion’s so painfully tangled.
Short version: Its knots all sport thorns.
Long pedantic version:
Continue reading "Historical novels, underrated or no, are only ever incidentally historical."
Tuesday, July 21, 2009
Thanks to Adam Roberts, even academics writing books about comics are insufferable elitists.
Oh, I see. Popular is bad. Common feeling among self-defined elites.
Oh, I see, you’re saying that anyone who likes SF is stupid.
Those words were written in response to a post that argues:
[T]he very heart’s-blood of literature is to draw people out of their comfort zone; to challenge and stimulate them, to wake and shake them; to present them with the new, and the unnerving, and the mind-blowing. And if this true of literature, it is doubly or trebly true of science fiction. For what is the point of science fiction if not to articulate the new, the wondrous, the mindblowing and the strange?
I would frame that argument differently: when I read science fiction, I want to replicate the wonder my nine-year-old self experienced when he first read Frederik Pohl’s Gateway. I had never considered the possibility that the universe might be littered with the archeological remains of civilizations snuffed out before the proto-pre-dawn of human history. The thought of it was so sublime that, a decade later, I watched five seasons of Babylon 5 trying to recapture it. Not that I’ve stopped, mind you, but when you consider the sheer volume of science fiction I’ve consumed in the twenty years since I read Gateway, I think you can see why that experience is increasingly elusive: more often than not, what I read contains ideas I’ve already encountered, so the only avenue to awe is through the quality of execution. There are exceptions—Perdido Street Station being one example, Adam’s conceptually audacious novels being another—but they merely add meat to Adam’s claim that the nominees for the 2009 Hugo Awards fail to engender what proper science fiction should; namely, Schopenhauer’s sublime:
Continue reading "Thanks to Adam Roberts, even academics writing books about comics are insufferable elitists."
Saturday, July 18, 2009
Superior adaptations of inferior novels (Harry Potter and the Half-Blood Prince)
Reviewing a film based on a book you haven’t read is always a dicey proposition—you likely missed or misread the winks and nods aimed at the readers surrounding you—but I think an exception can be made in the case of a film that works because you haven’t read the source material. So I begin with an admission: I can’t read the Harry Potter novels. I got through 100 pages of the first three and stopped once I realized that they are, on the whole, terrible; and that when they rise to the level of unsubtle Dickensian grotesquerie (minus the wit), they’re merely awful. But I mostly enjoy the films, which dispense the requisite infodumps in digestible bits and—by virtue of being films—relieve J.K. Rowling of the burden of pretending to be Mervyn Peake.
The best of them is Alfonso Cuarón’s Harry Potter and the Prisoner of Azkaban, but that has more to do with having Cuarón at the helm than with the quality of the source material, as his filmography consists of superior treatments of the same narratives at play in Rowling’s novels: a tale set in a strict boarding house during a period of great struggle (A Little Princess), about an orphan with an unknown destiny and mysterious benefactors (Great Expectations) who comes of age sexually (Y Tu Mama Tambien) as the fate of the human race is being decided (Children of Men). But even with Cuarón behind the camera, the film felt forced—as if the removal of Rowling’s expository indulgence required a labor so onerous, evidence of it clung to the film like slight stains in the pits of an otherwise smart shirt.
The same cannot be said of the latest film, Harry Potter and the Half-Blood Prince, in which the narrative excess of the novels becomes a matter of allusion over excision. The result? The first film in the series to have the effect the novels are wrongly purported to have: it presents an unnerving and captivating account of a world and moment the audience can’t fully fathom. The confusion was compelling: I was drawn into situations whose meaning escaped me, but whose significance was clear, and so I spent the entire film intellectually engaged. In the previous films, all the guns belonged to Chekhov, and you appreciated the arrangement of the firing squad or you didn’t—but in either case, you knew which guns would be fired because the overwhelmed screenwriter removed anything that might be mistaken for a decoy.
The earlier films never alluded; they either explicated at length or vehemently pointed at the mystery the movie would explain. In The Half-Blood Prince, David Yates includes scenes whose importance is not established by the mere fact of their inclusion. The narrative wanders, forcing the audience to debate which of the various elements will ultimately be meaningful. Will it be the stroking and stoking of Ron’s ego? The development of Harry and Hermione’s increasingly complex friendship? The pangs of conscience Draco Malfoy felt upon murdering a bird? Or will it be one of the many other expertly-acted, deftly-directed scenes in which, for the first time, everyone not named Alan Rickman seemed comfortable in their character’s skin? The narrative ambiguity, coupled with a pace that allowed scenes to develop such that motivations were intimated rather than immediately revealed, resulted in a film that was strikingly adult in weight and complexity, to which American critics responded by saying:
A giant two-and-a-half-hour YAAAWN.
This movie went on and on and on and on and on.
Not only did all of those sentences appear in a single review, they appeared in three consecutive sentences:
Boooooooooring. A giant two-and-a-half-hour YAAAWN. This movie went on and on and on and on and on.
Granted, that was Debbie Schlussel, who time and again has proven her intelligence to be inversely proportional to her estimation of it. But she’s not alone. Rex Reed, the very bellwether of popular American opinion on film, thinks that
the sixth and worst installment is two and a half hours of paralyzing tedium, featuring another colossal waste of British talent and a plot a real witch couldn’t find with a crystal ball. The kids at Hogwarts no longer have any relevance. They have never heard of iPods, cell phones or the Internet.
He complains about the gorgeous, subdued cinematography of Bruno Delbonnel, who previously worked with the notoriously muted Jean-Pierre Jeunet on visually uninspired films like Amélie and A Very Long Engagement, and is the second untalented cinematographer in a row David Yates has chosen to work alongside—the first being Slawomir Idziak, a Polish hack so dumb his director, Krzysztof Kieslowski, had to name his films after the color of the desired palette lest Idziak murk them up. I daresay that anyone who complains about the cinematography of The Half-Blood Prince knows nothing about the medium he or she is paid to write about in, say, the Wall Street Journal:
[The film] may suffer by comparison to visual memories of the first film, which wasn’t all that wonderful but teemed with wondrous images.
The film was so dull and so visually unstimulating that, at the New York Times, Manohla Dargis forgot how families work, writing:
The chosen one, Harry has been commissioned to destroy the too-little-seen evildoer Voldemort, a sluglike ghoul usually played by Ralph Fiennes (alas, seen only briefly this time out) and here played, in his early embodied form as Tom Riddle, by the excellent young actors Hero Fiennes Tiffin and Frank Dillane. There must be a factory where the British mint their acting royalty: Hero, who plays the dark lord as a spectrally pale, creepy child of 11, is Ralph Fiennes’s nephew[.]
Ms. Dargis, if I may, you answered your question about this hypothetical actor factory in the previous sentence. It’s called “the Fiennes family,” and its existence has been known about for the better part of two decades. It seems that when confronted with anything resembling complexity, the popular American critical establishment falls asleep—by which I mean, they reveal themselves to be whatever it is one becomes after spending a lifetime trying to catch up to the lowest common denominator.
If I were a more generous person, I’d note that the reason these critics were bored by the film was that they knew what would happen—it being the plot of the only book they read that year—and wanted the film to get on with it already. Dargis as much as admits exactly that: “[T]he lag time between the final books and the movies has drained much of the urgency from this screen adaptation, which, far more than any of the previous films, comes across as an afterthought.” This impatience with development is a symptom of a collective addiction to novelty in American culture, one which results not only in an unwillingness to glory in a studied presentation of the end of adolescence, but in the elevation of incurious, anti-intellectual populists like Sarah Palin to national prominence. Meanwhile, across the pond . . .
Wednesday, July 08, 2009
The telos of the back cover
I can imagine no more frustrating a reading experience than the one I just had with Iain M. Banks' Excession. Is it a great novel?
I don't know.
Is it a good novel?
I don't know.
Why don't I know?
Because I didn't—because I couldn't—read the novel on its own terms. I spent the entire time awaiting the arrival of a plot that never materialized. Why did I do that?
Because of the back cover:
Continue reading "The telos of the back cover"
Tuesday, June 30, 2009
Infinite Summer: Morbid? Culturally Imperial? Morbidly Culturally Imperial?
Am I alone in finding the whole idea of Infinite Summer a little morbid? The renewed interest in David Foster Wallace’s Infinite Jest is an obvious Good Thing—a first step toward popular as well as academic canonization—but having lived through the recent Michael Jackson Media Event, I can’t help but wonder whether the desire to read Wallace’s novel is akin to downloading Thriller because Some Important Someone died. Do I sound like I’m thwacking some straw man with a shovel? Because I’m not:
I have a confession to make. I don’t even like David Foster Wallace. And I don’t mean that I found Infinite Jest too lengthy on the first run-through. I mean his accessible stuff. His tales from cruise ships and lobster festivals and tennis matches and radio studios . . . So why am I here?
The short answer is that David Foster Wallace died.
That’s Ezra Klein, writing at A Supposedly Fun Blog. I’m not complaining because famous bloggers (Matthew Yglesias and Julian Sanchez among them) are horning in on my territory—although I will note that the first thing I ever published online was a mediocre seminar paper titled “Demand and the Appearance of Freedom: The Role of Corporate Media in David Foster Wallace’s Infinite Jest,” but only just to note it—nor, despite the above, am I really even complaining that Klein’s interest was piqued by Foster Wallace’s suicide, as a more charitable excerpt shows his interest to be far less morbid:
The slightly longer answer is that David Foster Wallace died and I cared. That was, to me, a surprise. Lots of people die. Just the other day, Ed McMahon died. It hardly registered. But Wallace was different. I read everything I could about his final days. I posted a memoriam on my site. I watched readings on YouTube. It affected me. I don’t know if it’s because he was a young writer who was felled by the violent bubble and froth of his own mind and that a small part of me relates to that. I don’t know if it’s because he was, in some way, unique to my generation, and as such, one of my own.
In the end, what’s interesting about the 25-year-old Klein’s post about the 46-year-old Foster Wallace’s novel is the notion that someone who was 18 years old when the Clash first performed in America and someone who was 18 years old the year Joe Strummer died can be said to belong to the same generation. How does that work? I’m tempted to blame it on the Internet:
Once you could identify someone’s taste by the cut of their concert tee—London Calling vs. Combat Rock, The Clash vs. Operation Ivy, Operation Ivy vs. Rancid, &c.—now that all these bands (mostly) belong to the past tense, they’re part of that enormous cultural pool from which more recent generations sample freely. For example, someone Klein’s age will never experience the pain of the endless, fruitless search for something like the first Clash album (which, contrary to that link, has not been in print continuously since 1979), as CDNOW was in decline during his formative years. To people for whom almost everything has always been immediately available, the idea of what constitutes a culturally-determined generation is bound to be a little fuzzy.
Note that I’m not criticizing Klein for being born in a time of cultural plenty—I would rather not have spent the better part of a decade searching for this in vain—I’m merely pointing out that his inclusion of Foster Wallace among his contemporaries dumbfounds me . . . unless I chalk it up to the novel instead of the man. Wallace might not be Klein’s contemporary, but Infinite Jest could be. Now that I’m reading it again, I’m struck by how contemporary it feels. Everything that annoyed me about it in 1996 still annoys me now—the footnotes, subsidized time, the too-frequent self-indulgent sentence—but everything that felt new in 1996 still feels new now.
Given how we imagine ourselves into an intimacy with our favorite authors, it makes sense for people twenty-five years younger than Foster Wallace to feel a generational affinity for him on the basis of his novel; but that doesn’t really work, now does it? I mean in the academic sense, the means by which we identify Author X as belonging to Period Y and analyze his or her work in light of the aesthetic of Period Y. We don’t, in other words, seriously consider historical feelings of contemporaneity the way we experience our own, inasmuch as I’m fourteen years younger than Foster Wallace but, like Klein, count him as “one of my own.”
Friday, May 29, 2009
What I can (and can’t) say about Jenny Davidson’s Breeding.
They say that when you’re writing a dissertation, every cultural artifact you consume becomes grist in its conceptual mill—and they are correct. Because when you’re writing a dissertation, everything seems relevant. So even though I’m courting cliché by saying it, I’m going to say it anyway: everything in Jenny Davidson’s Breeding seems relevant to my research. Why?
Because it is.
For those who only know me as the guy who does those posts on film and comic pedagogical strategies, behold my credentials. Why am I talking about myself instead of Jenny’s book? Because understanding my one quibble with her argument requires you understand something of mine.
The short—and I mean it—version is that non-Darwinian theories of evolutionary and social development survived in and were disseminated by works of literature irrespective of their status in the scientific community. You can see why I might consider Jenny’s book (published by Columbia University Press) a prequel to my dissertation (available for download via a database very few people can access). Given that my dissertation focuses on how these debates raged ninety-five years after Jenny’s century ended, I’m not really qualified to speak to—much less certify—the validity of her evidence. But to address John’s concerns, I can say that every time she recounts a theory or debate I’m familiar with, she does so in a way no charitable reader would find fault with.
That’s not to say that I always agree with her, if only because she often suggests points I want forcefully asserted. Her reluctance to do so may merely be rhetorical: in a book that lets primary works speak for themselves, forceful assertion might not simply seem out of place; it could cast suspicion on the curiously adamant claim. For example, she concludes her discussion of resemblance in Elizabeth Inchbald’s A Simple Story by claiming:
When the old problems [relating to the hereditary basis of resemblance] resurface late in the nineteenth century, most scientists are ignorant of the earlier theories: Thomas Huxley is an exception, as are a few others, but Darwin knew little or nothing of the seventeenth- and eighteenth-century controversies alluded to here. Literary texts, though, retained a palimpsest of these arguments, providing one means by which Darwin and others could gain access to the knowledge of earlier generations. (36)
This would be the weak version of an argument whose strong version would look something like this: given that Jenny earlier indicated that Darwin read Inchbald’s A Simple Story alongside Jane Austen’s Sense and Sensibility and Mansfield Park in 1840, why not try and verify whether Darwin recognized the palimpsest as such and took something from it? Because a quick search of his notebooks reveals there might be something to it.
In 1840, Darwin was writing “Old & Useless notes about the moral sense & some metaphysical points.” It contains notes on books like Louis-Aimé Martin’s De l’éducation des mères de famille (1837) like “I suspect conscience, an heredetary [sic] compound passion, like avarice” (601), which in essence means Darwin was investigating whether greedy children resembled their greedy mothers. It stands to reason that Austen might have something to say about that.
But I would say that. Eighty percent of my dissertation involves sussing out just those sorts of connections: the lines of argumentation—some acknowledged, most not, due to the arguer being unaware of their continued influence on his or her thought—that persist, despite scientific progress, largely on account of their presence in popular literary culture. In short, my complaint is that Jenny didn’t write the book I would have written, which as complaints go is fairly universal. But to return to John’s qualm, the fact that a claim she merely suggested seems likely to bear fruit indicates that our trust in her is not misplaced. Whether this is because the discipline has done its job—rewarded a scholar whose knowledge of the field is such that her suppositions are more likely to bear out than not—I can’t say.
Thursday, May 28, 2009
Some Methods of Breeding
This is a guest post by David Mazella, an Associate Professor in the Department of English at the University of Houston, and a co-founder and managing editor of the scholarly blog, The Long Eighteenth. He is the author of a cultural and conceptual history of cynicism, The Making of Modern Cynicism (University of Virginia Press, 2007).
I’m going to follow Jenny Davidson’s lead, and offer a “partial”* criticism of this remarkable book, which is, after all, subtitled a “partial history of the eighteenth century.” And for those puzzled by the precise meaning of “partial” on the title-page, Davidson glosses the term in her Introduction, where she justifies her own critical approach with the figure of Austen’s “partial, prejudiced, and ignorant Historian.” With this nod towards Austen’s extraordinarily concentrated narration, Davidson hints that this book, like Austen’s, will abjure the usual, chock-a-block style of academic narrative, and cultivate instead a listening-pose, in which she hopes to overhear the “echoes and responses and recapitulations [that] emerge from a congeries of voices” (12).
Consequently, this book is conceived as a “nuance exercise” (11) that is linked with the characteristic strengths of both historical scholarship in the humanities and literary studies in its Barthesian, writerly mode. In its close attention to the nuances of language, rhetoric, and historical change, this book opposes itself to the broader, more continuously narrated accounts of the nature/culture divide found in the history of science, cultural studies and critical theory. Whether she has left these rival accounts behind, however, or simply swerved around them, remains to be seen.
Nonetheless, in the spirit of Austen’s devotion to the single detail that has the potential to tell the whole story, we should think further about this “micronetwork” or sequence of terms, and consider how they might apply, ironically or not, to Davidson’s own project. “Partial” seems an apt way to describe her fondness for the literary, philosophical, and scientific writers she handles with such deftness and care. “Prejudiced” might be the term you’d want to apply to this book, if you were interested in finding more critical treatment of these writers. Ignorant? Not a chance. In a book that repeatedly revisits the role of “selection” in a variety of natural and cultural contexts, Davidson seems hyper-aware of the manifold resemblances and potential filiations of the writings she describes. So it seems best to assume that any omissions here are strategic, part of the way that she cuts rapidly from one scene to the next, as a “partial historian” who can afford to leave things unsaid.
The disciplinary priorities, then, of this “partial historian” are fairly clear, and fairly lopsided. She will subordinate the literal to the figurative, the scientific to the humanist, the argumentative to the narrative, the historical to the literary, and the whole to its parts. She argues, for example,
There’s something to be said for the worm’s-eye view, and I have more or less deliberately adopted the trope of synecdoche—taking the part for the whole, operating by means of contiguity and association—over the more accepted modes of analogy and argument, though I will pay my courtesies (to borrow an eighteenth-century image) to those interpretive modes. (12)
The elegance of this passage, its figurative brilliance and its deliberate echoes of eighteenth-century polite usage, however, put me in mind of another voice, that of the late historian E.P. Thompson, who once observed, “no one is more susceptible to the charms of the gentry’s life than the historian of the eighteenth century . . . . The historian can easily identify with his [sic] sources: he sees himself riding to hounds, or attending Quarter Sessions, or (if he is less ambitious) he sees himself as at least seated at Parson Woodforde’s groaning table” (17). Though I do not think that Breeding ever falls into this kind of morally complacent identification with its sources, this seems to me like a real danger with its self-consciously literary approach. I’m curious whether other readers (or Davidson herself) would be interested in discussing this issue of writerly identification and the ideological operations that dictate their own logic of “parts for the whole.” And isn’t one of the points of an interdisciplinary approach that it cuts across the fantasies of disciplinary self-sufficiency offered by, say, literature on its own?
The presence/absence of Thompson in this book also made me wonder about the function of Raymond Williams, and more generally of social history, in its historical frameworks, which seem largely tacit, but which pop in from time to time to do their explanatory duty (see, for example, p. 33). I’m familiar with this problem of assigning historical causes to longer-term semantic shifts, because I faced a similar one myself in my discussions of cynicism’s historical evolution, but I was also curious whether others felt that her intermittent references to, e.g., “broad social changes” (33) provided as much explanatory force as they were supposed to.
So those would be the questions I’d put to the other readers (and Davidson herself): how does this book’s literariness, its determination to resemble its literary parents, affect its view of its subject-matter, especially when it seems intent on ventriloquizing eighteenth-century voices and attitudes? And how do social history, and broader issues of collective linguistic usage, fit into an historical account that focuses primarily on individual, literary examples?
*I’m using “partial” in the OED’s sense of “favourably disposed, sympathetic.”
Tuesday, May 26, 2009
On the form of Jenny Davidson’s Breeding
Odd as it may seem, I want to kick off this book event not by discussing the book’s argument—I’ll address that on Friday—but by focusing on its form. Consider page 44:
The print’s too small to read there, but you don’t need to be able to read it to understand what’s so unusual about Davidson’s argument: the trailing paragraph from 43 consists entirely of a quotation and is followed by a block quotation. In fact, 301 of the 409 words on that page belong to Locke, which means that the “[s]tory-telling of the kind [Jenny does] in this book” (41) is largely done by other people. In allowing the subjects of her analysis to define their terms at such length, she cedes the voice of the book to her interlocutors, which makes for an odd, yet somehow familiar, reading experience.
Rarely do you finish a secondary work feeling like you read the primary sources, but that’s precisely the impression created by Breeding. It took me a long while to realize why Jenny’s long citations were both familiar and compelling, but I finally did: Breeding is less like a scholarly monograph and more like John McPhee’s Annals of the Former World. I confess that finding in an academic book qualities similar to those of one of your favorite books sounds suspiciously like discovering the germ of your dissertation in a Disney cartoon, in that you tend to see what you’re looking for when you look for it. But I really believe the analogy holds. McPhee drove back and forth across the country alongside the brightest geological minds in order to tell the story of how America came to look like America, and he let the monologues of his companions dominate his book; similarly, Jenny and her interlocutors guide us through the eighteenth century, and she allows the voices of her companions to dominate her book. In short, both provide sharp analysis under the guise of judicious narration.
Given the premium placed on demonstrations of original thought—be it at conferences or in articles and books—her decision to use her book as a vehicle to tell other people’s stories seems like an unnecessary risk, but as Breeding demonstrates, it’s one that more of us should consider taking.