.. and is it a good witch? Or a bad witch?
I spend a lot of time talking to people – convincing people – wooing people to consider digital modes and methods when it comes to research and teaching. I’m happy doing this, not only because it’s my job, but because (and excuse the goofy foot here) these things are fun for me, and I want to share that fun. Not the most serious, scholarly articulation, but those of you who know me know that I am a nerdy, geeky goofball.
Invariably in these discussions I find myself saying something to the effect that, “I know there’s a lot of pressure to engage digitally now, but you should feel empowered to push back against that expectation when and if it doesn’t do what you want it to.” And I’ve believed myself when I said it. I don’t mean this in the sense that some folks will just never take the blue pill … or is it the red pill? I forget. I mean that digital engagement doesn’t always give you what you need. FYI: I’m talking about in-depth engagement – not Twitter use in the classroom. There really are times when good old-fashioned paper and pencil give you what you need. When I taught the Digital Pedagogy workshop at Lewis & Clark in August, I convinced a History professor (and myself) that despite his best attempts to convert an assignment from hand-drawing on a paper map to working with Google Maps, the best way for his students to think about cartography and the way in which we consider historical events in relation to topographical detail WAS to have them draw on a map. I think I surprised him a bit when I didn’t jump up and down and shout “GIS” at him.
And therein lies my struggle. What am I actually positing when I talk to people and present and urge and cajole that The Digital will give them some way into their work that they might not have considered before? Despite exceptions such as the one above, I’m actually telling them that while it’s ok for them to feel they can push back, it’s really a shame if they do, because digital modes and methods are sooooo much better.
I suspect many of us who have been working in DH for a while do this. We spent so long on the outside, marginalized, trying to convince anyone who would listen that what we do matters and that it is meaningful scholarship; now that people are paying attention (don’t look now, but they’re looking at us!), we can’t quite shake that need to justify, to foreground, to compare, to privilege. We still dread the comment, ‘Oh, what a pretty picture – but what does that have to do with [..]?’ We can whoop it up and say, ‘Of course we can prove that [..] All we need to do is throw in the entire corpus of everything ever written by [..] and we get: Ta-da!’ But why do we need to? And what does it get us when we say such a thing?
For my job talk at Bucknell I crafted a definition for digital scholarship: “humanities, social and natural sciences research that is enhanced, extended, or reconsidered through application of technology. This application can take the form of accessing or building tools and platforms or connecting with collaborators through online networks.” I also said this: “I do not believe that the Digital Humanities can “save” the humanities. I don’t think it needs saving. If anything, I see DH as a means by which the humanities can reinforce their vital importance to society.” And I explained why I had chosen three demonstrative digital humanities projects upon which to base my talk: each is “founded in traditional modes of scholarship and [has] embraced digital and online technologies to “better” examine and disseminate information; each emphasizes cross- or trans-disciplinary collaboration; and each already incorporates opportunities for in-class application as teaching tools and as means by which undergraduates can participate in real, meaningful research alongside their professors. I use the word “better” cautiously and purposely – I use it to indicate approaches that extend the effectiveness of research. I do not believe that digital modes of research fix something that is broken, or create something from nothing. The best, most effective research projects in the Digital Humanities are those that have been developed from solid traditional research questions by scholars who want to push examination of those questions with fresh eyes and through new perspectives.” Someone must have liked what I said. I got the job.
I was being hypocritical. While I employed rhetorical devices designed to reassure, soothe, and appease my audience, I was actually pushing forward my agenda that, in fact, The Digital DOES make things better and stronger and smarter. I was establishing a position whereby the digital (and therefore me and my work and that of my DH colleagues) is exceptional.
So what do I mean by exceptionalism? I first experimented with the term when I was teaching a literature survey course on Youth & Adolescence at Waterloo. In lecturing to students on the characters we were reading about (Evey, Homer, Hamlet, Huck, Lyra, Marjane)1, I suggested that they consider how those characters might be perceived on a spectrum that involved considerations of agency, maturity, self-confidence, leadership status, journey, etc., and that those characters should be compared with others because of their exemplary nature along that spectrum. (I also tried to explain Freytag’s analysis of dramatic structure in terms of a roller coaster ride – oh, to be a grad student instructor again …) I thought the introduction of exceptionalism would be a throw-away lecture point (probably because I was trying to move on to my roller coaster analogy), but the concept really stuck with my students, and it kept resonating and resonating in class discussion and essays.
The terms I considered – agency, maturity, self-confidence, leadership status, journey – could in some way or another be applied to our experience as digital humanists. We really are at a point, finally, where we have the agency to take on leadership status in departments and on campuses.
But the kernel of my adoption of the term exceptionalism in the classroom was, if I’m honest, an exercise in stretching the definition as it was then being used by NeoCons to push forward an agenda of American Exceptionalism to justify specific foreign policy objectives (this would have been around 2006, so you can see the context). I’ve gone back and looked up the term “exceptionalism” in the OED, and the ONLY definition offered is this: “The theory that the peaceful capitalism of the United States constitutes an exception to the general economic laws governing national historical development, and esp. to the Marxist law of the inevitability of violent class warfare; more generally, the belief that something is exceptional in relation to others of the same kind; loosely, exceptional quality or character.”2 I cringe.
Nota bene: I Googled “digital exceptionalism” at approximately 11:20 a.m. 2013-10-06. 243 results presented “digital exceptionalism” but most propose a digital or internet exceptionalism in terms of reading practices, media theory, or game studies.3
So what does it mean if I embrace a position of Digital Exceptionalism? How am I perceived by humanities faculty (and more broadly by social scientists and natural scientists at Bucknell) when I present what I see as opportunities for engagement? I think I’m a good witch.4 I’m a nice person. I’m engaging. Friendly. Enthusiastic. I defuse tense situations with self-deprecating humor. But do they see me as a threat of some kind? Do I, for all my talk about opt-in engagement, represent something insidious? That, in fact, if they do not adopt digital modes and methods in their research and teaching they are weakened? That was not my intention, but I can certainly see that concern. What does academic etiquette dictate? Should I apologize for something about which I am not apologetic?
So if, as I reflect, I do present as a digital exceptionalist, what should I do about it? As I write this I’m thinking in some compartmentalized bit of my brain about a presentation I’ll be making to the English department on Tuesday, demonstrating some tools and projects that they might find intriguing and worth further discussion. How can I now not be hyper-aware of this position and what it will mean for that presentation? I think it is best to be honest and transparent. I am there to woo. Bucknell is incorporating “digital scholarship” as intrinsic to its capital campaign initiatives.5 It’s in the English department’s best interest to be on that train. But can I – now that I’ve invoked the NeoCon – disabuse them/you of its a priori encumbrance?
Ok. I need to go do some text-encoding and map massaging. I’ll be wondering if this is all recursive consideration. Even if the term has not been used before, who among us has been contemplating and addressing and hand-wringing over such matters? Please point me at them, so that we can discuss this further, or they can disabuse me of my assumptions.
I’ll let you know how Tuesday’s presentation goes.
1. For those of you playing the home game: V for Vendetta, The Cider House Rules, The Tragedy of Hamlet, Huckleberry Finn, The Golden Compass, Persepolis ↩
2. “exceptionalism, n.”. OED Online. September 2013. Oxford University Press. 6 October 2013 <http://www.oed.com/view/Entry/242836?redirectedFrom=exceptionalism&>. ↩
3. See, for example, Tim Wu, “Is Internet Exceptionalism Dead”; Richard Huskey’s blog post “Link List – Games, Cognition, & Virtuality” (which, in turn, offers an embedded video of James Bridle’s intriguing 2011 Lift conference talk “We Fell in Love in Coded Space”); and a curious stub page on Caslon Analytics’ cyberspace myths page. Others appear to lead to pages on American Exceptionalism that include the word “digital”. ↩
4. I played Glinda in a community production of “The Wizard of Oz” in 1975, so I can identify as such. ↩
5. Placed in quotation marks because it is developing a more specific definition here as being rooted in faculty-student interaction. ↩