When I began teaching at Brunel University London five years ago I’d had little but glancing contact with the academy since I left higher education myself. My experience of reading for a politics, philosophy and economics degree at the University of Oxford in the late 1970s and early 1980s may, even for the time, have embodied anachronisms – but the theory and practice of arts and humanities pedagogy I found at Brunel in 2011 remained in essence the same.
At the core of it all, it seems to me, lies the text. At Oxford, I often studied in Duke Humfrey’s library, the dark and woody 15th-century cell deep in the dense honeycomb of the Bodleian. Here, surrounded by ancient tomes, I did my best to impress upon my memory the outline of the canon while at the same time shading in a fraction of its content.
The entire system of learning at Oxford, so far as I can recall, consisted of the combination of mnemonics, composition and argumentation. Reading lists were prodigious: often 20 or 30 items – both entire volumes and journal articles – so redundancy was a given: hours needed to be spent in the library to extract the pith from acres of paper. I took two courses (as modules were then called) every term, and the coursework requirement was an essay of 3,000 words per week for each of them; the sheer amount I had to write gave me the core facility needed for an entire adult working life as a professional writer.
The argumentation was, of course, astonishingly thorough when compared with the meagre “contact hours” most contemporary students are mandated: a full hour vis-à-vis, usually one-to-one, reading out your essay and then picking it apart. Lectures were plentiful and accessible, but I confess: with two hours of tutorials a week, and a minimum of 12 to write my papers, I needed all the remaining ones simply in order to read, if I were to be able to absorb sufficient information to substantiate the sort of large-scale theoretical paradigms I was being introduced to.
As I say, when I arrived at Brunel I found the lineaments of this system still present: reading lists and essay assignments; lectures, seminars and tutorials. I also realised immediately the deep commitment many of my colleagues had to serious, effective pedagogy, and their preparedness to do right by students facing massively increased pressures owing to the marketisation of the sector. However, what I also saw (and I believe this, in part, to be one of the many unforeseen consequences of the 2010 Browne “reforms”, which ushered in the tripling of tuition fees) was a strangely zombie-like response to the gathering impact of bi-directional digital media (BDDM) on the study of arts and the humanities. This is at once a vast subject – and just one aspect of the technological revolution we’re living through; one of such scale, rapidity and obvious transformative potential, it deserves to – and does – generate ever more baroque and reflexive forms of appraisal and criticality. That being noted, there are simple things to be said, and for me they coalesce around a single conceptual object: the skeuomorph.
A skeuomorph is a formerly utile technology repurposed as a decorative element. The smartphone in your pocket displays an array of skeuomorphs on its home screen, from the little icon of an old Bakelite telephone indicating the programme initiating voice communication, to the one of a Box Brownie celluloid camera, which allows you to replicate visual phenomena in the form of pixels and voxels. There is absolutely no necessity for these applications to be advertised in this way, save for a certain perceived “added value” (yes, even in this numinous virtual/actual context) associated with redundant ways of doing these things.
What we wish to suggest – as William Watkin limns expressively below – is that our methods of textual study are currently skeuomorphic, and represent a futile and doomed attempt to repurpose formerly functional aspects of the previous knowledge technology – mnemonics, composition, argumentation – as decorative elements of the emergent BDDM era. Not, of course, that this is being undertaken with eyes open – rather, all of us in possession of what Marshall McLuhan termed “Gutenberg minds” are, of necessity, incapable of perceiving the eventual form of a fully digitised knowledge technology. We know full well it’s upon us, though: its obvious manifestations being wholly destructive of our most precious skeuomorphs: the undergraduate essay and the academic paper.
At Brunel in 2011 I saw large signs posted all over the School of Arts and the Humanities (as then was) reading: “Plagiarism is an F for Fail”. The widespread concern that students are producing essays using a sort of bricolage – grabbing what they want online and cobbling it together – was and is countered by such poster campaigns, backed up by anti-plagiarism software. Obviously there was no real concern about plagiarism in the late 1970s – and nor was there an algorithm to detect it: the papery battlements of canonical knowledge were sufficient defence; laboriously scanning and then copying from existing writings was tantamount to writing an essay anyway.
We still call upon students to write essays, but at Brunel – and I’m sure at many other universities – coursework is written digitally, uploaded to Blackboard and marked digitally, including being checked with anti-plagiarism software if there are any suspicions. Really, this entire process is dedicated to maintaining the skeuomorph: the image of an A4 page of text on the computer screen is precisely analogous to the little icon of the Bakelite phone, or the tiny envelope that whooshes through virtual space when we send an email. It’s hard to see how we can expect students to maintain a belief in this decorative form as functional while we simultaneously go belly-up to BDDM.
In my view, the only way to maintain the previous dispensation would be to teach arts and humanities undergraduates entirely on paper, and ban them from using any electronic media whatsoever. Obviously this isn’t going to happen – what we require is a new kind of teaching and academic assessment that responds to the profound changes in students’ learning style, and their associated cognition. William Watkin’s innovations on his violence module point to a possible approach. It is one which recognises that, henceforth, text will be met by the student almost entirely in a digital form; that the very definition of what qualifies as “text” will have to be extended to encompass mixed media digitally delivered; and that we are on the threshold of a new intellectual dispensation, in which the ability of the student to demonstrate the personal possession (and capability for regurgitation) of canonical book-learning cannot be the main criterion of attainment, since all canonical knowledge is almost instantly accessible to all.
Will Self is professor of contemporary thought at Brunel University London and is the author of nine novels.
This was the first year of my new, final-year literature course at Brunel, entitled Violence. It is, I believe, a unique course in the UK, allowing English literature students the opportunity to engage with the art, theory, politics and technology of violence.
The course is an innovation in how to study literary production in that it is not determined by a particular period, “ism” or theoretical approach. It is not even dominated by the need to study works of literature, taking, as it does, textual study in its absolutely widest possible sense. Novels are text; paintings are text; philosophy is text; film is text. But YouTube is also text; Twitter is text; wolf whistles can be text. In the final week of the course, the reading set for the class said simply: “the internet”.
The idea behind the course was to let urgent issues dictate the nature of its design, rather than apply already tested frameworks of study to, say, representations of violence online. I envisage that this may be one way forward for literary studies and the humanities in general: let the topic dictate the course and then discover new methods and materials to answer its call. It is a form of curriculum design not dissimilar to the much-discussed phenomenon-based learning, which is centred on real-world phenomena rather than the abstractions studied by traditional academic subjects. Areas we covered included founding violence (or the role of dramatic acts of violence in founding and protecting states), discursive violence (how language and representation can be a form of violence), animal violence, sexual violence, sadism, cannibalism, scapegoating, punishment, surveillance, decapitation videos and, of course, zombies.
I also attempted to develop novel ways to deliver and assess the content. The lecturing team – which included my colleagues Will Self and film studies lecturer Daniele Rugo – were given three hours to present, interact, challenge and discuss whatever material they thought fit. I wrote original pieces of journalism, spoke from my blogs and used a complex set of platforms, including YouTube clips, feature films, internet image curation and Twitter, to discuss, say, the concept of founding violence or capital punishment. At one stage, I presented my zombie walk to introduce a discussion of the ontological issues surrounding the undead. This wasn’t just a joke. Performative lecturing techniques are essential if we want to keep the students in class and off their phones. A middle-aged man who should know better doing a, frankly, rather brilliant zombie shuffle which segued into an admittedly inexpert moonwalk (the link being Michael Jackson’s Thriller video) is at least one way to get the students’ attention. But don’t worry: in the same session we also studied the Italian philosopher Giorgio Agamben’s ruminations on the homo sacer (a Roman criminal whom anyone was permitted to kill without being considered a murderer). I am not a complete charlatan.
The humour that finagled its way into the course was inevitably of the gallows variety: a necessary pressure valve in what could, at times, be a tense environment. The subject matter that we considered was enough to make you lose faith in humanity if you stopped to think about it – and, of course, stopping to think about it was precisely what we were encouraging. In the first weeks, I felt we skirted around the more unpalatable nature of violent imagery, almost using theory as a prophylactic against actual experience, but that couldn’t last. By the end we were tackling the decapitation videos produced by Islamic State.
The module came with a warning as to the nature of the material studied, which all the students seemed sanguine about. But in the case of the IS videos, there were also legal issues to be considered. When they first started to appear a few years ago, Scotland Yard warned that viewing them was punishable under terrorism legislation. Since then, the heat around the issue has diminished and no one has actually been prosecuted for watching the videos, suggesting either that the Met’s interpretation of the law was wrong or it was a Gutenberg organisation’s futile attempt to impose its will on a virtual realm that had already evolved far beyond its ability to police. However, it remained unclear to me whether formally showing the videos might place me or my institution at risk. I couldn’t get a clear steer on this in the time available so I made the decision to describe the videos while showing stills from them: something that turned out to be somehow more powerful and disturbing – at least to me – than watching the videos themselves.
In discussion later, I was rather taken aback to find that many of the students had never watched the videos and had no interest in doing so. My millennials explained that their Facebook accounts were flooded on a daily basis with images of violence sponsored by terror organisations that had somehow purchased access to their data from now-defunct websites that they had used years ago. For them, extreme images of all kinds were just the seamier side to the constant internet traffic that clogged up their online lives. They chose to look away not so much out of fear or moral revulsion as out of a new kind of ennui: a premature digital decadence. After all, they graduated to Facebook at the age of 15: by 21 they are veterans who have seen it all.
Issues of legality and taste, the taboo and the explicit, seem destined to become a future component of courses that interact seriously with the digital age. What I learned in the first year of my violence course is that there is a profound divide between how these issues are presented by us Gutenbergers and how they are experienced by millennials. The right approach for now seems to be to learn from the millennials while staying true to the attractive skeuomorphs – to borrow Will Self’s phrase – of traditional teaching, such as lectures, reading lists and intellectual contexts. But how long that delicate equilibrium will last is a pressing question.
When it came to assessing the course, it seemed pointless to simply set an essay. Instead, I asked the students to create and develop their own blog. They each then had to write at least four entries on any topic pertaining to the theme of violence. I read each of these, posted my comments and eventually gave them a mark out of a hundred.
Assessment by blog had unexpected and profound results. What I discovered is that in creating their blogs, the students taught me what their work should look like. They developed their own style and method, pitched between academic scholarship and whatever you call the way “normal” people read, think and write online. They didn’t compose academic pieces and then blog them: rather, they self-consciously interacted with the potentials and limitations of the medium. In the self-reflexive piece I asked the students to write to accompany the blogs, I learned more about social media from the perspective of digital natives than I ever had in years of studying the topic. I also felt that the students had learned a lot about themselves and their relation to digitisation: a discovery that for many of them was initially troubling and, finally, liberating.
These blogs broke with all the preconceptions I had accumulated over two decades of teaching. They were published before they were assessed, yet no one plagiarised the work of their peers, or of anyone else. For what I assume to be the first time in history, not a single student missed the hand-in deadline. Everyone wrote more than they were asked to. And nearly everyone got a first.
Since submission day, three of the students have agreed to work with me to bring together the 30 or so blogs into an online magazine that they will edit. This points towards a future for literary studies, extending the degree beyond graduation and allowing the students to demonstrate one of the most important of all transferable skills: how to create, maintain and expand an online communal profile.
When English literature courses around the world have to make a case for their very existence, my course seems to have stumbled on something miraculous: a conception of digital humanities that is actually challenging and interesting to the wider student body.
If our students can respond to contemporary issues by bringing to bear their skills as the first generation of digital natives; and if, from this, they can form lasting, culturally savvy digital communities, then surely the future is in their hands – even if those hands remain permanently clasped around their phones.
William Watkin is professor of contemporary literature and philosophy at Brunel University London.
Print headline: There will be blood