It is simply not an option for the social sciences and humanities to turn a blind eye to the reality of today’s interconnected world.
Recent technological innovations may have gone largely unnoticed in the sphere of modern languages, but they have exponentially increased the production, retrieval and consumption of written and audiovisual information in more than 500 languages, written in over a hundred scripts. Falling back on past achievements and old disciplinary certainties is not an adequate response.
With my recent piece in Times Higher Education, I hoped to kickstart a discussion on how we could do better. To my surprise, however, my suggestions met with uniformly negative comments, mostly from anonymous posters, including verbally violent ad hominem attacks.
It was therefore a partial relief to see a proper article responding to my invitation to discuss. Yet, to a degree, it retains the shape of a rebuttal-cum-protest letter. Its five co-authors stress that UK modern languages has already come to grips with the hyper-polyglot world – for example, by funding 120 projects over eight years that engage with the concept of multiculturalism. And the imperial legacy of the half-dozen European languages that such departments typically specialise in has been addressed by research on “nationalist and imperialist histories and ideologies”, shedding light on “how the legacies of European imperialism are experienced and contested”.
I do not question any of these achievements and worthy goals. But even if more critical thinking and research are now being done through the medium of these post-imperial languages, there is still no discernible effort to offer language acquisition in other relevant languages, let alone to attain a similar level of discourse in them.
I dared to suggest that Google Translate, covering more than 200 languages, or the Wikipedias available in over 300 languages could help. AI solutions also race ahead, while universities and modern languages departments either feign a lack of interest or half-heartedly play catch-up. I agree with the authors’ view that technology will not “solve all our problems”, but refusing to engage with the extant technologies is Luddite. I do not think the authors would shun technologies such as dictionaries, ballpoint pens or Microsoft Word.
“Advanced linguistic and cultural skills are essential for any deep understanding of how people see themselves and their relationship to the precarious and rapidly changing world around them,” say the authors. Again, I agree: deep understanding of a society’s culture comes only with fluency in a relevant language. But why seek that understanding only in a few European languages?
It does not make sense to research the history and culture of the Horn of Africa solely through the lens of the colonial idiom of Italian, without the use of sources in the languages of the region’s inhabitants (for instance, Afar, Amharic, Oromo, Somali or Tigrinya). Nor is it right that so much research into the Rwandan genocide relies predominantly on sources in French and English, even though both perpetrators and victims spoke, wrote and published almost exclusively in Kinyarwanda. To this day, the Holocaust is researched almost solely through the languages of perpetrators (German) and bystanders (French, Polish or Russian), to the exclusion of sources in the main language of victims (Yiddish – in whose alphabet, incidentally, numerous German-language publications were printed between the 18th and mid-20th centuries).
Obviously it is not for me to say exactly how modern languages should change. I certainly don’t have all the answers. But how about tapping our universities’ intellectual potential to come up with novel technological or organisational solutions to the polyglot problem?
It may not be economically viable to have experts on 500 different languages on a university payroll, but, after all, Swedish is studied in depth in Sweden, Czech in Czechia, Swahili in Tanzania. Why couldn’t language departments create and curate student and research exchanges with such overseas experts to fill their linguistic gaps?
This could be done, in part, to help students and scholars in other disciplines, such as history, economics or international relations, to access the languages they need for their research. Some of my critics said this would be to turn an intellectual endeavour into a mere service. But it is not such a novel idea. After all, scholars employed in IT departments both do their own research (into AI, for example) and work as technicians serving other departments. If modern languages departments took a similarly two-track approach, their language-brokering could both enrich their universities and earn them some much-needed revenue, like evening language courses do today.
Meanwhile, the academics within the department could do any kind of research for which they could get a grant, be it literature, history, philosophy, area studies or anything else. But I do think that candidates for such posts should be required to be functionally multilingual, not just bilingual.
Above all, departments need to be flexible and proactive in their language brokering, responding to both students’ and the country’s demands. Hence, some languages previously offered merely for acquisition – such as Ukrainian (plus Crimean Tatar) – could be elevated to full subjects of teaching and research, while others (such as Russian) could be downgraded if the supply of experts in that area exceeds demand.
Is this such a horrifying suggestion that I should be abused for making it? Surely not. I hope it will be taken in the spirit in which it is intended: as a prompt for a civilised debate, not character assassination.
The author is a reader in the School of History at the University of St Andrews.