Looking around Britain, you would expect social science to be booming. High crime, persistent unemployment, underachieving men and underperforming institutions are crying out for explanation and illumination.
Being one of the first societies to deindustrialise and postindustrialise makes Britain a marvellous laboratory. But somehow demand is not being met by supply, and there is little sense of a creative ferment in Britain's universities.
This paradox came to mind reading the Economic and Social Research Council's latest annual report, and the results of an extensive consultation exercise it has been doing. The reports make fascinating reading, since they offer snapshots of a public organisation in transition, as it tries to reach out to users, to make research relevant.
But behind the reports lurks a question that is never quite openly addressed: the question of whether the disciplines themselves have become part of the problem. While in other respects universities have opened up, the social science disciplines have, if anything, closed in. There are now, for example, far stricter qualifications for getting a job than there were 20 years ago and far stricter requirements that applicants for grants fit into the professorial hierarchies.
But at the heart of their control is the continuing power of peer appraisal. Despite a decade of attempts by the ESRC to innovate new models, research funding is still organised around the refereeing of proposals and decision-making by peers, just as within universities promotion depends more than ever on refereed articles.
The sense that disciplines have become closed systems, immune to failure, is most apparent in economics. Whereas in the commercial world economists have lost ground because of their failure to answer real-world problems, in the academic world huge grants continue to be allocated as if the discipline were still thriving.
The predictable result is a bias against the unorthodox, the critical and the new. To get on, you have to conform to the dominant norms: to cite the leading figures in the field, and to demonstrate how much use you are making of their wisdom. And to succeed in a career it is far more important to have published in the appropriate places than to have answered a question.
Given that most people agree that cartels are a bad thing, it is surprising that, far from being in decline, this model is getting stronger. While the ESRC has tried to encourage greater openness, HEFCE, which is meant to represent the public interest in funding universities, now looks suspiciously like another case of producer capture. Research assessment is organised within disciplines, and overseen by disciplinary boards in ways designed to keep outsiders at bay.
There is, of course, a need for quality, and for rigour. But if you allow hyperspecialisation and closed disciplines you end up with a parody of rigour and not the real thing.
Contemporary social science research reports are full of it: hugely complex algebraic formulae that are never tested on the real world, sophisticated multivariate analysis applied to wholly inappropriate datasets, and obsessive development of arguments within a single discipline when it is patently obvious that the problem cannot be properly framed without reference to other disciplines. In short, the bias against relevance is alive and well. Perhaps it is time all of us who depend on good quality social science research became a bit less patient and a bit less understanding.
Geoff Mulgan is director of Demos, the independent think tank.