Headlines such as “Call for ‘professional universities’ to overcome skills mismatch” and “Industry-focused universities tackle Chinese skills gaps” are increasingly common as governments around the world focus on higher education as the foundation of successful knowledge economies.
The preoccupation with skills is a more recent variant of human capital theory. Developed in the 1960s, this theory argues that increased education makes workers more productive. This generates more income for everyone, from individuals to organisations, sectors and entire economies. Hence, there is a case for individuals, organisations and governments to invest in it.
More specifically, it argues, there is a case for governments to increase investment in education related to work. But when the financial rewards from post-secondary education plateaued around 2000, theorists began to argue that productivity was increased not, after all, by increasing education of any type but by developing the workers’ skills demanded by changes in technology.
So now human capital theory has become prescriptive: post-secondary education should be increasingly concentrated on and then restricted to programmes thought to have most economic benefit. Employer surveys of variable quality proliferate, claiming shortages or mismatches of specified skills and urging universities and colleges to meet the reported demand for these skills.
But there is, at best, weak and contradictory evidence for a connection between technological change, demand for workers with higher or specific skills, educational institutions’ roles in developing those skills, increased productivity, and increased incomes.
So dominant has the language of skills become that universities are said to develop skills rather than graduates with skills, and employers are said to recruit skills rather than skilled workers. This disembodies skills from their exercise by people, and extracts skills from the social relations in which they are exercised. And such conceptual lapses are not merely academic.
This language fragments education. Rather than developing graduates with the knowledge, skills and attributes to be expert in their field, universities are “tasked” with developing critical-thinking skills, problem-solving skills, communication skills or whatever skills employers seek, as if they were ordering from an à la carte menu.
This language also fragments work. Traditionally, employers have employed workers in full-time jobs with fixed hours for whole careers. They have paid overtime when work encroaches on employees’ leisure time, sick leave when illness keeps employees from work, family leave in recognition of employees’ lives as people and not just workers, and pensions to support former employees in retirement.
But thinking in terms merely of abstract, disembodied skills encourages employers to contract people to complete specific tasks and then to cast them aside, accepting no obligation to provide continuity of employment nor to pay any of the other entitlements that recognise workers as skilled employees.
This atomisation of both education and work is greatly facilitated by the rise of micro-credentials, which are gig credentials for the gig economy.
The reduction of workers to skills returns us to the period before the First World War, when workers were reduced to “hands”, as in “field hands”, “factory hands” and “kitchen hands”. Jeremy Bentham explained in Pauper Management Improved, published in 1797, that he referred to people as “hands” to indicate their employment relation, in contrast to people who gained their livelihood from their social relations or their relation to land. Hired hands were understood to shirk work and were treated as commodities, “pressured by economic insecurity” and “administrative control devices” such as time clocks, as another writer put it.
Many workers currently face similar precarious employment, and new technologies have introduced much more pervasive surveillance. Bentham’s categories that have resonance include “out-of-place hands”, who have been recently dismissed, and “superseded hands”, who have been “rendered superfluous by the introduction of machinery”.
Even more corrosively, the skills fetish shifts the responsibility for employment from government and major economic actors to individuals. If people don’t have good jobs, it is because they have not exerted themselves to obtain the requisite skills. To compensate for the lack of good jobs, universities are pressed to develop graduates’ entrepreneurial “skills”.
Letting go of human capital theory and its modern variant, skills-biased technological change, will require changes from students, employers and institutions. Students need to adjust their expectation that completing the right qualification or learning the right skills will lead them to the right job. Rather, getting a good job depends on a broader range of economic and social factors, including the ways employers choose to structure work and the ways governments choose to regulate employment, industries and the economy.
Employers need to adjust their expectation that universities and colleges can and should produce graduates to meet their immediate and specific needs. Rather, they should restore their investment in their own employees’ induction and training, which they have cut by around 40 per cent over the past two decades in the UK, the US, Canada and Australia.
And universities and colleges need to stop justifying their size and growth on solely economic grounds. Post-secondary education has broader benefits and broader grounds for support. These may be less compelling to neoliberal governments than skills incubation. But they are no less important for it.
Gavin Moodie is adjunct professor in the department of leadership, higher and adult education at the University of Toronto and adjunct professor in education at RMIT University, Australia. This argument is elaborated in an open access article he has co-authored with Leesa Wheelahan and James Doughney, published in the British Journal of Sociology of Education.