Social research is being stymied by excessive ethical oversight

Projects that pose no risk to participants are being distorted or prevented by regulations designed for medical interventions, says Martyn Hammersley
April 13, 2023

Today, nearly all academic research involving human participants is subject to ethical regulation. Proposals must be approved in advance by an ethics committee, or what is referred to in the US as an Institutional Review Board (IRB). However, this has not always been the case, even in the field of medicine.

Until the postwar period, in the UK, the US and many other countries, what was ethical in medical research was decided by the doctors carrying it out. While most exercised restraint, some early investigations were judged to be unethical even by colleagues. The researchers concerned were criticised for privileging the likely value of their findings, as well as the payoff for their own careers, over the interests of the patients involved in their investigations.

It was in response to these concerns that regulation of medical research began, initially in the US in the 1950s. There, the requirement for ethical regulation of federally funded projects came to be enshrined in law. This not only required institutions receiving federal funds for medical research to operate IRBs; it also created an overarching bureaucratic structure that laid down requirements for IRBs and monitored and policed them.

Over time, the complexity of the requirements and the threat of suspension of federal funds to institutions for breaches led to IRBs and their supporting bureaucracies becoming substantial administrative departments. IRBs were charged with ensuring that research met the federal and local institutional requirements. Furthermore, the regulation came to be extended beyond research that received federal funding. And, later, it was applied to non-medical fields, including the social sciences.

As Simon Whitney, a US doctor turned ethicist, shows in his new book From Oversight to Overkill: Inside the Broken System That Blocks Medical Breakthroughs – And How We Can Fix It, the driving force behind ethical regulation of medical research has increasingly become institutions' concern with protecting themselves against funding penalties and patient litigation. In the book, published last week, Whitney argues that the situation is now one of gross over-regulation, which does not even achieve its declared goal of protecting patients involved in research.

For example, the informed consent forms that are mandated for potential recruits to a study have become so complex and detailed that many patients are unable to understand them or are unwilling to spend the time trying. Indeed, patients frequently see them as unnecessary, designed only to protect the interests of the institution. Perhaps even more significantly, this over-regulation also costs lives by delaying the introduction of new treatments by months or even years.

Whitney joins other critics of ethical regulation, such as Carl Schneider in his 2015 book The Censor's Hand: The Misregulation of Human-Subject Research. Their argument is not, of course, that regulation of medical research should be abolished, but rather that it ought to be more selective, focusing only on cases where there is a high risk of serious harm.

They also argue that it must be more flexible, attuned to the variable characteristics of particular forms of research and their distinctive institutional locales. This surely also applies to non-medical research involving human participants. Generally speaking, the risks of harm from research in fields such as psychology, the social sciences and the humanities are much less than in medical investigations. Yet, while the complexity and detail of the regulatory requirements are not usually as demanding in these fields, there has nevertheless been a creeping extension of the breadth and depth of regulation.

So the criticisms of Whitney and others apply to ethical regulation of these fields too. Timeliness can sometimes be just as important in applying the results of social research as it is in the medical arena, and ethical regulation introduces significant delays.

More worrying still is that risk-averse ethical restrictions can distort social research by ruling out particular methods or hampering their application. One small example is ethics committees' frequent demand that education researchers working in secondary schools obtain informed consent not only from all participants who might be observed or interviewed, but also from the children's parents, who must actively opt in. These requirements are not necessary in most cases to protect participants from harm, and they can stymie effective research.

What all this highlights is that we can have too much of a good thing. While ethical regulation of research in some areas is clearly necessary, elsewhere it can damage not just the research itself but also the societal benefits from it. Regulation should be applied more selectively, and proportionately according to the risk involved, so as to minimise the harm it currently inflicts.

A great deal of social research, and even some in medicine, does not require regulation, and we cannot afford the consequences of the system now in operation.

Martyn Hammersley is emeritus professor of educational and social research at The Open University.

Reader's comments (3)
Assertions but not one bit of evidence. The misconduct of medical researchers and the damage long done to "human subjects" (consider the phrase) is the reason, and the continuing reason, for ethical guidelines. Period. Past and present, as exceptionally well documented.
This debate has been a long time coming. We have seen students' master's dissertations disallowed, for example, on the grounds that people being interviewed might become upset, despite mitigations built into the research process. The same research has now, quite correctly, become a PhD in another country. A code of conduct for social science research is now required.
There is a problem with ethical approval for social and educational research, but it isn't entirely driven by medical ethics. In my experience, NHS ethics committees are quite reasonable, perhaps because they deal with such issues frequently. University ethics committees, on the other hand, tend to a more rigid application of what they think are 'rules', without considering the real likelihood of harms. However, if a consent form cannot be understood by the subjects, there is a design problem, and here ethics committees can help. There is also the problem of the use of routinely collected data, which by definition has no ethical consent, but, properly anonymised, is very powerful for resolving some educational and social issues.