
Reproducibility in psychology 'hinges on author role in replication'

Successful replication efforts heavily tied to whether original research team allowed a role, finds study of contentious psychology field
July 20, 2022

Fresh light has been shed on the reproducibility crisis – at least in psychology – by a study finding that successful replication efforts are heavily tied to whether the original research team was allowed a role.

The Swedish analysis, based on a review of 65 published studies, found that replications were overwhelmingly judged successful when at least one original author was part of the replication team. But without any author from the original paper, none of the replication efforts upheld the original finding.

The analysis, by a team of psychologists at the University of Gothenburg, involved published studies on the topic of social priming. Social priming is the concept of using subtle verbal cues to significantly affect human behaviour. As with many endeavours in psychology, social priming is a proposition that has been called into question.

The new analysis – now undergoing peer review through the open-access journal Meta-Psychology – raises the question of whether "social priming exists at all", the authors write.


"It should be very worrying for social priming theory that we could not find a single convincing independent replication," said the lead author, Erik Mac Giolla, a senior lecturer in psychology at Gothenburg.

The study should also serve as a broad warning for academic science, Dr Mac Giolla said, about the potential for distorting effects when original authors participate in replication efforts. Those assessing research quality in any field, he said, should recognise the value of considering only fully independent replications.


Its publication came as the US got yet another warning that it risks losing ground on basic levels of research quality and impact due to a lack of government commitment to data sharing. That came in the form of a commentary – by a nationwide team of US academic, government and corporate experts writing in the journal Science – that chronicled the long-standing failure of US federal agencies to make their troves of government data more readily accessible to scientists.

By comparison, nations in Europe and parts of Asia, along with Australia, are doing much better at establishing the legal requirements and creating the resources to make such data-sharing possible, said the group's leader, Philip Bourne, professor of biomedical engineering at the University of Virginia.

Even when considering just the data generated by government-funded research, progress has been slow, said Professor Bourne, a former chief data officer at the US National Institutes of Health.

"When I was at NIH, trying to work with NSF on anything was difficult," Professor Bourne said, referring to the National Science Foundation. "There's not the incentives to have that interrelationship."

The concept of a reproducibility crisis refers to the growing concern in recent years that many scientific studies cannot be replicated, most commonly in the fields of psychology, medicine and economics.


A leading force in studying and drawing attention to the phenomenon is the Center for Open Science, created by psychology experts from the University of Virginia who promote more standardisation and sharing in scientific research.

Its director and co-founder, Brian Nosek, a professor of psychology at Virginia, said he recognised the greater levels of governmental data-sharing in such places as the European Union and Australia.

The Center for Open Science has attracted nearly 500,000 users to its open-source data-sharing platform, which is funded largely by private philanthropic support, Professor Nosek said, with only "piecemeal support from federal funders such as NSF and NIH".


The centre has conducted its own studies of reproducibility in large part to help demonstrate the need for better systems of scientific data-sharing. One of its biggest such studies, late last year, found that top-rated cancer researchers rarely share their data, and that their published conclusions usually fail to replicate.

Authors of "social priming" studies may claim that they brought crucial insights to the successful replication efforts, Professor Nosek said. But both he and Dr Mac Giolla discounted that possibility, given the large number of failed replications among fully independent review teams.

"The problem," Dr Mac Giolla said, "is that unless the original author can specify what this crucial detail is, the finding is at risk of becoming unfalsifiable, as there are almost an infinite number of potential post-hoc explanations for a failed replication."

Of the 65 published replication studies assessed by the Gothenburg team, 16 involved the participation of an author from the study being replicated. Of those 16 attempts, 12 reported a "significant priming effect" that affirmed the original study.


"In stark contrast," the Gothenburg psychologists write, "none of the 49 replications by independent research teams produced a significant effect in the original direction."

paul.basken@timeshighereducation.com
