
New book aims to empower readers to 'call bullshit' on data

Authors Carl Bergstrom and Jevin West put the case for scepticism at a time when science has become deeply polarising
July 27, 2020
[Image: casual young man with his head in a cloud of flowing code. Source: Getty]

There are many familiar kinds of bullshitter: the populist politicians; the literary scholars who use phrases such as "the hermeneutics of the transfinite"; the "shamanic energy medicine practitioners" who contribute to Gwyneth Paltrow's Goop website.

In their new book, Calling Bullshit: The Art of Scepticism in a Data-Driven World, Carl Bergstrom and Jevin West take aim at them all. Yet, perhaps surprisingly, their focus is on academic papers in the sciences and social sciences, and they start by pointing to the failure of "higher education in STEM disciplines" to adequately teach critical thinking alongside essential technical skills.

The authors are based at the University of Washington, where Professor Bergstrom holds a post in the department of biology and Dr West is an associate professor in the Information School. Since 2017, they have been teaching a course on the same theme, which, according to the former, filled up to the maximum class size of 180 "in under a minute at midnight" when it was initially announced. The lectures have also proved hugely popular online.

Although totally committed to the value of science, Professor Bergstrom is very concerned about what he calls "new-school bullshit", where "people bring a slew of numbers and statistics and algorithms to the table. In our highly data-driven world, that's what constitutes authority right now, and people feel unable to push back."


The novel coronavirus has made everything even more contentious. Professor Bergstrom was struck by how "all the basic facts have been highly politicised", even "whether the virus exists, whether it goes away in the summer, whether masks work, whether hydroxychloroquine works [as a treatment]".

Scientists were understandably committed to "radical transparency" at a time of crisis, yet papers posted on open servers were immediately "cherry-picked by people with an agenda as a cudgel to beat the other side with".


It was crucial, therefore, to learn to question all claims, even those that chime with our political preconceptions. Although Professor Bergstrom believed that President Trump's White House had "taken on the mantle of the chief misinformation purveyor", the authors found it hard to credit a Trump-bashing 2017 tweet from NBC News that international student applications were down "nearly 40 per cent".

On investigation, the tweet led to a news report that made the very different claim that applications were down at "40 per cent of schools", and that in turn led to a report that noted there had been a parallel increase in applications at 35 per cent of universities. So there was not really a story at all, and certainly no evidence of a significant "Trump effect".

Students of every disciplinary background are well capable of developing a sceptical approach to what they read. Most of the bullshit in papers, said Professor Bergstrom, comes not from complex statistical error or sleight of hand but from "picking a non-representative sample or assuming causality when you only have evidence for correlation".

Systems of machine learning, added Dr West, mean that "data-dredging is super easy". If one examines enough variables, one can always find chance and transitory correlations, such as the one somebody spotted between the number of "sociology doctorates awarded" and "deaths caused by anticoagulants".
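To see why data-dredging is so easy, here is a minimal sketch in Python. It uses purely hypothetical random series, not the sociology-doctorate or anticoagulant data mentioned above: generate enough unrelated variables and some pair will correlate strongly by chance alone.

import numpy as np

# Data-dredging sketch: every series below is pure random noise, yet
# searching across enough pairs still turns up a "striking" correlation.
rng = np.random.default_rng(seed=0)
n_years, n_series = 12, 200              # 12 annual observations, 200 unrelated variables
data = rng.normal(size=(n_series, n_years))

strongest = 0.0
for i in range(n_series):
    for j in range(i + 1, n_series):
        r = np.corrcoef(data[i], data[j])[0, 1]
        strongest = max(strongest, abs(r))

print(f"Strongest correlation among unrelated series: r = {strongest:.2f}")
# With roughly 20,000 pairs of short series, this typically reports r close
# to 0.9: an impressive-looking but entirely spurious relationship.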


Many of the problems can be found through a kind of applied common sense (even if undergraduate science courses often fail to teach this).

The key, explained Professor Bergstrom, is "not to be afraid of all the tech talk and mathy stuff. If someone starts talking about how they did a multivariable logistic regression or used an MCMC [Markov chain Monte Carlo] algorithm, just ignore that. Put it in a black box: you don't have to think about it; don't let them intimidate or bully you with that."

A striking example from Calling Bullshit concerns a paper claiming that "facial structure reveals criminal tendencies". Most students would be quite incapable of challenging the statistical analysis. But they can easily come to see the sampling problems of a study that relies on comparing "professional headshots" and "government ID photographs" of convicted criminals, suggesting that the researchers' algorithm was not "a criminality detector" but simply "a smile detector".

The authors' approach to statistics allows them to demonstrate why assessing universities on average class size, rather than average experienced class size, can be very misleading. To take an extreme example, if one class comprises 210 students while nine classes have only 10, the mean class size is 30 but the overwhelming majority of students will be facing all the challenges of being in a much larger class.
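The arithmetic behind that example is easy to verify; a short sketch using the figures quoted above:

# Class-size example from the article: one class of 210 students, nine classes of 10.
sizes = [210] + [10] * 9

# Average class size, as an institution might report it:
mean_class_size = sum(sizes) / len(sizes)                   # (210 + 9*10) / 10 = 30

# Average class size as experienced by students: each of the 300 students
# reports the size of the class they actually sit in, so the 210-person
# class is counted 210 times.
total_students = sum(sizes)                                  # 300
experienced = sum(s * s for s in sizes) / total_students     # (210*210 + 9*10*10) / 300 = 150

print(mean_class_size, experienced)                          # 30.0 vs 150.0

Seventy per cent of the students (210 of 300) sit in the 210-person class, even though the reported average is only 30.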


Yet their fundamental goal, as Professor Bergstrom put it, is far more serious, namely to "make people feel empowered to question numerical claims, in the same way they would feel empowered to question a bunch of weasel words from a politician or corporate spokesperson".

Carl Bergstrom and Jevin West's Calling Bullshit: The Art of Scepticism in a Data-Driven World has just been published by Allen Lane.


matthew.reisz@timeshighereducation.com

POSTSCRIPT:

Print headline: Think critically and don't be afraid to call data 'bullshit'

Reader's comments (3)
Science can be subjective as well: since the gatekeepers of research publication are people, and people are flawed, it follows that scientific research can also be flawed. Editors and peer reviewers can be influenced by the prevailing scientific hegemony (i.e., the most widely accepted scientific explanation) and reject contrary evidence. Science is becoming more about 'toeing the line' than an exercise of inquiry and investigative exploration. This is partly enforced by research funding, whereby funding success is often linked to reaffirming the prevailing scientific hegemony rather than spearheading something radical and original. Hence the replication crisis across much scientific research.
Remember Semmelweis...
The medical profession would do well to read this - far too often they present 'research' which is only statistics and doesn't demonstrate any causal relationship at all. It's not just medics, of course, but it's particularly noticeable. The core message is to avoid using statistics like a drunk uses a lamp-post: more for support than illumination.