Books editor’s blog: even scholarly search algorithms betray bias

Matthew Reisz looks at evidence that ‘seemingly objective search tools’ used in libraries can produce baffling and biased results
June 20, 2019

It is now widely acknowledged that Google and Facebook shape the way we see the world. Matthew Reidsma, web services librarian at Grand Valley State University in Michigan, would like us to be equally wary about the “discovery systems” used in libraries, such as Ex Libris’ Summon and Primo, OCLC’s WorldCat Discovery and EBSCO’s EDS. His new book, Masked by Trust: Bias in Library Discovery (Litwin Books), provides many sobering examples of the results produced by these “seemingly objective search tools”.

A search for “mental illness”, for example, led straight to a Wikipedia article on the controversial 1961 book by the psychiatrist Thomas Szasz, The Myth of Mental Illness, which might suggest to users that “the topic they are studying is nothing more than a myth”. Searches for “9/11” offered as their top result a book arguing that “the official story can’t possibly be true”, while another search for “September 11” made no reference to the 9/11 attacks but flagged up the date as “the first day of the Coptic calendar”! A request for information about “women in prison” produced results about “women in prison films”, an exploitation genre that sheds little light on the real experiences of female prisoners. Furthermore, the service offered as a related topic “sex in films”, a subject that shares only the single word “in” with the original search.

Reidsma cites the notorious case when Google Photos automatically labelled images of two black friends as “gorillas”, its algorithm somehow “dredg[ing] up hundreds of years of institutionalized racism”. Some of the discovery systems he examined also seemed to have prejudices built into them. One search for “immigrants are” returned three results: a book title, a search on whether “immigrants are good for the economy” and the straightforward “immigrants are bad”. The autosuggest for “Asians are” came up solely with “Asians are good at math”. Perhaps oddest of all, a question about the Catholic practice of “lay investiture” offered as related topics “Fuck” and “Gay”.

In trying to explain why our algorithms sometimes “spit out hate and bias”, Reidsma points to the lack of diversity among engineers and “a culture that glorifies efficiency above all else”. Even where companies agree to put things right, they treat any problems as “bugs” requiring technical fixes rather than as touching on deeper ethical issues. When a search for “stress in the workplace” returned a Wikipedia article on “women in the workplace”, for example, a manager at Ex Libris explained to the author that this was the result of a phrase-matching system (and that, therefore, typing “heroes in the workplace” would also have directed users to “women in the workplace”). What this failed to address was the possible impact on young women, in “a culture that devalues their contributions in the workplace both culturally and monetarily”, of their “see[ing] working women equated with stress in the workplace in a supposedly objective, neutral tool that everyone in the University tells them will give them the most objective, scholarly information”.
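To see how a crude phrase-matching system can yield results like these, consider the following sketch. It is not Ex Libris’ actual code; the match_topic function and TOPICS list are invented for illustration. It simply ranks candidate topics by how many raw tokens they share with the query:

```python
# A toy illustration of the failure mode Reidsma describes: a "related
# topics" matcher that scores candidates purely by shared tokens, with
# no stop-word filtering and no sense of what any word means.
# All names here are hypothetical, not drawn from any vendor's code.

TOPICS = [
    "women in the workplace",
    "heroes in the workplace",
    "sex in films",
    "women in prison films",
]

def match_topic(query: str, topics: list[str]) -> str:
    """Return the candidate topic sharing the most tokens with the query."""
    query_tokens = set(query.lower().split())
    # Rank by raw overlap count: "in" and "the" count exactly as much
    # as "stress" or "women", which is the heart of the problem.
    return max(topics, key=lambda t: len(query_tokens & set(t.lower().split())))

print(match_topic("stress in the workplace", TOPICS))
# -> "women in the workplace": matched on "in the workplace" alone
print(match_topic("women in prison", TOPICS))
# -> "women in prison films": a near-miss that looks sensible, yet a topic
#    sharing only the word "in" with a query can still outscore the rest
```

Because stop words count as heavily as content words in such a scheme, “stress in the workplace” lands on “women in the workplace” just as the manager described, and a topic sharing only the word “in” with a search can still surface as “related”.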
