Big Data needs bigger oversight

The Cambridge Analytica controversy flags up the ethical perils of research with Big Data – especially when it has commercial potential, says John Holmwood
April 12, 2018

The scandal around Cambridge Analytica's use of Facebook data raises a number of ethical issues. The most important one concerns the oversight of academic research involving platforms such as Facebook.

Research with human subjects requires informed consent, and it is not clear that this was fully obtained in this case.

Within the University of Cambridge's Psychometrics Centre, a business ethics dispute arose between the developers of the myPersonality app, David Stillwell and Michal Kosinski, and Aleksandr Kogan, who sought to use it in his work for Cambridge Analytica. This dispute turns on the value of the software and data. Kosinski and Stillwell apparently wanted to plough payments from Cambridge Analytica back into research, although they also have personal IP rights in the app's commercial uses. In the end, Kogan developed a separate app, This is Your Digital Life, to generate similar data.

While Facebook has been criticised for facilitating access to data, few comments have been made about the wider ethics of academics gathering data via apps such as these, or about the worrying issues associated with the monetising of academic research.


In the UK, the research councils require all research involving human subjects to be reviewed by an ethics committee, to ensure proper safeguards are in place to prevent deception and secure the informed consent of participants. In the case of social media research, it would not be sufficient to claim that Facebook users have voluntarily made their data public – either by the act of posting it, or as a consequence of the terms and conditions of the platform.

It is not clear if the myPersonality project was submitted to an ethics committee. Consent was sought from participants interested in learning about their personality characteristics according to a common psychometric test, and participants were told that they could withdraw from the study at any time. They were not told that the data could be made available to other researchers – or be used commercially.


It appears that the true purpose of the app – and the key to the monetisation of its data – was revealed at the end of the test, when participants were asked to give permission to share their Facebook data. Permission was sought on the basis that the researchers wanted to explore questions such as "do people who have conservative political views have a particular type of personality?" Respondents were assured that their "data will not be published individually, but only as part of aggregate information".

Despite the language of consent, these methods are dubious because they do not secure fully informed consent. While covert methods can be justified ethically, that depends upon there being a higher purpose; the commercial value of the data being gathered would not constitute such a purpose.

More serious is that respondents were not told that giving their agreement at this point would also provide the researchers with access to the data of others. According to figures provided by the Psychometrics Centre's database, around 40 per cent of respondents agreed to "give access to their Facebook profile data and social network". This implies that the wider dataset includes data on individuals who gave no form of consent at all, since their data was provided by others. Yet it is these data that constituted the commercial value of the project; this is evident in the dataset being described as including 3.5 million records of "friendship triads" and other records of couples and "potential" family members identified by matching family names and home towns.


No university research ethics committee should allow this form of data harvesting. Whatever the precise details of the motivation and data gathering in the myPersonality research, the case raises the potential of Big Data to undermine academic sensibilities. Research funders across the world are increasingly urging academics to conduct research with "impact", and it is clear that many academics involved in Big Data are aware of its commercial possibilities. Universities and funders need to face up to their responsibilities and ask if they are doing enough to maintain ethical standards in the face of potential incentives to circumvent them.

John Holmwood is professor of sociology at the University of Nottingham.

POSTSCRIPT:

Print headline: Big Data needs big rules
