The changing nature of media has always made it difficult to teach media literacy. Social media has complicated it further, with platforms appearing and disappearing frequently. But the challenges posed by disinformation necessitate a full re-evaluation of our approach.
The Mueller report has made the work of Russia's Internet Research Agency (IRA) well known. Funded by the oligarch Yevgeny Prigozhin, known as "Putin's Chef", this government-linked organisation and its successors work to influence cultural and political conversations across the West. Using fake social media accounts, Russia engages in what is, in effect, guerrilla marketing, pushing conversations to extremes with the goal of dividing countries and weakening democratic institutions.
Higher education is among its targets. We have spent the past two years building an understanding of Russian tactics and strategies, combing through millions of IRA posts. Russian disinformation is simultaneously more sophisticated and more subtle than even the most educated social media user may assume; it is abundantly clear that we are ill prepared to combat it.
Even before we fully appreciated the threats posed by coordinated disinformation campaigns, the technology and social media scholar danah boyd worried that traditional efforts to promote media literacy by encouraging students to question and doubt their sources may have backfired, undercutting their belief in expert opinion and objective reality.
Sure enough, recent survey research gives credence to these concerns, with the majority of Americans saying they have lost trust in the media. It should come as no surprise, then, that the term "fake news" is now a weapon used by ideologues on both sides of the political divide against the mainstream media. Doubting not just established facts but also the very mechanisms that produce them allows individuals to talk past one another more easily, deepening ideological division.
Critical thinking remains essential and should be taught, but it may not be the best or only tool to improve social media literacy. It is also vitally important to consider where that critical thought is being directed.
Recent research suggests that concerns about fake news on social media may be based on incorrect assumptions about its prevalence. A study in Science found that only 0.1 per cent of Twitter users were responsible for sharing 80 per cent of fake news posts during the 2016 presidential election. And a paper in Science Advances found similar results for Facebook. To the extent that fake news was a problem, it was largely confined to baby boomers: users over 65 were nearly seven times more likely than the youngest cohort of users to share fake news. Perhaps the teaching of digital literacy should be focused on retirement homes.
In practice, coordinated disinformation has little to do with fake news. Certainly, the IRA's early efforts in 2014 and early 2015 were full of attempts to scare us with accounts of events that never occurred: a chemical explosion in Louisiana, a phosphorus leak in Idaho and a salmonella outbreak in upstate New York. But these efforts were not successful. Since 2016, Russian disinformation has concentrated on spin and public relations. Russian trolls don't troll; they make new friends, turn those friends' trust into influence and feed them real news framed to push an agenda.
Exacerbating the problem, the Russian trolls employ separate accounts to give different spin to different online communities. To conservatives, the accounts depict a world run by a corrupt "deep state", in which institutions are rigged and the mainstream media cannot be trusted. To liberals, they describe a world far more racist and misogynistic than they ever feared. These users are told that anyone wearing a red #MAGA hat is to be hated and that the president is not legitimate.
Russia is not the only offender. Other countries and entities are now using the same tactics, each with differing agendas. Troll factories are known to be run out of several other countries as well. Twitter recently removed a network of fake accounts pushing for Catalan independence from Spain.
Our proposal to combat this growing threat is to teach social mediation. We can start by shifting the focus of media literacy away from the product and towards the process, promoting a better understanding of interpersonal communication.
Most Americans receive media through a unique lens that they helped to construct. It is essential that we teach students of all ages to understand their own lens and the relational dynamics of social media. They should learn to consider their own and others' biases, and the nature of their relationship to those who send them messages on social media. They should also be taught to consider message-senders' standpoints and possible reasons for sending the messages, and how all of these elements influence the way messages are interpreted.
In other words, students need to know that they should not agree with another social media user simply because that user employs the same hashtag. Nor should they accuse someone of being a Russian bot simply because they disagree with that person's viewpoint.
The problem of coordinated disinformation will get worse before it gets better. But teaching students to fully appreciate the influence of social mediation on their understanding of media will not only help them fight disinformation; it will also help them interact better with everyone else online, from their super-liberal co-worker to their ultra-conservative uncle.
Darren L. Linvill is an associate professor in the department of communication at Clemson University in South Carolina. Patrick L. Warren is an associate professor in the John E. Walker department of economics at Clemson University.