May 2024   |   Volume 25 No. 2

Cover Story


A New Spin on Fake News


What constitutes ‘fake news’? And how can we distinguish it from the real thing? Professor Rachel Sterken in Philosophy has been considering these questions in the context of new technologies and platforms, such as social media, that pose challenges to the knowledge environment.

Which of these is ‘fake news’ – anti-vaxxers posting unfounded stories against COVID-19 vaccines on social media, or the British newspaper The Mail on Sunday publishing a fake article about a climate change study?

For Professor Rachel Sterken, Chairperson of the Department of Philosophy, who has published on the problem of fake news and leads a major project on meaning and communication in the information age, the answer lies not so much in the content of a news story as the processes it went through before and after it was published. 

“There has always been fake or deceptive news in the public sphere. But the way news is produced and consumed is changing as a result of social media and AI technology,” she said. 

“What gives a story its status as news are the standards and practices that a journalist applies in producing the story, such as verifying sources. These practices provide assurances that the information that circulates is important, relevant and true. If the procedures by which we produce, verify and distribute content are changed or eroded, then that could have serious implications for our epistemic environment.” 

Legitimate news outlets usually follow sound news production practices, which means that even when they publish incorrect or misleading stories, those stories are not fake, she said. “It’s important to recognise that good journalists can make honest mistakes. That’s not the same as fake news. The distinguishing feature of fake news is that there is no accountability and no proper procedures for verification.”

Lines blurred between news producers and consumers 

Societies have procedures for holding news organisations accountable. The Mail on Sunday faced formal complaints and issued apologies in 2017 for the inaccuracy of its report. Individuals who spread fake news, on the other hand, are not held accountable in the same way. Anti-vaccination supporters have continued to spread fake information without sanctions across social media. 

“The ease with which social media can circulate news is also a challenge for journalistic institutions because they lose control over content distribution and curation,” she said. Platforms decide how to allocate stories to users based on their algorithms. They also blur the lines between consumers and producers of news by allowing consumers to create news without verification or other processes. “There are lone actors out there that break stories, but they aren’t journalists. They don’t necessarily produce stories via journalistic practices.” 

Professor Sterken suggested better content moderation could improve the knowledge environment, but it is not a replacement for fact-checking. Moreover, social media companies do not share traditional media’s goal of contributing positively to the epistemic environment. Rather, they mainly want to keep people on their platforms. They may also have no qualms about scaling back on content moderation, as seen with Twitter (now called X). 

“If you start to conflate news organisations and social media or collapse them together, trouble arises,” she said. “When news organisations distribute their content on social media, the way content is filtered down to the public and the way the public interacts and can respond to it is structurally very different from what it used to be.” 

Public debate becomes more difficult 

This difference affects public debate. Traditionally, readers could write a letter to the editor and engage in a public dialogue. Now, there is no structure to the conversation and no way to track it because stories and topics get jumbled together. “The recommendation algorithm messes with the relevance of particular journalistic stories to particular audiences, by moving them from one group to the next based on viewership. This makes it difficult to have a cohesive, well-structured public conversation and debate surrounding issues that matter to particular groups.” 

Professor Sterken believes society needs to give deeper consideration to the design and governance of social media platforms, and our epistemic environments. 

Even more changes are happening with the advent of artificial intelligence, which sharpens questions about the quality and curation of information. AI can generate authentic-sounding speech and other content. Philosophers and others working in the area are thinking about how to distinguish authentic and synthetic speech online, whether the latter should be labelled, and whether content generated by AI and bots should be immediately removed, given that the source does not have free-speech protections.

“Things are happening at a rapid pace, making it hard for institutional structures and processes to constantly reassess and understand what is going on,” she said. “We need to educate our students and the general public so they can understand and make sense of these issues as they arise. We also need to develop trustworthy sources again, find mechanisms to give status to information in ways that people find legitimate, and establish means by which we can productively talk to each other in the public sphere.” 

If the procedures by which we produce, verify and distribute content are changed or eroded, then that could have serious implications for our epistemic environment.

Professor Rachel Sterken