May 2024   |   Volume 25 No. 2

Cover Story


Reining in Tech Platforms

Fake news and misinformation are easily published and circulated on platforms such as Google, Facebook and YouTube. There are moves to make these platforms responsible for the content they carry.

In 2022, US courts ordered American talk show host and conspiracy theorist Alex Jones to pay more than US$1 billion in damages to the parents of several of the 20 children murdered at Sandy Hook Elementary School by a gunman in 2012. Jones had been claiming online since 2014 that the deaths were a ‘hoax’ and the parents ‘crisis actors’. Until 2018, platforms such as Facebook and YouTube allowed his content to be posted and shared. They removed him that year for a range of offensive content, but the question lingered: what obligation did those platforms have to keep such fake and harmful information out of the public arena?

To Dr Marcelo Thompson, Adjunct Associate Professor in the Faculty of Law, the answer is clear – platforms must do more to moderate their content. Unfortunately, while many platforms are global and their content circulates worldwide, the laws that govern them are set locally, impeding convergence, he said. 

On the one hand is the US, where policy adopted since the mid-1990s – notably Section 230 of the Communications Decency Act 1996 – has left platforms unchecked regarding their moderation decisions. The initial intention was to encourage growth and investment in the tech industry without the worry of liability, and to support freedom of expression. But this in turn removed most incentives for platforms to take such a core aspect of their mission seriously.

“In the US there is also the belief that moral choices online should be left to individuals, and that the law shouldn’t demand state intervention on moral grounds,” he said. “However, individuals can only truly choose if platforms themselves are neutral, which is far from the case. Global convergence depends on acknowledging this reality.”

Protection on moral grounds 

In China and increasingly in Europe, protecting the public interest involves intervening on moral grounds. Europe does not explicitly acknowledge this parallel, but its Digital Services Act has introduced the idea of regulating against societal harm, such as the harm caused by misinformation.

“Europe claims that it is not enforcing morality but rather European values. But the very idea of freedom of expression has a moral dimension, as that freedom can be restricted on public interest grounds as long as the restriction is necessary and based on law,” he said. 

“Large language models are pervaded by moral forms of reasoning through their algorithms. We need to acknowledge that everyone is doing the same thing, and then the discussion becomes, how far should we go?” 

The pandemic is a case in point. China stepped in at the beginning to restrict information flows to prevent panic, while in the US there were no restrictions on misinformation circulating online, which drove anti-vaccination sentiment and the adoption of questionable treatments. 

“In order to protect individuals, you need the state to step in to decide on the nature of decisions that are made by platforms that are increasingly displacing states as a locus of power,” he said. 

Admittedly, it can be complicated for states of all kinds to regulate tech platforms when they are beholden to them. “There is a paradox that the state is increasingly dependent on the knowledge of the very actors it seeks to regulate. Unchecked platform power is a challenge for the future of the law and the future of the state,” he said. 

Inching toward the Chinese approach

A case in point is Hong Kong. Dr Thompson recently produced a report for the Chief Executive’s Policy Unit on the responsibility of technological platforms here. The report suggests that, while national security concerns are of paramount importance, they need to be accompanied by a broader range of public interest considerations. More power may mean less, as platforms could easily pull out in response to stricter government demands. However, a softer, public-interest-based approach, drawing on technological standards adopted through wide public consultation, could at the same time increase the available regulatory alternatives, garner greater public support, and ultimately further Hong Kong’s international standing, he said.

Dr Thompson also thinks the city should look at the situation in Mainland China, where people have more data protection rights in relation to social media platforms than Hong Kong residents, at least when it comes to private actors. China’s Personal Information Protection Law applies to the government, too, although in a ‘looser’ way, he said. But Europe also offers public interest exceptions in its data privacy laws. 

“China saw this issue coming and they have been regulating from the outset. Europe has this illusion of being so different from China, but it is inching increasingly closer to the Chinese approach by addressing issues of societal harm and addressing what is essentially a moral dimension,” he said. As the big players of Europe and China converge, the outlier remains the US.

There is a paradox that the state is increasingly dependent on the knowledge of the very actors it seeks to regulate. Unchecked platform power is a challenge for the future. 

Dr Marcelo Thompson
