Current proposals around disinformation are described in negative terms: they are all about stopping “harms” and mitigating “dangers,” notes Peter Pomerantsev, a director of the Arena Program at the London School of Economics, and the author of Nothing is True and Everything is Possible: The Surreal Heart of the New Russia.
When we frame regulation in negative terms, the result can play into the hands of authoritarian regimes such as Russia, whose leadership is only too happy to cite censorious Western laws as it censors opposition at home. Authoritarian regimes will do as they do, and often there is no stopping them, but we should not set the terms of the debate on information in a way that a priori leads us toward the vision of the internet's future that they desire.
As a new Transatlantic Working Group on Content Moderation, of which I am a member, has been discussing, regulation (whether government- or industry-led) needs to move away from a focus on content toward the broader concept of “viral deception,” he writes for The American Interest:
[UN Rapporteur on Freedom of Speech and UC Irvine professor David] Kaye’s concise, elegant and necessary book—Speech Police: The Global Struggle to Govern the Internet—shows how tech company bosses have publicly stated that they want human rights law to become the basis for how their platforms are run. I wonder what this will mean in practice. Online human rights courts that adjudicate on content and behavior in near real time? How will we adapt our thinking about freedom of speech in an environment where censorship happens not only by shutting people up but by creating so much noise that the truth is lost—and where old truisms such as “more speech is the remedy to disinformation” have turned out to be not so necessarily true after all? RTWT
Social media companies are woefully unprepared to address the problems that their products cause, writes Luminate analyst David Madden:
Local civil society organisations can play a critical role. By carefully documenting the ways in which social media companies fail to uphold their own “community standards”, local groups can force these companies to engage with the harm their platforms are doing. Luminate grantees Phandeeyar and MIDO in Myanmar demonstrated this with their #DearMark campaign. To encourage, in other countries where social media is a problem, the level of engagement that Facebook now shows in Myanmar, Luminate has been funding think tanks and civil society groups to do the kind of research and advocacy work that Phandeeyar and MIDO have been doing.
MEMO 98 analysed the Facebook accounts of parties running in the European Parliament elections in the Czech Republic, Hungary, Poland and Slovakia to evaluate the role of Facebook during the elections and the potential impact of messages disseminated through the platform on election integrity, and thus on public trust and confidence in the process.
More specifically, the Slovakia-based NGO sought to evaluate the topics, issues and narratives the parties presented on their public Facebook accounts in the run-up to the elections. Find the results here.