Democratic resilience demands tackling disinformation

A major European Commission review of a Code of Practice aimed at combating the spread of disinformation online has concluded the self-regulatory instrument is failing to deliver enough transparency or accountability from the tech platforms and advertisers signed up to it, TechCrunch reports.

“The Code of Practice has shown that online platforms and the advertising sector can do a lot to counter disinformation when they are put under public scrutiny,” said Věra Jourová, the European Commission’s Vice President for Values and Transparency.

“But platforms need to be more accountable and responsible; they need to become more transparent,” she added. “The time has come to go beyond self-regulatory measures. Europe is best placed to lead the way and propose instruments for more resilient and fair democracy in an increasingly digital world.”

The fundamentals of democracy can heighten susceptibility to disinformation: a free press and a culture of open debate allow conspiracy theories to flourish and noxious ideas to commingle with virtuous ones. How, then, to respond? Democratic institutions depend on the trust of citizens who share a factual universe, Joshua Yaffa observes in The New Yorker. “Active measures erode that order, but they do so slowly, subtly, like ice melting,” Thomas Rid writes in his encyclopedic, readable history of the subject, “Active Measures.”

He explains that “what made an active measure active was . . . whether it resonated with emotions, with collectively held views in the targeted community, and whether it managed to exacerbate existing tensions.” To “activate” anything, it had to hit at preëxisting tendencies and pathologies in society: disaffection, inequality, prejudice, aggression.

Although China once shied away from the aggressive, conspiratorial type of disinformation favored by Russia, it has increasingly turned to this approach during the coronavirus pandemic. Beijing is both manipulating factual information and spreading disinformation—or willfully false information—to distract from the origins of the virus, highlight the failures of the United States, and promote China as a global leader, notes CFR Expert Joshua Kurlantzick.

But China’s stepped-up efforts around COVID-19 are angering many countries, he observes:

A recent Pew Research Center poll of Americans, for instance, found that unfavorable views of China have reached a historic high, possibly in part due to China’s COVID-19 messaging. Chinese disinformation still seems more simplistic than Russia’s. Chinese fake social media accounts spreading disinformation about COVID-19 often appear shoddier than Russian ones and thus easier to expose. Still, some of Beijing’s disinformation punches are landing. And as China and Russia increase their cooperation on information and disinformation tools—sharing knowledge through exchanges—more dangerous messaging almost surely will increase.

In the latest episode of the Power 3.0 podcast, featured guest Claire Wardle, executive director of First Draft, discusses how the rapid spread of misinformation and disinformation has disrupted the global media space, and offers suggestions for how journalists, media outlets, digital platforms, and civil society organizations can respond more effectively while preserving free expression and democratic institutions. Christopher Walker, NED vice president for studies and analysis, and Shanthi Kalathil, senior director of NED’s International Forum for Democratic Studies, co-host the conversation.

The Alliance for Securing Democracy at the German Marshall Fund of the United States has released the findings of its research report, ‘Covert Foreign Money: Financial Loopholes Exploited by Authoritarians to Fund Political Interference in Democracies,’ Kremlin Watch adds. The report flags that modern Russian and Chinese influence operations are not confined to disinformation or cyber disruption efforts.
