‘Historically unique threat to democratic discourse’: Autocrats waging ‘digital war’

A digital ecosystem that can “defend and promote democracy” is needed to counter autocrats’ digital warfare, a top EU official said Thursday.

Russia and China* are waging a “digital war” with fake news and disinformation to undermine democracy in Europe, and the European Union must develop tools to fight back, said European Commission Vice President Vera Jourova, who leads efforts to preserve democratic principles across the bloc. The two countries have “weaponized information” and won’t back down until Europe stands up to them, she added.

“There are specific external actors, namely Russia and increasingly China, that are actively using disinformation and related interference tactics to undermine European democracy,” Jourova said. The EU is designing a strategy to respond to the threat, she told a conference of disinformation experts and policymakers in Brussels.

“The European Democracy Action Plan will be the response to numerous challenges to our democracy; but it should be broader than fighting disinformation alone,” she said, adding that the initiative would “strengthen the media sector, make platforms more accountable and protect our democratic process.”

Regulation of the activities of online platforms is needed to foster a digital ecosystem that can “defend and promote democracy,” Jourova added. But she noted there could be challenges in addressing the “difference between paid political advertising and political opinions,” reports Digital Brief, Euractiv’s weekly tech newsletter.

Deepfakes represent a historically unique threat to democratic discourse, notes Jared Schroeder, an assistant professor of journalism at SMU Dallas and the author of The Press Clause and Digital Technology’s Fourth Wave: Media Law and the Symbiotic Web. A video clip is different from a text-based message. It puts the audience in the moment: viewers see with their eyes and hear with their ears. It is the most visceral form of information, he writes for The Hill.

In the latest episode of the Power 3.0 podcast, featured guest Samuel Woolley discusses how human psychology helps drive individuals to share, consume, and believe disinformation, how these processes are already impacting politics globally, and how emerging technologies might exacerbate the challenge. Woolley is an assistant professor at UT-Austin’s Moody College of Communication and program director of disinformation research at the Center for Media Engagement at UT. Christopher Walker, NED vice president for studies and analysis, and Shanthi Kalathil, senior director of NED’s International Forum for Democratic Studies, cohost the conversation.

For more on this topic, read Samuel Woolley and co-author Katie Joseff’s working paper, “Demand for Deceit: How the Way We Think Drives Disinformation.”

Deepfakes have become a go-to tool in the growing disinformation portfolios of both nation-states and non-state actors, The Soufan Center reports:

  • With the proliferation of social media and accessible software and technologies, there are many options available to anyone seeking to manipulate videos and images.
  • Given the pressure to act and the speed of modern warfare, fake images and videos have real-world consequences and contribute to widespread disinformation.
  • A combination of technological, regulatory, intelligence, diplomatic, and civil society solutions is essential to even begin mitigating the threat and the underlying challenges deepfakes pose to all strands of society. Get the full Intel Brief here.

*The #CCP is expanding its political and economic influence around the world, often at the expense of democratic institutions. But the United States can fight back by reshaping how it uses foreign aid, argue @IRIglobal’s @patrickwquirk and @DaveShullman.
