Total information warfare an ‘existential threat’ to Western pluralism


In a “shocking new report,” the Canadian Security Intelligence Service (CSIS) is warning that the spread of disinformation online poses an existential threat to “Western democratic pluralism,” according to The Toronto Star.

“Conventional journalism has been partially displaced by a torrent of data from an infinite number of originators. Within that torrent is a current of lies and distortions that threatens the integrity of public discourse, debate and democracy,” the report states, citing Russian disinformation, pro-Brexit hyperpartisan social media, and China’s ‘sharp power’ influence operations as features of “total” information warfare.

“Disinformation poisons public debate and is a threat to democracy. Raised public awareness is needed to distinguish the real from the false,” the report reads. “There are many ways for governments and organizations to counter the threat, but there is no guarantee that even effective counter-campaigns can defeat the high volume flow of malicious communications.”

In the aftermath of the 2016 American presidential election, Western democracies awoke to an uncontrolled and potentially dangerous world of political disinformation, targeted ads, bots and foreign interference, CBC reports.

“My hope is that over the course of the spring that we have some more in-depth conversations with social media companies. We’ll have to see where those go and how we think they’re progressing,” Karina Gould, Canada’s minister of democratic institutions, told CBC News. “Are we going to see something more robust in the next six months? If we don’t see something more robust in the next six months, then we need to take action.”

To counter this threat, our best guide isn't Cold War-style containment. Instead, we should model our efforts on our largely successful approach to combating terrorists, who also seek to weaken democracy from within, Georgetown University's Joshua Geltzer and Charles Kupchan write for The Washington Post. They contend that three main pillars of the U.S. approach to counterterrorism should be adapted to counter Russian interference more assertively:

  • First, we need to adapt our framework for tracing and blocking terrorist financing to the current threat of information warfare. Congress should quickly pass legislation that would criminalize accepting assistance from a foreign government aimed at influencing elections. As it did for financial institutions, Congress should also take steps to make social media companies and file-upload sites more accountable. …
  • Second, we need to punish and deter those who interfere with our democracy. The fight against terrorism has involved a sustained military and intelligence campaign against terrorism’s instigators and sponsors. Similarly, Russia must pay a higher price for its efforts to undermine American democracy. …
  • Third, just as public education has helped mitigate the political impact of terrorism, greater awareness that the Kremlin is deliberately seeking to pit Americans against themselves can help make the public less susceptible to manipulation. … Bolstered by broader civic education about the use and abuse of social media, this public outreach would make many Americans more skeptical consumers of the combative material bombarding them online and increase pressure on Congress and state governments to harden voting systems.

“We don’t have a list of Russian troll accounts in Europe, similar to what we have for the US,” acknowledges Ben Nimmo of the Atlantic Council’s Digital Forensic Research Lab (DFRLab), which studies online influence operations, The Economist reports:

In Germany Mr Nimmo identified a Russian botnet—in this context, a network of mutually reinforcing bots—that amplified right-wing messaging in the week before the German election in September, promoting #Wahlbetrug (“election fraud”) as a hashtag. Beforehand the botnet had spent its time promoting pornography and commercial products. It may have been a freelance rent-a-botnet also available for far-right messaging; it may have been a Russian operation. The difference can be hard to see.

The National Democratic Institute launched INFO/tegrity, an initiative to ensure that efforts to protect political information and democratic discourse from manipulation are applied across all NDI programs. Since its launch, and with the support of the National Endowment for Democracy (NED), NDI has continued to expand its in-house capacity and its external partnerships to advance the following key elements of the initiative, writes Daniel Arnaudo, NDI’s senior program manager for governance:

  • Conducting Research on Vulnerability to Disinformation Across Demographic Groups and Regional Contexts. NDI has been piloting a range of innovative opinion research methodologies and approaches to understand which segments of a country’s population are more resistant to disinformation and which are more vulnerable, including how various strategies and types of disinformation affect groups differently. The research is informing approaches to building resistance to disinformation, as well as efforts to counter it. As research is conducted in more country contexts, it will provide a more nuanced understanding of the ways in which vulnerability and resistance to disinformation vary across political contexts. This work is complemented by studies on the connections between online harassment, violence and political engagement, particularly for women and marginalized groups.
  • Monitoring Disinformation and Computational Propaganda in Elections. After writing the Brazil case study on computational propaganda for the Oxford Internet Institute, I was fortunate to begin my career at NDI by serving as a disinformation analyst in a long-term election observation effort for the Georgian local elections. Based on these experiences and the experiences of citizen monitoring organizations around the globe, NDI is developing and sharing guidance on how to integrate monitoring of disinformation and computational propaganda in elections.
  • Strengthening Political Party Commitments to Informational Integrity. Hacking of political parties can often be a component of disinformation campaigns. Working with Harvard University’s Belfer Center, NDI recently conducted a series of consultations with political party partners in a number of at-risk countries, to explore various approaches and techniques for addressing cybersecurity vulnerabilities.
  • Helping Social Media Platforms and Tech Firms “Design for Democracy.” NDI has maintained a presence in Silicon Valley since 2013, focusing on a range of issues from civic tech to open government. NDI has increasingly been asked by partners to help resolve issues relating to social media platforms in the face of political sabotage or other efforts to misuse them. NDI is currently working on a more structured mechanism for ongoing communication and dialogue between the democracy community and social media and tech firms, with the goal of helping these firms to “design for democracy.”
  • Sharing Tools to Detect, Analyze and Disrupt Disinformation. In June 2017, NDI co-hosted the Global Electoral Integrity Dialogue in Tbilisi, Georgia, which brought together leading electoral management bodies, citizen observers, and international representatives to discuss experiences and challenges in addressing disinformation in elections. In September of last year, NDI convened a discussion on disinformation in elections at the Human Dimension Implementation Meeting in Warsaw, Poland, under the auspices of the Organization for Security and Co-operation in Europe. NDI is currently working with the election management body in Mexico (INE) to share international learning on disinformation and effective approaches for countering it; stay tuned for a subsequent blog post on an upcoming event that NDI is co-sponsoring with INE in Mexico City.

In light of last Friday’s indictment of Russian nationals and Kremlin-linked companies for meddling in the U.S. presidential election, it’s a good time to revisit a prescient 2016 paper on Moscow’s propaganda approach, RAND asserts. The authors outline what makes Russia’s ‘Firehose of Falsehood’ so effective and offer options for countering it. Read more »
