Democracy’s Eleventh Hour? Safeguarding elections: 5 stages, 3 myths


Recent elections in the US, France, and Germany point to an emerging practice whereby autocracies meddle in democratic elections: hacking data, scandalizing it through leaks, and amplifying the effect by driving intense flows of disinformation and distrust across social media, according to a new analysis.

Election meddling now has a recognizable five-stage pattern, which allows for the development of algorithms that can detect signs of machined foreign operations in real time in cases of similar meddling patterns, Mika Aaltola of the Finnish Institute of International Affairs writes in Democracy’s Eleventh Hour: Safeguarding Democratic Elections against Cyber-enabled Autocratic Meddling:

  • First stage – using disinformation to amplify suspicions and divisions: Deliberate widespread foreign disinformation campaigning can be used to lay the groundwork for effective election meddling; however, more often than not, the objective is a more general weakening of trust in democracies. The objective is to abuse and heighten existing societal, economic, and political enmities, deepen polarization, and establish tactical links to useful parties and/or find colluding candidates.
  • Second stage – stealing sensitive and leakable data: If opportunities permit, and a geopolitically important election is approaching, the overall operation can adopt the more precise objective of election meddling, either to cast an election into disarray or to promote particular candidates or policies. The hacking of confidential campaign discussions can be useful for generating negative, scandalous publicity. Campaigns try to maximize their visibility, raise funds, build political networks with manifold actors, and make their message consistent yet appealing to specific constituencies.
  • Third stage – leaking the stolen data via supposed ‘hacktivists’: The emails and other documents stolen during the second stage are typically passed to supposedly independent hacktivists. These may be mere fronts set up by an illicit actor. For example, the US election leaks involved an actor named Guccifer 2.0, which was likely a front set up by Russian state actors. The use of a deceptive front confounds attribution, distorts situational awareness, and hinders counter-measures.
  • Fourth stage – whitewashing the leaked data through the professional media: The information obtained through the cyber breach is leaked to the mainstream media. In the US elections, this was done mainly through WikiLeaks. In the heated election environment, leaks are easily judged newsworthy by the professional media. The professional US and international media have generally been eager to publish such material since the worldwide attention achieved by the Manning and Snowden leaks. …
  • Fifth stage – secret colluding in order to synchronize election efforts: A candidate, party, or a background group can create links and establish coordination with a foreign state to change the election dynamics. The coordination can be willing and conspiratorial in nature. The links of collusion can be established and nurtured over many years, or they can be brief and tactical. Collusion can also be opportunistic and may even lack direct contact between the domestic and foreign entities.
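Aaltola's claim that the five-stage pattern lends itself to real-time detection can be illustrated with a toy scorer. Everything below is a hypothetical sketch: the signal labels and the scoring rule are illustrative assumptions of this summary, not the report's actual algorithm.

```python
# Illustrative five-stage meddling-pattern scorer. The stage labels below
# are hypothetical shorthand for the kinds of signals each stage produces;
# they do not come from Aaltola's report.

STAGES = [
    "disinformation_surge",    # stage 1: amplified suspicion and division
    "data_breach",             # stage 2: theft of sensitive campaign data
    "hacktivist_leak",         # stage 3: release via a 'hacktivist' front
    "media_amplification",     # stage 4: whitewashing through the press
    "synchronized_messaging",  # stage 5: coordination with domestic actors
]

def meddling_score(observed_events):
    """Return (score, matched): how far a time-ordered stream of labeled
    events progresses, in order, through the five-stage pattern.
    Unrelated events are skipped rather than resetting the match."""
    matched = []
    next_idx = 0
    for event in observed_events:
        if next_idx < len(STAGES) and event == STAGES[next_idx]:
            matched.append(event)
            next_idx += 1
    return len(matched) / len(STAGES), matched
```

A stream containing the first three signals in order would score 0.6, flagging an operation that has reached the leak stage; a real system would of course need probabilistic signal classifiers rather than exact labels.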

Although journalists and news organizations are recognized as victims of disinformation, they can also be accidental aides to it, said Lisa-Maria Neudert, a researcher with the Oxford Internet Institute’s Computational Propaganda Project, speaking at the International Newsroom Summit in London today. Neudert listed three myths that need to be understood in order for journalists to work against the misinformation ecosystem:

Myth #1: All bots are stupid – “There are bots on social media that are communicating just as well as a human can, while distributing fake news on a scale that no human can do,” Neudert said. Many bots are so well designed that social media users cannot identify them as such, she continued. “Bots are getting smarter. A lot of money and resources right now are being invested in technology interfaces – just look at Google Assistant, Siri and Amazon Alexa.”

Myth #2: In data we trust – “Yes, data forms a key part of any newsroom and helps journalists understand what is going on and which issues are important right now, but the problem is often that the data we are seeing is being gamed,” she said. “A lot of bots don’t focus on communication; they drive metrics – likes, views and shares – and manipulate your numbers. When it is on a large scale, it is very difficult to detect those bots.”
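Neudert's point that many bots exist to drive metrics rather than to converse suggests a simple detection heuristic. The sketch below is an assumption of this summary, not her method: it flags accounts whose hourly action rate and share of pure amplification (likes and shares versus original posts) both exceed illustrative thresholds.

```python
# Toy metric-gaming bot heuristic. Thresholds and field names are
# illustrative assumptions, not values from any real detection system.

def flag_metric_bots(accounts, max_actions_per_hour=60, amplify_ratio=0.9):
    """Flag accounts that act at an inhuman rate AND mostly amplify
    (like/share) rather than post original content."""
    flagged = []
    for acct in accounts:
        total = acct["likes"] + acct["shares"] + acct["posts"]
        if total == 0 or acct["active_hours"] == 0:
            continue  # no activity to judge
        rate = total / acct["active_hours"]
        ratio = (acct["likes"] + acct["shares"]) / total
        if rate > max_actions_per_hour and ratio > amplify_ratio:
            flagged.append(acct["name"])
    return flagged
```

As Neudert notes, large-scale operations are hard to catch; a single-account heuristic like this misses coordinated networks that keep each account under the thresholds.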

Myth #3: The fake news era will pass – “I think we are still in a situation where we are repeatedly fed the claim that fake news is an over-inflated problem, a trend that will simply pass. But it’s not something that is going to go away,” she said.

At the heart of Russia’s tactics is the spread of disinformation designed to sow discord in our society – the same disinformation and blackmail tactics used by the KGB and the modern Kremlin, but now on a massive scale, says analyst Daniel Hoffman. Inoculating our citizens against Russian disinformation and blackmail starts with vigilantly spotting and exposing the Kremlin’s state-sponsored espionage tactics, he writes for The Cipher Brief.

In their most recent article, Franklin D. Kramer of the Brent Scowcroft Center on International Security and Lauren M. Speranza of the Atlantic Council propose creating “a hub to monitor Russian activity, similar to NATO’s new hub for the south. This could be housed in Brussels with a joint NATO-EU unit, or perhaps in conjunction with NATO’s Joint Forces Command Brunssum. It could also be coordinated at the new Hybrid CoE in Helsinki. This hub—and existing hybrid intel units—should incorporate the ever-increasing threat analyses from private companies and non-governmental organizations such as cyber security firms and digital analysis centers.” [HT: Kremlin Watch]

The disinformation scene in the Czech Republic is relatively developed and intertwined with some of the country’s leading politicians, including President Miloš Zeman, analyst Markéta Krejčí writes for New Eastern Europe. Nevertheless, both the government and civil society have recognized the threat, and efforts have been made to address the problem.

Having lived through the collapse of two ideologies, tsarist and communist, Russia has been a post-truth society for decades, notes Anastasia Edel, the author of Russia: Putin’s Playground: Empire, Revolution, and the New Tsar (November 2017). In such a society, as long as there is an explanation, no matter how far-fetched, people will believe it, she writes for the New York Review of Books.

Maskirovka, which is Russian for “masking” or “camouflage,” is a foundational component of the Russian mindset, says Robert Dannenberg, former head of security for Goldman Sachs.

“Part of the answer [to maskirovka] needs to be awareness in the mainstream media and social media providers of disinformation efforts and cooperation in the identification and removal of such content,” he writes for The Cipher Brief. Read the full brief.
