Lessons Learned: How West’s vulnerabilities facilitate Russian influence ops


Euromaidan Press

Despite the evidence that Russia is trying to interfere in the U.S. midterm elections, observers are concerned about the administration’s alarming silence about what Moscow’s trolls and hackers are up to, POLITICO reports.

The Black Sea region has experienced the effects of an increasingly assertive Russia, according to Propaganda Made to Measure: How Our Vulnerabilities Facilitate Russian Influence,* a new report from the German Marshall Fund. Putin has turned the Kremlin’s asymmetric toolkit, including a mix of disinformation, support for political and social groups, and malign financial influence, on the region in order to influence discourse and undermine democratic institutions. To fight back against these tactics, it is necessary to understand the underlying vulnerabilities that Russia exploits, and the best practices and lessons learned from countries on the frontline of this assault.

Disinformation is not a new problem, but technology has enabled extreme political forces, previously denied a platform by mainstream media, to act with unparalleled speed and reach, RSF President Pierre Haski said. He cited a recent Center for Strategic and International Studies report that investigated how and why France was able to resist attempts to interfere in last year’s presidential elections.

Russia has gone on the offensive to undermine the democratic institutions and societies that underpin a system it sees as incongruent with its own interests, say the Alliance for Securing Democracy’s Brittany Beaulieu and Steven Keil. Russia’s incursions in Georgia and Ukraine are one particularly violent manifestation of this reality, while its influence operations and meddling in various democratic societies across Europe and the Americas indicate just how far Moscow is going to undermine the current international system and disrupt transatlantic cohesion, they write in a new report, Russia as Spoiler: Projecting Division in Transatlantic Societies:

Given this, it is critical for both Europe and the United States to examine the various drivers of Russian foreign policy, as well as the unconventional toolkit that Russia is relying upon to understand how best to combat the Kremlin’s efforts. As such, this paper looks at Russia’s influence operations and activities in 2016 and 2017 to demonstrate just how the Putin regime and its operatives are advancing anti-democratic efforts.

Fitting with the Kremlin’s operationally opportunist approach, Sweden, Latvia, and Bosnia-Herzegovina will likely be in play during elections later this year, they add. RTWT.

“Other countries and malign actors are now adapting and improving on Russia’s methodology, notably including China which now runs disinformation campaigns and influence operations in Taiwan, Australia and other neighboring countries and is working to acquire information technology assets and data sets across Asia, Europe and the United States,” former assistant secretary of state Victoria Nuland told the Senate Intelligence Committee last week. “We know that they may very well do this again, so now we need to be planning what the retaliation will be — and we need to be signaling it,” said Nuland, a board member of the National Endowment for Democracy, the Washington-based democracy assistance group.

Disinformation is all about people – the manipulators and agitators who create it, and the citizens who consume it. Disinformation also isn’t new. It is spread using the technologies of the era, from Gutenberg in 1455 to Google today, notes Chris Doten, the Chief Innovation Officer at the National Democratic Institute. He details a few of the different problems NDI and its partners are trying to wrestle with, and some of NDI’s initial thoughts on basic categories of tools needed to detect and counter disinformation online, including:

Network Analysis – mapping communities, clones and cults. With social media, who is connecting to whom can be as important as what they’re saying. What groups of people are sharing the same or similar information? Who is liking or retweeting each other’s content? Are there groups of accounts that behave in a synchronized manner to try to harass individuals, swamp conversations, or manipulate algorithms? Understanding how accounts (which may or may not be people) coordinate is very important for trying to understand the spread of disinformation.
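One simple way this kind of coordination analysis is often approached is to compare how much content two accounts share in common: account pairs with near-identical sharing histories are candidates for synchronized behavior. The sketch below is a minimal illustration of that idea, not NDI’s actual tooling; the account names, toy data, and the 0.8 threshold are all hypothetical.

```python
from itertools import combinations

# Hypothetical toy data: which accounts shared which links.
shares = {
    "acct_a": {"url1", "url2", "url3"},
    "acct_b": {"url1", "url2", "url3"},  # identical history to acct_a
    "acct_c": {"url4"},
}

def jaccard(s1, s2):
    """Overlap between two accounts' shared-content sets (0..1)."""
    return len(s1 & s2) / len(s1 | s2)

def coordinated_pairs(shares, threshold=0.8):
    """Flag account pairs whose sharing histories overlap heavily."""
    flagged = []
    for a, b in combinations(sorted(shares), 2):
        if jaccard(shares[a], shares[b]) >= threshold:
            flagged.append((a, b))
    return flagged
```

Real analyses layer in timing (did the accounts post within seconds of each other?) and network structure, but even this crude overlap measure surfaces clone-like clusters.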

Bad Bot Antidotes – bot detection, manufactured consensus, and the mob mentality. Speaking of “people or not,” how do we know if an account is “real”? While some online accounts are obviously not human – most of us can’t tweet 100 times in a minute, for example – sometimes it’s hard to tell an automated account from a real person. The easy-to-spot fakes are rapidly being deactivated by platforms now, which leads to an arms race to make bots more and more lifelike and therefore harder to detect.

Now, not all bots are evil. They can be used for positive democratic impacts as well, particularly when people know they’re bots, but in the disinformation context, we’re usually thinking of their malign influence…
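The “100 tweets in a minute” observation above is essentially a posting-rate heuristic, and a first-pass bot check can be sketched directly from it. This is a simplified illustration under assumptions of my own (the 20-posts-per-minute limit is an invented example value, not a published detection threshold):

```python
from datetime import datetime, timedelta

def max_posts_per_minute(timestamps):
    """Largest number of posts falling within any 60-second window."""
    ts = sorted(timestamps)
    best = 0
    for i, start in enumerate(ts):
        window_end = start + timedelta(seconds=60)
        best = max(best, sum(1 for t in ts[i:] if t <= window_end))
    return best

def looks_automated(timestamps, rate_limit=20):
    """Crude heuristic: humans rarely sustain 20+ posts a minute."""
    return max_posts_per_minute(timestamps) >= rate_limit
```

As the article notes, this is exactly the kind of easy-to-spot signal modern bots evade by throttling themselves to human-like rates, which is why real detectors combine many weaker features.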

Content Analysis – the sites that cried wolf. Disinformation is spread via content – actual tweets, Facebook posts, or instant messages. However, it can be difficult, verging on impossible, to read a tweet on its own and know if it’s true or not – as much so for a human as for a machine. Over time, though, sources of content develop more or less credibility…
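The idea that sources, rather than individual posts, accumulate credibility over time can be sketched as a simple running tally of fact-check verdicts per source. This is an illustrative toy model only; the class, the neutral 0.5 prior for unseen sources, and the source names in the usage example are all assumptions of this sketch.

```python
from collections import defaultdict

class SourceCredibility:
    """Track fact-check verdicts per source; score its track record."""

    def __init__(self):
        self.records = defaultdict(lambda: {"true": 0, "false": 0})

    def record(self, source, verdict_true):
        """Log one fact-checked claim from `source`."""
        key = "true" if verdict_true else "false"
        self.records[source][key] += 1

    def score(self, source):
        """Fraction of checked claims that held up; 0.5 if unseen."""
        r = self.records[source]
        total = r["true"] + r["false"]
        return r["true"] / total if total else 0.5
```

A “site that cried wolf” would quickly sink toward a score of 0, so its future claims can be discounted even when no individual post can be verified in isolation.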

Better Algorithms – algorithmic manipulation, or how to make viral sharing great again. If a fake news tweet falls in a forest and no one is there to read it, does it make an impact? The answer is no; it’s only when information goes viral that it shapes mass opinion. Among the infinite river of content that flows by us, we only see a small percentage. Whether it’s “top news” or “trending topics,” algorithms surface the content that these automated systems believe we most want to see.
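To make concrete why these ranking systems are manipulable, consider a generic time-decayed trending formula of the kind popularized by sites like Hacker News (shares divided by a power of age). This is a generic sketch, not any platform’s actual algorithm; the `gravity` exponent and example items are hypothetical.

```python
def trending_score(shares, age_hours, gravity=1.5):
    """Time-decayed popularity: recent, heavily shared items win."""
    return shares / (age_hours + 2) ** gravity

def rank(items):
    """items: list of (title, shares, age_hours); hottest first."""
    return sorted(
        items,
        key=lambda it: trending_score(it[1], it[2]),
        reverse=True,
    )
```

Because only share counts and recency feed the score, a coordinated burst of bot shares on a fresh post can outrank genuinely popular older content – which is precisely the manipulation the network-analysis and bot-detection tools above try to catch.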

The European Values Think-Tank has published a public appeal, signed by 50 security experts from 18 European countries, presenting reasons why Nord Stream 2 will be Germany’s strategic mistake for decades to come. Those six reasons are:

  • Building up this extra pipeline will increase German political dependency on Russian energy, giving Moscow greater leverage for strategic blackmail.
  • Nord Stream 2 will bypass Germany’s Central and Eastern European allies and weaken the Alliance.
  • The Federal Republic of Germany will be de facto co-financing Russia’s war machine and Nord Stream 2 will further weaken the Alliance.
  • Nord Stream 2 will aggravate strategic corruption in Europe – as the Schroeder case shows.
  • Nord Stream 2 contradicts EU Energy Union principles and is clearly redundant.
  • Nord Stream 2 causes substantial environmental damage.

*Please join The GMF for a presentation of Propaganda Made to Measure: How Our Vulnerabilities Facilitate Russian Influence. This report, supported by GMF’s Black Sea Trust, provides insights gained from working with a number of organizations in the Black Sea region on how they are approaching asymmetric interference and what can be done to combat the issue.

Discussants: Oana Popescu, Founder, Global Focus; Brittany Beaulieu, Fellow and Program Officer, Alliance for Securing Democracy, The German Marshall Fund of the United States

Moderator: Jonathan D. Katz, Senior Fellow, The German Marshall Fund of the United States

Location: The German Marshall Fund of the United States 1700 18th Street NW, Washington, DC 20009
