West’s democracies ‘fighting yesterday’s war’ against Russian disinformation & malign influence


The United States has “fallen behind” in addressing the threat of foreign disinformation, but it is not too late to change course and adopt a more proactive approach, a Capitol Hill hearing was told today.

“While the democratic West is fighting yesterday’s war, our adversaries are evolving and adapting to the new playing field,” Brookings’ analyst Alina Polyakova told the House Appropriations Subcommittee on State, Foreign Operations, and Related Programs hearing on United States Efforts to Counter Russian Disinformation and Malign Influence:

  • First, innovation in artificial intelligence (AI) is enabling the creation of “deep fakes” and other “synthetic media” products. Using video and audio manipulation, malicious actors can
    manufacture the appearance of reality and make a political leader appear to make remarks that they did not. As these tools become lower-cost and more accessible, they will become perfect weapons for information warfare. Such technologies could drive the next great leap in AI-driven disinformation.
  • Second, disinformation techniques are shifting from the use of simple automated bots to more
    sophisticated interaction with (and manipulation of) domestic groups, extremist and otherwise, through various forms of impersonation and amplification of organic posts by domestic actors. It is thus already increasingly difficult to disentangle foreign-origin disinformation from domestic social media conversations. Rather than trying to break through the noise, the new strategy aims to blend in with it, obfuscating manipulative activity and blurring the line between authentic and inauthentic content.

Credit: EU vs Disinfo

To address the challenge of Russian active measures, said the Kennan Institute’s Nina Jankowicz, Congress should invest more in programs that:

  • Teach people how to navigate the modern information environment, including through digital literacy training and civics programs. These programs would not simply teach people to separate “real” and “fake” news, but assist them in sampling a range of viewpoints to inform their daily lives and the criticism that is healthy for any democracy, while developing greater immunity to conspiratorial versions of the truth. The most impactful programs, such as IREX’s Learn to Discern, are likely to be those presented outside the context of responding directly to Russian disinformation.
  • Inject more reliable information into the ecosystem. Radio Free Europe and Voice of America are invaluable resources in the Europe and Eurasia region, even in countries with a seemingly robust media environment… The United States should also invest in the sustainability of local and independent media outlets. USG-funded programs often focus on capacity-building, though a great deal of excellent journalism is already being done in the region. (Independent Russian journalists were the first to uncover the so-called St. Petersburg “troll factory,” for instance, and the Organized Crime and Corruption Reporting Project relies on networks of local investigative reporters to break large, complex stories such as the exposure of the Panama Papers.) …
  • Engage people in countries on the front lines of the information war with firsthand educational and exchange experiences in the United States. It is impossible to calculate the return on investment of programs including Fulbright, the International Visitor Leadership Program, and the Future Leaders Exchange Program. These experiences are more powerful than any fact-check or counter-disinformation program; they provide participants with a firsthand look at American governance, values, and culture.

Launched in 2017 with Congressional support, the Current Time television and digital network provides Russian speakers across Russia, Ukraine, Central Asia, the Caucasus, the Baltics, Eastern Europe, and as far away as Israel with access to accurate, topical, and trustworthy information, notes John F. Lansing, CEO of the US Agency for Global Media. It serves as a reality check on the disinformation that drives conflict in the region.

Media space funding consists of four major categories of activities: media literacy, increased access to objective information, capacity-building, and strategic communications, according to Jim Kulikowski, the State Department’s Coordinator for U.S. Assistance to Europe, Eurasia, and Central Asia (AEECA). It is implemented through USAID, DRL, embassy public affairs sections, and Assistance to Europe, Eurasia and Central Asia (ACE) through a grant to the National Endowment for Democracy, he told the hearing.


Specific counter-Kremlin efforts fall into three categories: Analyze, Build, and Communicate, said Lea Gabrielle, Special Envoy & Coordinator for the State Department’s Global Engagement Center:

  • First, analyze. We believe strongly that it is vital to understand Russian tactics and goals if we are to address them. The GEC has invested heavily in capabilities that allow us to answer three core questions: Who are the Russians targeting? How are they targeting these people? And how effective are their actions? We answer these questions by combining traditional market research approaches like focus grouping and polling with modern techniques that rely on machine learning to understand the online information environment.
  • Second, build. Once we better understand these tactics and goals we can address
    them. This often starts with building the capability of our foreign partners to
    quickly identify disinformation and respond effectively. Currently we are
    supporting both international initiatives that include foreign governments as well
    as on-the-ground civil society actors. For civil society actors, the GEC has funded an implementer to train civil society actors in 14 European nations. The training enables the civil society organizations to help their communities rapidly identify and respond to disinformation in locally relevant ways.
  • Third, communicate. Russian disinformation often takes advantage of information vacuums. Together with our partners, we must fill the information space with positive, fact-based narratives. Congress provided the GEC with an important tool to meet this need – the ability to hire private sector advertising and marketing firms. We know what story we want to tell. It is crucial to have local communications professionals help us tailor that story to local audiences. They understand the market, and they understand how to message to the market in the most appealing fashion. 

The U.S. Congress and the administration should adopt several measures to counter Russia’s political warfare, said Polyakova, co-author, with the Atlantic Council’s Daniel Fried, of a recent report, Democratic Defense Against Disinformation 2.0. The U.S. Congress should:

  • authorize and appropriate funds to “build capacity of civil society, media, and other nongovernmental organizations,” countering Russian and other sources of foreign disinformation, in coordination with the EU, NATO, and other bodies.
  • authorize and appropriate funds to establish a “fusion cell” or NCTC-style model
    for coordinating U.S. government efforts on disinformation. The Cell could be housed in DHS, State, or elsewhere. There is more than one option for structuring an interagency response.
  • authorize and appropriate funds to further develop Current Time, allowing it to broadcast and build audiences in Central and Eastern Europe and the Balkans, with potential expansion further into Western Europe.
  • develop in-house expertise on disinformation and digital media. Congress’s capacity
    for detailed analysis, independent from social media companies, will be critical.
  • prepare legislation—on a step-by-step basis—to support a regulatory framework
    for social media companies. This layered approach should start with greater Congressional scrutiny around all online advertising—an industry that is largely unregulated.
  • consider legislation to provide a framework for regulation to address transparency (especially with respect to bots), integrity and authenticity of service (i.e., targeting deceptive and impersonator accounts, whether individuals or false-front organizations), and common terms of service across the social media industry.

