Countering disinformation: three levels of action


Facing one of the clearest domestic threats to the U.S. in a decade, neither the F.B.I., which has the responsibility for conducting counterintelligence inside the United States, nor the O.D.N.I. warned Americans that platoons of Russian-backed automated “bots” and human trolls were working online to amplify racial divisions and anti-government conspiracy theories, Dana Priest writes for The New Yorker:

The F.B.I. deputy director, Andrew McCabe, admitted in a CNBC interview on October 4th that the U.S. intelligence community “should have predicted” the attacks “with more clarity, maybe, than we did.” “When you overlay these attacks onto what we’ve known on our counterintelligence side about the Russians for many years, it completely fits into their playbook,” he went on. “This ability to insert themselves into our system, to sow discord and social and political unrest, is right up their alley, and it’s something we probably should have seen.” In a recent interview, a senior intelligence official, who was given permission to speak with me, agreed. “He’s spot-on,” the official, who asked not to be named, said of McCabe.

Among the first to document Russia’s online disinformation tactics was Olga Yurkova, a thirty-two-year-old journalist who recently graduated from the National University of Kyiv-Mohyla Academy School of Journalism, Priest adds:

On March 1, 2014, Yurkova watched on television and online as armed men in unmarked uniforms occupied Crimea. Russian media named them “polite people.” Yurkova and her university colleagues, steeped in previous Russian disinformation operations in the Baltics and elsewhere, knew better.

“Their lies were so blatant that all Ukrainian journalists were speechless with shock,” Yurkova told me from Kiev. “As responsible journalists, we had to do something with this.” The following day, Yurkova created a Web site called StopFake.org, which is dedicated to debunking fake news and identifying Russian disinformation. … “We have been working for three years to inform very diverse people about why they should consider this problem, how they can reduce the impact of propaganda, and what are the possible ways of countering propaganda as a phenomenon,” Yurkova said.

Looking forward and building on past improvements, there are three levels of action that can improve the cybersecurity of the nation’s democratic processes, according to Vikram J. Singh, senior advisor for national security, democracy, and technology at the Center for American Progress, and Jonathan Reiber, a visiting scholar at the University of California Berkeley’s Center for Long-Term Cybersecurity.

Although strong White House leadership is needed in some cases, it is also unlikely to materialize. Private organizations, federal, state, and local governments, and individuals can nonetheless make meaningful progress. It’s time to start thinking of cybersecurity not only as a problem for the tech community, but as a required practice of businesses and civil society and a duty of citizenship for every American, as normal as wearing a seatbelt or taking part in the neighborhood watch, they write for Foreign Policy:

  • First, companies and non-profits as well as state and local governments can organize and invest for cybersecurity. Companies and organizations need to invest in the people, processes, and technology necessary for effective cybersecurity, from firewalls to two-factor authentication to constant red-teaming of exploitable business practices, like Facebook’s previous ad-selling method that enabled Russian propaganda. … States have the constitutional right to set the “times, places, and manners” of elections and have largely managed electoral processes on their own. Yet things are not working as they should. According to a Harvard study by Pippa Norris and others, in recent years the United States electoral system has displayed the “worst performance among all Western democracies.” … For its part, Congress should still establish an independent commission on election interference to understand threats and develop recommendations. Legislation could also offer incentives to spur cybersecurity spending through tax credits or reductions, potentially winning administration support. One of the more promising civic initiatives, Harvard’s Defending Digital Democracy initiative, brings together a bipartisan group of national security and political leaders with technology company experts to identify solutions. It could hold events across the country to help secure democratic institutions. Other similar programs could do the same.
  • The second layer of action is about preparing to counter cyberthreats from abroad. When a foreign government attacks the United States in cyberspace, the intelligence community and the U.S. military have the lead in identifying the adversary’s identity and infrastructure, preparing to counter incoming cyberattacks, and responding in kind, if directed. For this mission to succeed, it is vital that companies, state and local governments, and individuals share information as quickly as they can with federal partners. The information-sharing process needs to improve for all parties: during the Russian attack, the FBI warned the Democratic National Committee, but neither organization behaved effectively; the DNC was unsure if the warning was real and failed to act on it, and the FBI failed to ensure that its warning reached the right people. The U.S. government is focused on Russia, North Korea, Iran, China, and other cyber aggressors. It can help victims not only by rapidly sharing threat information but also by ensuring warnings are acted on and, where needed, organizing response operations to stop an attack, control escalation, and reestablish deterrence with a hostile state.
  • Third and finally, the United States needs a national campaign to ensure the American people treat the internet as a risky environment that demands common-sense precautions. We the people, the users and consumers of social and mainstream media, need to get educated to defend ourselves against hackers and know when we might be getting played in cyberspace, whether by a foreign power, a domestic group, or a cyber criminal. Russian propaganda reached scores of Americans unaware they could be targeted by information warfare.

Oligarchs and others working with the Kremlin to advance aggressive foreign actions, such as organizing mercenary forces in Ukraine and Syria or advancing cyberwarfare or disinformation, should be included in lists of Kremlin officials and members of Russia’s ruling elite eligible for personal sanctions, say analysts Anders Åslund, Daniel Fried, Andrei Illarionov and Andrei Piontkovsky.

With the U.S. midterm elections fast approaching, the threat is as urgent as ever, according to Laura Rosenberger and Jamie Fly of the Alliance for Securing Democracy. The U.S. government, the private sector, and civil society need to begin immediately developing and implementing strategies to defend against, deter, and raise the costs of any attempts to undermine American democracy. And that means asking some difficult questions, they write for War on the Rocks:

  • First, how did the government and social media companies miss the exploitation of these platforms by a foreign government intent on undermining American democracy, and what are these companies doing to make sure it does not happen again? Technology has empowered individual citizens, fueled revolutions against authoritarian regimes, and provided access to information and tools that have increased the quality of life for Americans. Yet Americans’ increasing reliance on technology to conduct political discourse has a dark side that this increasingly social media-addled society has failed to grasp. The unilateral steps announced in recent weeks by social media companies are a good first step, but do not go far enough to address the fundamental vulnerabilities of many of these platforms.
  • Tech companies need to work with national security experts to better understand the potential vulnerabilities of their platforms, especially as technology evolves through the use of artificial intelligence, in which countries like Russia and China are investing heavily. Social media companies must understand that having an honest public conversation and taking smart steps now to regulate these technologies will help head off government overreach in response as the problem gets worse. This should include ensuring that online political advertising is subject to the same rules as advertising in other media. In addition, significant effort needs to be put into teaching media literacy to children and young people who have never learned to distinguish trusted news from information with an agenda.
  • And while the focus is now on social media, journalists and reporters for more traditional media outlets need to examine their role in a disinformation environment. Traditional media reported eagerly on weaponized information stolen from the Democratic National Committee and John Podesta by Russia and released for the purpose of interfering in the presidential election, and often did so without providing readers context on the agenda behind the leaks or stopping to verify the information.

U.S. officials must implement a public awareness campaign that can transform the American public, currently vulnerable to fake news and Russian influence, into a resilient populace aware of the growing disinformation threat, say FPRI analysts Eriks K. Selga and Benjamin Rasmussen. The positive effects of implementing these tactics are two-fold:

  • First, they would increase widespread public recognition of the problem and organically increase the filtering capabilities of Americans.
  • Second, they would have positive spillover effects on the West’s overall ability to defend against disinformation assaults—steering allied states towards a stronger, more cohesive partnership against a hybrid threat without precedent. High-level recognition alone may not fully solve the disinformation problem, but it would be a powerful step for the United States and its Western partners.

The European Commission today launched a public consultation on fake news and online disinformation and set up a High-Level Expert Group representing academics, online platforms, news media and civil society groups. The work of the High-Level Expert Group as well as the results of the public consultation will contribute to the development of an EU-level strategy on how to tackle the spreading of fake news, to be presented in spring 2018.

The EU’s Cyber Security Strategy – updated in September 2017 – is intended to improve the protection of Europe’s critical infrastructure and boost the EU’s digital self-assertiveness towards other regions of the world, says a new analysis from the German SWP foreign policy think-tank:

But the reformed strategy leaves open a number of questions as to how its objective of an “open, safe and secure cyberspace” will be credibly defended, both internally and externally. The EU has not properly defined resilience or deterrence, nor has it made sufficiently clear how it intends to overcome institutional fragmentation and lack of legal authority in cybersecurity issues. Moreover, controversial topics – such as the harmonisation of criminal law or the use of encryption – have been entirely omitted. Member states should abandon their standalone efforts and speed up the legal regulation of cybersecurity at the EU level.

A recent brief from the National Endowment for Democracy’s International Forum for Democratic Studies outlines which factors set disinformation apart from other forms of manipulative or persuasive content and why today’s information environment amplifies disinformation.
