Vladimir Putin’s Russia is engaged in a well-financed and determined campaign to undermine democratic political and social institutions as well as international alliances, and to remove resistance to Russia’s foreign policy objectives. Russia has the motive and the means to do so, according to a new report from the Pell Center at Salve Regina University.
We are witnessing an extension of Cold War tactics and the U.S. should treat it as a new chapter in that conflict, according to Georgetown University professor Mark Jacobson, one of the authors of “Shatter The House Of Mirrors.”
“What Russian policymakers want to do is undermine confidence in American democracy but not confidence necessarily in the sense of just elections or processes or systems,” he told NPR. “What they want to do is encourage division with the hopes of not just prolonging that division, but they hope that it turns violent. And what this does – it forces the United States to look inside and not concern itself with what’s happening across the globe.”
The Pell Center report makes several specific recommendations:
- Improve transparency and raise public awareness of the threat. Specifically, the Pell Center authors call for the appointment of an independent bipartisan commission to establish a widely-accepted understanding of Russia’s actions, means, and objectives in the 2016 U.S. election. The study also highlights the need for a public accounting of irregular social media activity in battleground states prior to the 2016 election, as well as ongoing social media efforts to sow division in the United States…
- Prepare the executive branch for a new cold war. Organizations from the White House to the intelligence community need to be reviewed for their efficacy in meeting the propaganda challenge to the West, according to Ludes and Jacobson. The White House must communicate to Congress the need for any new authorizations to meet this threat. It must also request sufficient appropriations for these activities and prosecute these programs vigorously. …
- Congress must lead. Ludes and Jacobson argue that in the absence of clear executive branch willingness or readiness to lead on this issue, the U.S. Congress must take the initiative. It can do so by eliminating “dark money” in American politics; requiring more transparency by corporations operating in the United States; embracing bipartisanship in the defense of American democracy; and reforming the laws governing the activities of foreign agents operating in the United States—beginning with legislative changes that would require state-sponsored media outlets, such as RT and Sputnik, to publicly reveal their sources of funding.
- Invest in the American people. Finally, the authors of the Pell Center conference report urged the public to once again consider education a national priority and the cornerstone for an effective defense of democracy. Russia, they argue, “exploited America’s media illiteracy, our civic illiteracy, and our historical illiteracy.” The Pell Center study calls for expanded funding for programs that strengthen the public’s resistance to influence by foreign powers. RTWT
Kremlin-backed disinformation and influence operations go far beyond any one election, targeting vulnerabilities and taking advantage of the U.S.’s free and open cyberspace to apply influence and play on weaknesses in the information sphere, experts told The Cipher Brief:
“President Putin holds a black belt in judo, a key principle of which is to use an opponent’s strength against them. Our core strength as a country derives from the First Amendment, freedom of the press, liberty, and our democratic institutions. We are inherently vulnerable to cyber influence operations including disinformation which can gain traction in our free and open cyberspace,” Hoffman, a Cipher Brief expert, said.
“The Soviets used to direct their sources to write covert influence newspaper articles, but in today’s interconnected cyberspace, the Kremlin can exploit the internet’s instantaneous, asymmetric force multiplier to its benefit,” he said.
If the diagnosis of Russian interference is growing in scope and precision, the debate about policy prescriptions for protecting our sovereignty in future elections and everyday political life has barely begun, notes Michael McFaul, director of the Freeman Spogli Institute for International Studies and a Hoover fellow at Stanford University. Actual policy actions to protect our vote from outside interference have been next to nil. That needs to change now, he writes for The Washington Post:
- First, and most obviously, our cybersecurity must be strengthened. We need greater education on how to prevent cyberattacks; more coordination across the layers of cybersecurity at the individual, group and government levels; and new government regulation mandating upgrades in cybersecurity for everyone and everything involved in the electoral process. Deterrence also must be a component of our response: direct, private communications to the Kremlin and other foreign governments warning of our intended responses — in both the cyber and real worlds — to future attacks. Until security and confidence are enhanced, every state also must collect paper ballots to back up electronic vote counts.
- Second, information about Russian state propaganda — not censorship of these content providers — must be provided to the American people. Viewers of RT, formerly called Russia Today, on YouTube or readers of Sputnik on Twitter need to know that the Russian government is providing this content to advance the Kremlin’s political objectives. This task could be achieved in two ways. Private actors — cable companies and social-media platforms — could do the identification, as some already have started to do regarding disinformation. Or the U.S. government could require these foreign agents of influence to register under the Foreign Agents Registration Act (FARA). …
- Third, foreign purchase of advertisements aimed at influencing elections must be prohibited. Just as foreigners cannot contribute to American candidates, they should not be able to purchase or provide in-kind support for candidates or parties. Existing laws and regulations must be enhanced to compel American companies to stop this activity, even if the use of VPNs and third-party cutouts makes the task challenging. Regulation of this market for Americans, however, must be avoided.
We at First Draft have been asking ourselves this question since the French election, when we had to make difficult decisions about what information to publicly debunk for CrossCheck. …As Alice Marwick and Rebecca Lewis noted in their 2017 report, Media Manipulation and Disinformation Online, “[F]or manipulators, it doesn’t matter if the media is reporting on a story in order to debunk or dismiss it; the important thing is getting it covered in the first place.” BuzzFeed’s Ryan Broderick seemed to confirm our concerns when, on the weekend of the #MacronLeaks trend, he tweeted that 4channers were celebrating news stories about the leaks as a “form of engagement.”
We have since faced the same challenges in the UK and German elections. Our work convinced us that journalists, fact-checkers and civil society urgently need to discuss when, how and why we report on examples of mis- and dis-information and the automated campaigns often used to promote them. Of particular importance is defining a “tipping point” at which mis- and dis-information becomes beneficial to address. We offer 10 questions below to spark such a discussion.
DISINFORMATION VS. DEMOCRACY: FIGHTING FOR FACTS
NDI’s annual Democracy Dinner will be held November 2nd at the Fairmont Hotel in Washington, DC. This year, NDI [a core institute of the National Endowment for Democracy] will honor three organizations on the front lines of fighting the global challenge of disinformation and false news, the National Democratic Institute adds:
Disinformation in politics – particularly elections – represents a critical threat to the US, to our allies and to democracy itself. The global reach of social media platforms, coupled with the rise of artificial intelligence and machine learning, has provided a powerful suite of new tools that are increasingly used by autocratic regimes seeking to control the information space. While control of information has long been a key feature of autocracies at home, the rise of social media platforms and online political discourse now provide new opportunities for autocracies to manipulate public opinion abroad and disrupt domestic politics in geopolitical adversaries. The weaponization of social media is a global challenge, both during and between elections. Emerging democracies have often been used to “weapons test” new approaches to computational propaganda and disinformation, and the work being done to counter it is critical to the future of democracy.
At the dinner, NDI will recognize three organizations that have demonstrated a deep and abiding commitment to democracy and human rights:
StopFake.org – StopFake.org in Ukraine works with journalists and citizen groups to monitor and uncover false news sources, and has created tools on “how to identify a fake” on its website. It checks facts and verifies information in the media to help consumers obtain objective news that is free from distorted information, specifically on events in Ukraine. Long before most were aware of the use of false media to manipulate public opinion, StopFake was on the front lines exposing these tactics in a very tough neighborhood in which Ukraine faces significant outside pressure… Margo Gontar, who is accepting the award on behalf of the organization, is a co-founder and editor of StopFake and a TV host of the weekly news digest “StopFake News”.
Rappler – Rappler is an online social news network based in the Philippines. It holds the public and private sectors accountable, pursuing truth and transparency on behalf of the people it serves. It encourages its readership to be aware of the spread of disinformation and propaganda, and exposes the hidden social media “machines” or bots that distort the truth. Rappler has suffered threats and severe pressure for its pioneering work in exposing disinformation and propaganda used to manipulate public opinion in the Philippines. The story of Rappler shows how the use of disinformation and computational propaganda is bleeding over to domestic actors in new and consolidated democracies, resulting in democratic backsliding. Attacks on Rappler’s founder also demonstrate the particularly vicious ways in which disinformation has been used to attack women who are active in political life. Maria A. Ressa, who is accepting the award on behalf of the outlet, is the CEO and Executive Editor of Rappler, and is a former CNN bureau chief and investigative reporter.
The Oxford Internet Institute’s Project on Computational Propaganda – The Oxford Internet Institute (OII), a multi-disciplinary research and teaching department of the University of Oxford, has been at the forefront of research in the field of disinformation. In early 2017, OII’s Project on Computational Propaganda issued a groundbreaking study on the use of social media and computational propaganda to manipulate public opinion in nine countries… The Washington, DC launch of these case studies was held at NDI’s offices in June as part of #DisinfoWeek, a series of events to share information on this issue. The #DisinfoWeek hashtag reached over 7 million unique users. NDI also filmed 7 Facebook Live interviews on the sidelines of the Digital Disinformation Forum, held in partnership with Stanford University’s Center for Democracy, Development and the Rule of Law. The videos were viewed over 7,000 times that week, for a total of 2,800 minutes viewed.
The dinner will include remarks from Senator Chris Murphy, a leader in the U.S. Congress on efforts to counter global disinformation and propaganda, especially efforts by Russia. Please join us November 2nd to learn more and to recognize global leaders in the fight against disinformation. For more information, please contact Kirsten Tallon (email@example.com or 202.728.5483). To purchase tickets and sponsor the event, please use this link.