Russia has re-ignited its ideological battle with the West by waging a more insidious offensive: intruding on Western democratic processes and meddling in nations’ politics, according to Digital Warfare: Russia’s Attacks on Democracy, a new analysis from the Center for Strategic and International Studies.
“We’re at war,” says Heather Conley, author of “The Kremlin Playbook” and director of the CSIS Europe Program. “We just don’t know it.”
Digital disinformation poses a grave threat to democracy and requires a new social contract between consumers and internet firms that is based on transparency, privacy and competition, according to a new report co-published by the Harvard Kennedy School’s Shorenstein Center on Media, Politics and Public Policy and New America, the Washington, D.C.-based think tank. The codification of digital rights into public law would encompass regulations designed to advance democratic values and protect citizens from disinformation while fostering open digital markets, says the report, “Digital Deceit II: A Policy Agenda to Fight Disinformation on the Internet.”
“The digital economy too often serves to enhance social division by feeding pre-existing biases, affirming false beliefs, and fragmenting media audiences,” said Dr. Dipayan Ghosh, co-author of the report and Pozen Fellow at the Shorenstein Center.
“The enormous power of digital technologies to reshape modern social, economic and political life has delivered unquestionable benefits to the public,” noted co-author Dr. Ben Scott. “But the absence of clear laws to steer this transformation towards the common good leaves us vulnerable to exploitation and threatens to undermine the integrity of our democracy.”
Russia does have the capacity to change electoral outcomes, according to “Cyberwar: How Russian Hackers and Trolls Helped Elect a President—What We Don’t, Can’t, and Do Know,” by Kathleen Hall Jamieson, a “scrupulously nonpartisan” professor of communications at the University of Pennsylvania, who directs the Annenberg Public Policy Center and co-founded FactCheck.org, a nonpartisan watchdog group. She is widely respected by political experts in both parties.
Her case is based on a growing body of knowledge about the electronic warfare waged by Russian trolls and hackers—whom she terms “discourse saboteurs”—and on five decades’ worth of academic studies about what kinds of persuasion can influence voters, and under what circumstances, Jane Mayer writes for The New Yorker:
Democracies around the world, she told me, have begun to realize that subverting an election doesn’t require tampering with voting machines. Extensive studies of past campaigns, Jamieson said, have demonstrated that “you can affect people, who then change their decision, and that alters the outcome.” She continued, “I’m not arguing that Russians pulled the voting levers. I’m arguing that they persuaded enough people to either vote a certain way or not vote at all.”
Through strategically timed and targeted interventions, the Kremlin has “reweighted the news environment” to favor some candidates over others, Jamieson contends.
Philip Howard, the director of the Oxford Internet Institute, in England, agrees that the Russian interference could have been decisive, but he is less convinced that stolen analytics were key… [and] other academics may also be skeptical of “Cyberwar,” Mayer adds:
A forthcoming book on the 2016 campaign, “Identity Crisis,” by the political scientists John Sides, Michael Tesler, and Lynn Vavreck [who analyzed the election for the National Endowment for Democracy’s Journal of Democracy], argues that Russian interference was not a major factor in the Presidential election, and that the hacked e-mails “did not clearly affect” perceptions of Clinton. … Recently, Brendan Nyhan, a professor of public policy at the University of Michigan, suggested, in the Times, that most fears about the impact of Russian information warfare in the 2016 campaign are exaggerated. He wrote that “a growing number of studies conclude” that “most forms of political persuasion seem to have little effect at all.”
Dystopian digital future of fake media
The democratization of access to sophisticated digital-imaging technologies, powerful machine-learning algorithms, and unprecedented computing power has made it easier to create compelling fakes, notes Hany Farid, a computer science professor at Dartmouth College. Advances in digital imaging are allowing digital photographs, videos, and audio recordings to be altered in ways that would have been unimaginable 10 years ago, he writes for Quartz:
Imagine a world in which we can no longer trust or believe news reports of global conflict, social uprisings, police misconduct, or natural disasters. Imagine a world in which we can no longer believe what our world leaders say in public—or private. Imagine a world in which we simply cannot separate fact from fiction. In this world, how will we function as a democracy, economy, or society?
A new MPR-Marist poll revealed that 63 percent of Americans believe keeping U.S. elections safe and secure is a “top priority,” and 53 percent believe the country is either “prepared” or “very prepared” to keep the midterms secure, the Alliance for Securing Democracy reports. However, 67 percent of respondents said it was “very likely/likely” that Russia would use social media to spread false information about candidates running for office, and that Facebook and Twitter had done “not very much/nothing at all” to prevent election interference this year.
What are the main lessons learnt from countering Russia’s disinformation activities in Georgia? Kremlin Watch asks. This question is the focus of a new publication authored by Kremlin Watch Special Fellow Tornike Zurabashvili:
Mere political acknowledgment of the threat is necessary but insufficient for successfully countering pro-Kremlin disinformation and hostile influence efforts. Including civil society organizations in counter-efforts is crucial, but it would be a mistake to rely on them alone. Religion also plays an important role – the clergy should be involved in countering disinformation efforts both as a target group and as a medium for delivering fact-based messages. Adopting a broad, whole-of-society strategy, complementing existing government and civil society countermeasures, is recommended.
The Shorenstein-New America report outlines a sweeping policy framework that would address the digital threat to democracy, focused on three key principles:
- Ad Transparency – As citizens, we have a right to know who is trying to influence our political views and how they are doing it. There must be explicit disclosure about the operation of such advertising and the content curation processes on dominant digital media platforms. We must have disclosure in the form of real-time and archived information about targeted political advertising, clear accountability for the social impact of automated decision-making, and explicit indicators for the presence of non-human accounts in digital media.
- Privacy – As individuals with the right to personal autonomy, we must be given more control over how our data is collected, used and monetized, particularly when it comes to sensitive information that shapes political decision-making. A baseline data privacy law must include consumer control over data through stronger rights to access and removal, transparency for the user of the full extent of data usage and meaningful consent, and stronger enforcement with resources and authority for agency rule-making.
- Competition – As consumers, we must have meaningful options to find, send and receive information over digital media. The rise of dominant digital platforms demonstrates how market structure influences social and political outcomes. A new competition policy agenda should include stronger oversight of mergers and acquisitions, antitrust reform and robust data portability and interoperability between services.