There is nothing new about either fake news or Russian disinformation campaigns. Back in 1983, at the height of the Cold War, an extraordinary story appeared in a little-known pro-Soviet newspaper called the Patriot. It claimed to have evidence that the Pentagon had deliberately created AIDS as a biological weapon and was ready to export the virus to other countries, mainly in the developing world, as a way of gaining control over them. Within a few years the story had reappeared in mainstream publications in more than 50 countries, The Economist explains in a must-read Special Report:
It is not just Russia that conducts IO [information operations] against other countries. Jihadist extremists and hacker groups employed by rogue states or criminal networks pose similar if lesser threats. And although the big social-media companies now claim to be working on solutions, including better and quicker attribution of messages, Russian IO techniques are bound to adapt accordingly.
“[W]hen target forces start to counter these [Russian] efforts and/or expose them on a large scale, the Russians are likely to accelerate the improvement of their techniques…in other words, an information-warfare arms race is likely to ensue,” said Rand Waltzman, a former program manager at America’s Defense Advanced Research Projects Agency (DARPA) and now at the RAND Corporation.
Kremlin disinformation has marred the Czech presidential election campaign, which pits the pro-Putin illiberal incumbent Milos Zeman against challenger Jiri Drahos, The Washington Post adds:
Yet in an indication of just how difficult it will be to defeat the incumbent, Drahos has been subjected in recent days to “a flood of false stories” online, said Veronika Vichova, an analyst with the European Values think tank. The unsubstantiated allegations include that Drahos is planning to open the country to a wave of Muslim refugees and that he is a pedophile. Vichova said the origin of the disinformation campaign is unclear. But the president’s close ties to Moscow have focused suspicion on Russia.
One remedy for dealing with Russian disinformatsia tradecraft is fairly obvious, according to Aviezer Tucker, the author of The Legacies of Totalitarianism, and Adam Garfinkle, editor of The American Interest: What they do to us (and others), we can do to them (and others) in spades, the idea being to exact a price and then cash it in as a form of deterrence, they write for The American Interest:
Some in our government have proposed such ideas, and in readiness against extreme circumstances having such capabilities at the ready may be prudent. We wouldn’t even have to make stuff up; there are enough real facts about Putin’s kleptocratic ways and his easy associations with a who’s who of Chechen thugs that inventing stories about his running a pedophile ring manned by ex-KGB operatives from a pirogi shop in Moscow simply isn’t necessary. But short of extremities, it’s a bad idea. One does not clean a house by adding filth hither and yon, after all; defense without the threat of offense remains possible.
The apparent lack of US preparation and defense nearly eighteen months after Russia’s interference in the presidential elections, especially given numerous media reports that Russia aims to interfere in the 2018 US midterm elections, is deeply troubling, according to the Atlantic Council’s Daniel Fried [a board member of the National Endowment for Democracy] and Brian O’Toole, a nonresident senior fellow with the Atlantic Council’s Global Business and Economics Program.
“We are heartened that Congress has taken up leadership to defend the US electoral process. But notwithstanding its good intent and timeliness, the Defending Elections from Threats by Establishing Redlines (DETER) Act of 2018, recently introduced by Sens. Marco Rubio (R-FL) and Chris Van Hollen (D-MD), is pursuing the right thing in the wrong way,” they contend. RTWT
A new report from the New America Foundation examines “precision propaganda,” demonstrating how a toolbox fashioned for the advertising industry is easily repurposed by even modestly competent actors to spread disinformation.
In Digital Deceit: The Technologies Behind Precision Propaganda on the Internet, analysts Dipayan Ghosh and Ben Scott analyze digital advertising and marketing technologies in order to deepen our understanding of precision propaganda, including:
* Behavioral data tracking. The lifeblood of digital advertising and marketing is data. An array of technologies operates unseen by the user, capturing every click, purchase, post, and geolocation. This data is aggregated, connected with personal identifiers, and built into consumer profiles, which help disinformation operators derive the precisely targeted audiences that respond to particular messages.
* Online ad buying. The market for targeted online advertising drives sentiment change and persuasion. The most sophisticated systems enable automated experimentation with thousands of message variations paired with profiled audience segments. This precision advertising helps disinformation to reach and grow responsive audiences and to drive popular messages into viral phenomena.
* Search engine optimization (SEO). There is a multi-billion dollar industry dedicated to optimizing search engine results by reverse engineering the Google search page rank algorithm. Disinformation operators use techniques known as “black hat SEO” to trick the algorithm and dominate search results for a few hours of the news cycle before Google corrects the distortion.
* Social media management services (SMMS). A new kind of digital marketing company sits at the intersection of machine learning algorithms and advertising technology. The SMMS offers advertisers a fully-integrated solution that pre-configures messages for different target audiences across multiple media channels simultaneously and automatically. It is a finely tuned disinformation machine for the precision propagandist.
* Artificial intelligence in marketing. Machine learning algorithms are already integrated into targeted advertising platforms and complex data analytics. Stronger forms of AI will be available in the near term. These advances will greatly increase the potency of disinformation operations by enhancing the effectiveness of behavioral data tracking, audience segmentation, message targeting/testing, and systemic campaign management.
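The segmentation-and-experimentation loop the report describes can be illustrated with a deliberately simplified sketch. All profile fields, user IDs, and message names below are invented for illustration; real ad-tech pipelines are far larger and operate on live behavioral data:

```python
# Toy behavioral profiles: each user is a small bag of tracked signals
# (interest topic, region, click count). Purely illustrative data.
profiles = {
    "u1": {"topic": "immigration", "region": "CZ", "clicks": 42},
    "u2": {"topic": "economy",     "region": "CZ", "clicks": 7},
    "u3": {"topic": "immigration", "region": "SK", "clicks": 19},
    "u4": {"topic": "immigration", "region": "CZ", "clicks": 55},
}

def segment(profiles, key):
    """Group user IDs by one profile attribute -- a crude audience segment."""
    segments = {}
    for uid, attrs in profiles.items():
        segments.setdefault(attrs[key], []).append(uid)
    return segments

def pair_variants(segments, variants):
    """Pair every segment with every message variant, mimicking automated
    message/audience experimentation at (toy) scale."""
    return [
        (seg_value, variant, len(users))
        for seg_value, users in segments.items()
        for variant in variants
    ]

segments = segment(profiles, "topic")
pairings = pair_variants(segments, ["headline_a", "headline_b"])
# segments groups users by interest; pairings lists each
# (segment, message variant, audience size) combination to test.
```

The point of the sketch is how little machinery is needed: once behavioral data is profiled, matching message variants to receptive audiences is a few lines of grouping logic, which is why the report argues these tools are "easily repurposed by even modestly competent actors."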
When we set out to cover disinformation, we gave ourselves the daunting goal of moving away from traditional “he said/she said” journalism and instead focusing on real people and how they are affected by falsehoods and information wars, notes Natalia Antelava, Coda Story’s CEO and editor-in-chief:
The story of Anastasia is a perfect example: the 12-year-old from St. Petersburg was invited to take part in a popular match-making program along with her father on Russia’s biggest TV network, Channel One. But the hosts broke with the script and started to bully Anastasia, first in the studio, then online. When she fought back with a social media campaign against hate speech, she was forced to take on not only Russian state television, but also the US internet giant YouTube.
Writing for the Carnegie Endowment, Tim Maurer finds that as cyberspace emerges as the new frontier for geopolitics, states have become entrepreneurial in their sponsorship, deployment, and exploitation of hackers as proxies to project power.