When America’s top social-media companies — or more precisely, their lawyers — testified before Congress this week, two things became clear. First, the problem is much greater than previously admitted. And second, the companies in question have little incentive to solve it, Bloomberg reports.
Contrary to common sense, technology platform companies, including Facebook, Google, Twitter and hundreds of their peers, do not know—nor will they know in the future—all of the myriad ways their platforms will be used by millions of users, notes Timothy Carone, a professor at Notre Dame’s Mendoza College of Business. What should be expected going forward? Two things are clear, he writes for CNBC.
- First and most importantly, there will continue to be creative new uses of these platforms in ways that we cannot predict. Countries and organizations that we view as our adversaries see these platforms as our Achilles’ heel. Given their successes to date, they will continue to evolve their use of the platforms to destabilize democracies and nonprofit organizations worldwide. The old attack vectors of fake news and buying ads will be replaced by new attack vectors.
- Second, much of this new use will be created and carried out by artificial intelligence software rather than humans sitting around coming up with the ideas. AI has become a useful tool because of cheap processing power, but also because of the almost limitless supply of data that can be used to train the AI software to perform specific tasks.
Tech companies have taken a pounding in the court of public opinion in recent months. In the eyes of their critics, they have become too big, too powerful and too unmindful of their influence. And this week’s congressional hearings cast added and unflattering light on the industry’s growing embarrassment over the Russian election meddling, the New York Times adds.
“Without sufficient oversight, these companies never imagined hostile intelligence services would misuse their platforms in this way,” said Renee DiResta, an independent security researcher at Data for Democracy. “The people running it appear to not fully appreciate what they’ve designed.”
Gleb Pavlovsky, a former top adviser to Russian President Vladimir Putin, isn’t sure how much influence Putin had over the Kremlin’s disinformation campaign during the 2016 U.S. presidential election. But in an interview on PBS’s “Frontline,” Pavlovsky said Putin believes he influenced the result, The Hill reports.
Russian efforts to influence the election represented “the most recent expression of Moscow’s longstanding desire to undermine the US-led liberal democratic order” and aimed “to undermine public faith in the US democratic process,” according to a declassified version of a highly classified assessment from the U.S. intelligence community.
Authoritarian regimes like Russia and China are outspending the United States in the realm of soft power, Senator Chris Murphy (D-Conn.) told the National Democratic Institute’s annual Democracy Dinner last night at the Fairmont Hotel in Washington, D.C.
“Our budget is $650 million—a fraction of what our adversaries spend,” he said. “Today, Russia is spending over a billion dollars on covert propaganda operations,” he added. “Russian TV, radio, and internet bots continue to push misinformation with almost no pushback from the US.”
Disinformation is one aspect of a classic Russian trick: strategic deception, according to analyst John Sipher.
“Lack of public awareness about this part of the Kremlin playbook threatens to unravel whatever traction we gain in finding the truth about 2016 and in defending ourselves against current threats and ones over the horizon,” he writes for Just Security:
Strategic deception is a secret, offensive effort to create an alternative narrative that serves Moscow’s interests. Unlike Russia’s fake news and disinformation efforts designed to confuse or meet tactical ends, strategic deception is designed to build a believable and consistent narrative that pushes the recipient to take a specific action.
The aim of pro-Kremlin disinformation is to challenge democratic values, notes the Disinformation Review, which lists seven key aspects of Russian disinformation:
Sowing distrust in the independence of journalism and media as such is another key message in pro-Kremlin outlets. This message makes their own bias and disinformation look normal, and at the same time it weakens targeted societies, as it challenges the idea that media can pursue truth on behalf of all of society and make society a better place. RT (Russia Today) leads by example: Investigations into RT’s target audiences have shown that viewers at some point even begin to appreciate RT being “open about lying.”
Ukraine was the first victim of this new generation of warfare characterized by “ambiguity, proxy forces, hybrid forces, significant disinformation and psychological operations campaigns,” according to retired General Jack Keane.