Social media platforms are failing to make the changes that would help curb online disinformation and fake news despite the efforts made since the 2016 US presidential election exposed the extent of manipulation, researchers say. False claims and automated bot activity on the internet in the wake of the recent Florida school shooting have highlighted persistent online attempts to amplify divisions in US society over issues ranging from mass shootings to social justice campaigns and elections, The Financial Times reports:
Facebook has made newsfeed posts more personal in an effort to create “meaningful interactions” and move away from clickbait, including fake news. But disinformation experts are urging companies to do more, outlining potential measures ranging from removing more bots to slowing down product development. Facebook’s mechanisms have been consistently gamed or manipulated by bad actors.
The Alliance for Securing Democracy, a think-tank, said there was still “low-hanging fruit” for tech companies. The alliance runs Hamilton68, an online dashboard whose stated aim is to track “Putin’s propaganda push” on Twitter by cataloguing messages broadcast by the Russian disinformation campaign.
While it is hard to distinguish a human troll from a “person in their basement who wants to tweet 400 times a day”, the platforms should get better at using automation to detect bots, the Alliance’s Brett Schafer said.
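Schafer’s distinction between automated accounts and merely prolific humans can be sketched as a toy heuristic: bots tend to combine very high posting volume with unnaturally regular spacing between posts. The thresholds and logic below are purely illustrative and not any platform’s actual detection method:

```python
from statistics import pstdev

def looks_automated(post_timestamps, daily_threshold=144, jitter_floor=5.0):
    """Crude illustrative heuristic: flag an account whose posting
    volume is very high AND whose inter-post gaps are near-uniform.
    Timestamps are seconds; thresholds are invented for illustration."""
    if len(post_timestamps) < 3:
        return False
    ts = sorted(post_timestamps)
    # Observed span in days (floor of one hour to avoid division blowups).
    span_days = max((ts[-1] - ts[0]) / 86400.0, 1 / 24)
    rate = len(ts) / span_days          # posts per day
    gaps = [b - a for a, b in zip(ts, ts[1:])]
    jitter = pstdev(gaps)               # machines post on rigid schedules
    return rate > daily_threshold and jitter < jitter_floor
```

A real system would weigh many more signals (client metadata, content similarity, network structure); note that volume alone would misclassify the hypothetical human tweeting 400 times a day, which is why the regularity term matters.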
It’s time to start shredding the Putin Playbook, Laura Rosenberger and Jamie Fly write for Democracy.
What is Jigsaw doing to frustrate online trolls? MIT Technology Review asks:
We have a team dedicated to seeing how natural-language processing and machine learning can be used to identify online toxicity and help moderators and communities in tackling it, says Yasmin Green, director of research and development at Jigsaw, an arm of Google’s parent company, Alphabet. There’s a model we’ve developed that’s publicly available, called Perspective, and you can find it at www.perspectiveapi.com. This helps you score comments for their level of toxicity. The research team is looking at ways to get to another level of granularity to help us better identify what’s happening and how moderators can control it.
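As a rough illustration of how Perspective scores a comment, a client might call it along the following lines. The endpoint, request shape, and response fields here follow Perspective’s public documentation, but versions and quotas change, so treat this as a sketch (the API key is of course a placeholder):

```python
import json
import urllib.request

# Endpoint per the public Perspective API docs (assumed current;
# check www.perspectiveapi.com for the live version and quota rules).
API_URL = ("https://commentanalyzer.googleapis.com/v1alpha1/"
           "comments:analyze?key={key}")

def build_request(text: str) -> dict:
    """Build the JSON body asking Perspective to score TOXICITY."""
    return {
        "comment": {"text": text},
        "requestedAttributes": {"TOXICITY": {}},
    }

def score_toxicity(text: str, api_key: str) -> float:
    """POST a comment and return its summary toxicity score (0.0 to 1.0)."""
    req = urllib.request.Request(
        API_URL.format(key=api_key),
        data=json.dumps(build_request(text)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["attributeScores"]["TOXICITY"]["summaryScore"]["value"]
```

The returned score is a probability-like value that a reader would find the comment toxic, which is what lets moderators sort or filter at whatever threshold suits their community.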
“We’ve been looking at the exercises the Russians were doing in terms of disinformation and misinformation to shape that environment for years,” says Scott Carpenter, a former State Department official who now serves as Jigsaw’s managing director and also acts as a liaison with teams at Google and other Alphabet subsidiaries, FastCompany reports:
Still, it wasn’t until October 2016 that the group assisted in launching a tool specifically to address fake news in the United States. Called Fact Check, the service is embedded in Google News and labels articles using criteria such as whether they include opinion or highly cited reporting. “Before the election, people were like, ‘What the fuck do you need a fact checker for?’” Carpenter recalls. “And then they were like, ‘Oh my god, we have Fact Check! Look! We did it! Google! In Search! Before the election!’”
The website they set up to track the Russian campaign, called Hamilton 68 after the eloquent warning in the Federalist Papers from Alexander Hamilton to beware of foreign efforts to hijack U.S. elections, has documented in recent days how even an American tragedy like the Florida school shooting has become fodder for the network of 600 Russia-linked social media accounts the group’s software is following. The trolls, Fly said, were pushing inflammatory pro- and anti-gun control messages within hours of the killings, with “Russian accounts that were jumping on both sides, basically egging Americans on, making everyone angrier, trying to divide us rather than bring us together in a moment of crisis.”
“But now the group, whose formal title is the grand but vague Alliance for Securing Democracy, is itself part of the backlash, reflecting the weird new alliances and strange political partnerships the Kremlin has managed to generate in American politics,” Glasser adds. RTWT
So far, Western statesmen, editors, and journalists have responded to Russian propaganda defensively: pointing out lies, rebuffing accusations, disclosing hidden motives, and demonstrating the ugliness of the Russian regime. But while such responses are natural, they are also by nature reactive, and risk helping the Kremlin by reinforcing its messaging, notes Vasily Gatov, a Visiting Fellow at the USC Annenberg Center on Communication Leadership and Policy.
We need to move from reaction to a positive approach, which means rethinking the old freedom brand—and deliberately choosing the new personality, communicators, and content to fit our present moment, he writes for The American Interest:
The original “freedom brand” the United States built up in the Cold War long ago lost its coherence. The challenge for today’s public diplomats and broadcasters is to find aspects of the American idea that are still powerful and resonate with Russian audiences. To understand this will require consistent and in-depth social media sentiment analysis and target audience analysis. However, the over-arching idea should be the Pursuit of Happiness, with a sequence of supporting themes.
Open societies’ robust institutions are vital in the fight against disinformation, according to analyst Rasmus Kleis Nielsen, who addressed a hearing on “preserving democracy in the digital age” organized by the European Political Strategy Centre, together with Anne Applebaum from the Washington Post/LSE, Philip Howard from the Oxford Internet Institute, Philip Lelyveld from the Entertainment Technology Center at the University of Southern California and Keir Giles from Chatham House.
- Protecting news and media against governments using political/economic pressures to control them, against organized crime and extremist groups, and against politically-mandated privatization of the policing of free speech. All European Union member states have signed the Council of Europe recommendation on the protection of journalism and safety of journalists and other media actors, but so far only Malta has begun to implement the recommendation.
- Creating an enabling environment for news media by reforming existing forms of indirect and in some cases direct support for private sector media (VAT exemptions, state aid/subsidies) so they reward the future, not the past; supporting genuinely independent public service media and ensuring they have the autonomy and funding to deliver on their remit using all appropriate tools; enabling non-profit journalism by streamlining regulation to ease the creation of non-profit news organizations and incentivizing support for them; making support available for R&D and innovation; and ensuring transparency around media ownership and funding. Even in countries with strong, independent public service media, the vast majority of investment in professional journalism comes from private sector news media, and it is critically important that policymakers support the industry as it reinvents its business for a digital age.[xx]
- Creating an enabling environment for journalism by investing in training, life-long learning, up-skilling and by protecting journalists against defamation/libel suits aiming to silence them, as well as by enabling journalists and other third parties through “freedom of information” legislation and open data initiatives, plus support for individual innovation and entrepreneurship.
- Investing in media and information literacy efforts for citizens at all stages of life.
To counter Special Counsel Robert Mueller’s indictment of thirteen Russian nationals and three organizations, RT, Sputnik, and the Kremlin used the 4Ds of disinformation as defined by @DFRLab’s Ben Nimmo: dismiss, deny, distort, and distract. Between February 16 and February 21, RT published seven articles that dismissed allegations in Mueller’s indictment. Sputnik News published another three. The dismissals centered around three themes, the group adds:
- First, it argued that Russia’s alleged meddling in the U.S. presidential election did not have any effect on the outcome.
- Second, it claimed that the scope of Russian communication campaigns has been blown out of proportion by the media and the Democrats.
- Third, it insisted that Russian meddling in the U.S. presidential election was not pro-Trump. None of these themes denied that the meddling took place.
National security demands that we respond to the Russian intervention in our democracy and others, notes Applebaum, a National Endowment for Democracy board member. Russian support for extremist and anti-democratic political parties all across the West has been growing over the past decade, including funding and other support as well as propaganda. This was considered a fairly niche concern when I first started writing about it a few years ago, and I’m glad it’s now getting the attention it deserves, she writes for The Washington Post:
Still, let’s be honest: The elimination of Russian influence from U.S. cyberspace would not prevent another Pizzagate. A shutdown of Russian bots will still leave swarms of American bots free to deceive American voters. By its very nature, social media makes disinformation campaigns possible on a larger scale than ever before: Its algorithms encourage deep polarization, and its promise of anonymity opens the door to fraud.