Russia, China, Iran and other countries remain interested in influencing U.S. policy, and elections are a top target, The Washington Post reports.
“We’re much better prepared in that we’re aware that there is a threat,” said Lawrence Norden, an expert on voting machines at New York University’s Brennan Center for Justice. “But we haven’t done some basic things.”
We need to define nonphysical battlefields (e.g., disinformation, cyber-terrorism) and align our interests and perceived threats with the resources and policies necessary to defend ourselves, The Washington Post’s Jennifer Rubin writes.
“Politicians need to describe, in concrete terms, how these threats implicate our safety, financial security and democracy,” she adds. “If ever we had an opportunity to explain the nexus between foreign policy and ordinary Americans’ lives, as well as the necessity of cooperation with allies in fighting threats that do not respect borders, this would be it.”
The best way to detect the next misinformation campaign is to require technology companies to put all of their ads in a publicly accessible archive, argues Professor Philip N. Howard, the director of the Oxford Internet Institute and the author of “Pax Technica: How the Internet of Things May Set Us Free or Lock Us Up.”
Laudable policy initiatives like the bipartisan Honest Ads Act and the Bot Disclosure and Accountability Act don’t go far enough because not all foreign governments and shady lobbyists will disclose their political ads, and the public has no way of verifying that social media companies are doing their own due diligence, he writes for the New York Times:
Social media platforms aren’t being used only at election time to manipulate voters. We recently built a limited junk news aggregator and found misinformation about vaccinations and health, scientific findings, stock prices and a range of public policy issues. Investigating the impact of social media ads can have positive outcomes, for example by helping medical researchers understand which campaigns best improve public health. If officials decided to investigate suspected misinformation or manipulation during an election, they would have the evidence they need.
Moreover, a full public archive would help “future-proof” us from new problems that might arise from social media advertising. A permanent, real-time archive would let us detect and expose misinformation campaigns as they arise. Voters and regulators must get ready for a world of “deep fake” videos, intelligent bots that interact with voters and personalized political ads with keywords, voice tone, and facial features that we individually will respond to.
The British parliament’s recent forceful report on disinformation marked a trajectory toward regulation and legislation that will constrain how social media firms operate in regard to not only disinformation but also broader social and economic concerns including transparency, privacy, and competition, notes Dipayan Ghosh, a Fellow at New America and the Harvard Kennedy School.
Rather than overbearing regulation, we need a sensible “digital social contract” to rebalance the implicit power held by the triumvirate of the internet industry, the government, and the individual consumer. Such a framework should address the following three principal elements, he writes for the Harvard Business Review:
- Transparency: consumers of online media often are unaware of the provenance of the online content they see, and how they were chosen to see it. This is a root cause of the disinformation problem, and is viewed by many as the simplest of the digital giants’ offenses to regulate and a good starting place for legislators…. A bill introduced in the Senate known as the Honest Ads Act would begin to address this critical gap in the United States…. But such legislation must be balanced with the need to protect intellectual property rights….
- Privacy: the protection of individual privacy has long been considered a fundamental human right by jurisdictions such as the European Union, but the United States offers no such protection…. Advocates for federal privacy legislation are arguing for increased options for consumers to control the collection and use of their personal data and robust security standards for the companies that collect this data…. But these restrictions challenge firms’ continued commercial success in the face of competition originating from China and beyond…. Finally, while industry rhetoric such as Mark Zuckerberg’s recently stated “pivot” to privacy might suggest that the “digital gangsters” will voluntarily make key changes to their platforms that favor privacy, these promises are likely superficial at best….
- Competition: In my view, the direction and nature of all of this examination of Silicon Valley is entirely fair; many have argued that the aforementioned three firms have tended toward natural monopoly, at times unfairly so – all of which appears to necessitate a resuscitation of the nation’s antitrust and competition policy frameworks so that their full force can be brought to bear against the industry. Pro-competition regulations for this sector would in fact encourage a more diverse and dynamic internet ecosystem – which would open up new business opportunities for younger and more innovative commercial actors to participate in our information and media ecosystem.
Any transparency imposed on the firms’ algorithmic ranking regimes would go some way in exposing the ways that their algorithms work, Ghosh adds:
In the medium term, a fair compromise might institute such measures only on content that particularly affects the functioning of democracy – including, for instance, content that is political in nature, or content that is covered by civil rights protections. Facebook, for instance, has made voluntary commitments to this end earlier this month after lengthy negotiations with public advocates including the American Civil Liberties Union.
Though the U.S. has historically favored freedom of markets enabling innovation, the integrity of democracy comes first and foremost, he insists.
International initiatives to deal with the Kremlin’s propaganda have been very well summarized by the UK’s Foreign and Commonwealth Office as ‘Engage, Expose and Enhance,’ notes Soviet Subversion, Disinformation and Propaganda: How the West Fought Against it: An Analytic History, with Lessons for the Present, a report by Nicholas J. Cull, Vasily Gatov, Peter Pomerantsev, Anne Applebaum and Alistair Shawcross:
- The ‘Engage’ approach responds by seeking to develop relationships between the West and the regions most directly in the Kremlin’s sights – Ukraine, Georgia, the Baltic states – using the conventional tools of civic engagement and cultural relations to reduce some of the tensions between minority groups and majority populations of the kind that have been exploited in Ukraine. ….
- The ‘Expose’ approach is the one most obviously aimed at disinformation. It involves actively tracking this activity and publicizing it in order to make explicit the attempted manipulation of populations. Institutions which track and expose fake news and disinformation are often run by volunteers or hobbyists with backgrounds in journalism, journalism education or digital forensics. Among the best known are StopFake and Bellingcat; however, these deal mostly with activity in Europe. The US government has its own site, Polygraph, managed by Voice of America and Radio Free Europe….
- The ‘Enhance’ approach seeks to improve the quality of indigenous media in the targeted region, because populations which have trusted media within their own communities will be less likely to believe others. External attempts to enhance the media environment in target areas include the expansion of the BBC’s Russian provision through television, the upgrading of Radio Free Europe/Radio Liberty through the Current Time project and the creation of a content factory and news hub in line with proposals from the European Endowment for Democracy.
The US Department of State Bureau of Democracy, Human Rights and Labor (DRL) has published a Notice of Funding Opportunity on Combatting Elections-related Disinformation and Hate Speech in Sri Lanka.