Generative AI holds promise, peril for democracies

Generative artificial intelligence – popularized in 2022 by OpenAI’s ChatGPT application – threatens to undermine trust in democracies when misused, but may also be harnessed for public good, Sarah Kreps told the President’s Council of Advisors on Science and Technology (PCAST).

“The threat might not be that people can’t tell the difference – we know that – but that as this content proliferates, they might just not believe anything,” said Kreps, director of the Cornell Tech Policy Institute. “If people stop believing anything, then it’s eroding a core tenet of a democratic system, which is trust.”

“Cacophonous voices may be the greatest strength of democracy,” says Daron Acemoglu, co-author (with Simon Johnson) of Power and Progress: Our Thousand-Year Struggle over Technology and Prosperity. 

We can put diversity into the funnel of artificial intelligence (AI), but what is going to come out? he asks Democracy Paradox’s Justin Kempf, stressing “the way that we rely on human social intelligence, cognition, democratic processes, civil society” to avoid technological determinism:

We have to build institutions. We have to build ways of redirecting technological change and then we have to actually do the work of building those institutions. How do we build countervailing powers? How do we build institutions that correctly regulate new emergent technologies? How do we create a better democracy? RTWT

For decades, diplomats and international policymakers have treated technology as a “sectoral” matter best left to energy, finance, or defense ministries, note analysts Manuel Muniz and Samir Saran. But the sudden arrival of groundbreaking AI tools has created an urgent need for a more holistic and global approach to tech governance.

As governments acquire unprecedented abilities to manufacture consent and manipulate opinion, it is in democracies’ interest to develop a common approach to AI regulation, they write for Project Syndicate.

“Artificial intelligence and other digital tools are changing how governments operate,” said Christopher Walker, Vice President for Studies and Analysis at the National Endowment for Democracy (NED). “Digital technologies can be powerful tools for holding public officials accountable. But without transparency and public oversight, government collection and processing of digital data has the potential to weaken trust in government and erode democratic norms over the long term,” he adds, introducing a new essay series (above) from the NED’s International Forum for Democratic Studies.

A pioneering free, online training course is designed to help newsrooms address a defining issue of the modern age: the ethical use of AI, the Thomson Foundation reports. Developed with leading industry experts, AI in the newsroom: the ethical approach is a unique NED-funded course aimed at guiding senior journalists on the ethical policies to adopt to ensure their brand remains a trusted source of news.

“When I was writing the course, barely a day went by without a headline warning of the potential risks to society from AI,” says Catherine Mackie, who’s an editorial associate with Thomson. Expert input also comes from Sabrina Argoub, a Programme Manager for Journalism AI, a global initiative that empowers news organisations to use AI. It’s a project of Polis, the journalism think tank at the London School of Economics.

Credit: Thomson Foundation

Upskilling is also crucial to the future of democracy, MIT Professor Acemoglu, co-author of The Narrow Corridor, tells The FT.

“We have to be able to empower and increase capabilities among a diverse group of workers” — including those left behind by several decades of technology-driven worker displacement, the kind of people prone to the “deaths of despair” that economists Angus Deaton and Anne Case have written about.
