AI’s election year wasn’t exactly what everyone expected


Much of the AI-generated content was used to express support for, or fandom of, certain candidates. For example, an AI-generated video of Donald Trump and Elon Musk dancing to the Bee Gees song “Stayin’ Alive” was shared millions of times on social media, including by Sen. Mike Lee, R-Utah.

“It’s about social signaling. It’s all the reasons people share these things. It’s not AI,” says Bruce Schneier, a public interest technologist and professor at the Harvard Kennedy School. “We are seeing the effects of a polarized electorate. It’s not like we’ve had perfect elections throughout our history and now all of a sudden AI comes along and it’s all misinformation.”

That’s not to say there weren’t misleading deepfakes that spread during this election. In the days leading up to the elections in Bangladesh, for example, deepfakes circulated online encouraging supporters of one of the country’s political parties to boycott the vote. Sam Gregory, program director at the nonprofit Witness, which helps people use technology to support human rights and runs a rapid-response detection program for civil society organizations and journalists, says his team saw an increase in deepfake cases this year.

“In multiple electoral contexts,” he says, “there have been examples of misleading or confusing uses of synthetic media in audio, video, and image formats that confused journalists or that they were unable to fully verify or challenge.” What this reveals, he says, is that the tools and systems that currently exist for detecting AI-generated media are still lagging behind the pace at which the technology is developing. In places outside the United States and Western Europe, these detection tools are even less reliable.

“Fortunately, AI was not used at scale in most elections, or not in a decisive way, but it is very clear that there is a gap in detection tools, and in access to them, for the people who need them most,” says Gregory. “This is not the time for complacency.”

The mere existence of synthetic media, he says, has meant that politicians have been able to claim that real media is fake, a phenomenon known as the “liar’s dividend.” In August, Donald Trump alleged that images showing large crowds of people attending rallies for Vice President Kamala Harris were AI-generated. (They weren’t.) Gregory says that in an analysis of all reports to the Witness rapid-response team, about a third of the cases involved politicians using AI to deny evidence of a real event, many of them leaked conversations.