
‘Inceptionism’ and Balenciaga popes: a brief history of deepfakes

by Elijah

Attention to manipulated or manipulative media is always high around election cycles, but 2024 will be different for two reasons: deepfakes created by artificial intelligence (AI) and the sheer number of polls.

The term deepfake refers to a hoax that uses AI to create a false image, usually a fake video of a person, with the effect often exacerbated by a voice component. With around half the world’s population holding major elections this year – including India, the US, the EU and most likely the UK – the technology has the potential to be highly disruptive.

Here’s a guide to some of the most effective deepfakes of recent years, including the first attempts at creating fake images.

Image from Google’s blog post called Inceptionism. Photo: Google

DeepDream’s banana 2015

It all started with a banana. In 2015, Google published a blog post about what it called “inceptionism”, a technique that soon became better known as “DeepDream”. In it, engineers from the company’s photography team asked a simple question: what happens if you take the AI systems Google had developed to label images – known as neural networks – and ask them to create images instead?

“Neural networks trained to distinguish between different types of images also possess a fair amount of the information needed to generate images,” the team wrote. The resulting hallucinatory dreamscapes were hardly hi-fi, but they showed the promise of the approach.
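For the curious, the core trick can be sketched in a few lines of code: instead of adjusting the network’s weights, you run gradient ascent on the pixels of an input image so that it more strongly activates a layer of a pretrained classifier. The snippet below is only an illustrative sketch in PyTorch, not Google’s original implementation; the choice of network (torchvision’s GoogLeNet), layer (inception4c), file names and step size are all assumptions made for the example.

```python
# Illustrative DeepDream-style sketch (not Google's original code): run
# gradient ascent on an input image so it amplifies whatever features a
# pretrained classifier already responds to.
import torch
from torchvision import models, transforms as T
from PIL import Image

model = models.googlenet(weights="DEFAULT").eval()  # Inception-style classifier

# Capture activations from an intermediate layer (the layer choice is arbitrary here).
acts = {}
model.inception4c.register_forward_hook(lambda mod, inp, out: acts.update(value=out))

img = Image.open("banana.jpg").convert("RGB")        # hypothetical input image
x = T.Compose([T.Resize(224), T.ToTensor()])(img).unsqueeze(0).requires_grad_(True)

for _ in range(20):                                  # a few gradient-ascent steps
    model(x)
    loss = acts["value"].norm()                      # "show me more of what you see"
    loss.backward()
    with torch.no_grad():
        x += 0.01 * x.grad / (x.grad.abs().mean() + 1e-8)  # normalised ascent step
        x.grad.zero_()

T.ToPILImage()(x.detach().squeeze(0).clamp(0, 1)).save("dream.jpg")
```

Repeating this loop at several image scales, as Google’s blog post describes, is what produces the characteristic dream-like textures.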

Celebrity face swaps 2017

It is difficult to generate images completely from scratch, but using AI to alter pre-existing photos and videos is a little easier. In 2017, the technology was groundbreaking but demanding: it required a powerful computer, a lot of footage to learn from, and the time and resources to master tools that weren’t user-friendly.

But one use case fell squarely within those boundaries: swapping female celebrities’ faces into pornography. In late 2017, such explicit clips were being created and traded at a remarkable pace, initially on Reddit, and then, as word spread and anger grew, on darker, more hidden forums.

According to a briefing note from the think tank Labour Together, “one in three deepfake tools let users create deepfake pornography in less than 25 minutes, at no cost.”

Fake video of Obama made by Jordan Peele. Photo: Jordan Peele

Jordan Peele/Obama video 2018

Where pornography led, the rest of the media followed, and by mid-2018 face-swapping tools had improved to the point that they could be used as a creative tool in their own right. In one such video, produced by BuzzFeed, the actor and director Jordan Peele did an impression of Barack Obama that was turned into apparently real footage of the former president himself, ending with an exhortation to “stay woke, bitches.”

Armchairs in the shape of an avocado generated by Dall-E. Photo: Dall-E/Twitter

Dall-E’s avocado armchair 2021

In 2021, OpenAI released Dall-E and face-swapping became old news. The first major image generator, Dall-E, offered the science fiction promise of typing in a sentence and pulling out a photo. Sure, those photos weren’t particularly good, but they were images that had never existed before — not simply remixed versions of previous photos, but entirely new things.

The first version of Dall-E wasn’t great at photorealism: the demo images OpenAI picked to show off the system contained just one vaguely realistic photoset, a series of pictures of an eerily whiskerless cat. But it was already promising for more figurative art, such as the avocado armchair.

Zelensky’s deepfake, from March 2022

Zelensky orders his troops to surrender 2022/2023

Within a month of Russia’s invasion of Ukraine, an amateur deepfake emerged in which President Volodymyr Zelensky called on his soldiers to lay down their weapons and return to their families. It was of poor quality, but prompted the real Ukrainian president to hit back on his Instagram account, telling Russian soldiers to return home instead.

But the deepfake Zelensky came back a year later, in a clip that underlined how much the technology had improved. It again encouraged Ukrainian soldiers to surrender, and it was more convincing.

NewsGuard, an organization that tracks disinformation, says comparing the two videos shows how far technology has advanced in a short time.

McKenzie Sadeghi, the AI and foreign influence editor at NewsGuard, said the November 2023 clip “marked a significant leap forward in deepfake technology since the outbreak of war between Russia and Ukraine in February 2022, and reflects how manipulated content has become more persuasive as technology advances.”

Sadeghi said the movements in the 2023 deepfake are “smooth and natural”, and the mouth movements better match the spoken words.

An AI-generated image of Pope Francis wearing a padded jacket. Photo: Reddit

The Pope in a padded jacket 2023

An image of Pope Francis, apparently wearing a Balenciaga quilted jacket, was a milestone in generative AI and deepfakery. Created by the Midjourney image creation tool, it quickly went viral for its stunning realism.

“The pope image showed us what generative AI is capable of, and how quickly this content can spread online,” said Hany Farid, a professor at the University of California, Berkeley, and a specialist in deepfake detection.

Although the pope image was shared for its combination of realism and absurdity, it underlined the creative power of now widely accessible AI systems.

An image of Trump and black voters has been circulating on social media since March 2024. Photo: @Trump_History45

Trump with black voters 2024

This year, an AI-generated image emerged of Donald Trump posing with a group of black men on a doorstep; another shows him posing with a group of black women. The former president, who will face Joe Biden in the 2024 presidential election, is a popular subject of deepfakes – as is his opponent.

“The image of Trump with black supporters shows us how the visual record can be, and is being, polluted with fake images,” said Farid.

In a blog that collects political deepfakes, Farid said the image “appears to be an attempt to woo black voters”, and he expressed general concern that deepfakes will be “used in politics” this year.

Screenshot of a fake video of Joe Biden talking about Kiev. Photo: Guardtech

Joe Biden robocalls 2024

There are countless examples of manipulative use of the American president’s image and voice. In January, a faked version of Biden’s voice was used to encourage Democrats not to vote in the New Hampshire primary, even deploying the Biden-esque phrase “what a bunch of malarkey”. Steve Kramer, a political operative, admitted he was behind the hoax calls. Kramer worked for Biden’s challenger, Dean Phillips, whose supporters had experimented with AI by creating a short-lived Phillips voice bot. Phillips’ campaign said the challenger had nothing to do with the calls, and Kramer has said he made them to highlight the dangers of AI in elections.
