Fuzzy chins, weird hands, unreliable numbers: seven signs you’re watching a deepfake

In a crucial election year for the world, with the United Kingdom, the United States and France among the countries that will go to the polls, misinformation is circulating on social media.

There is much concern about deepfakes, meaning AI-generated images, video or audio of prominent political figures designed to mislead voters, and whether they will affect election results.

They have not been a major feature of UK elections so far, but there has been a steady supply of examples from around the world, including in the US, where a presidential election is looming.

Here are the visual elements to keep in mind.

Oddity around the mouth or chin

In deepfake videos, the area around the mouth can be the biggest giveaway. There may be fewer wrinkles in the skin, less detail around the mouth, or the chin may look blurry or smudged. Poor synchronization between a person’s voice and mouth can be another sign.

This deepfake video posted on June 17 shows a simulation of Nigel Farage destroying Rishi Sunak’s house in Minecraft. It is part of a satirical deepfake trend of videos showing politicians playing the online game.

A couple of days later, another simulated video appeared showing Keir Starmer playing Minecraft and setting a trap in “Nigel’s pub”.

Dr Mhairi Aitken, an ethics researcher at the Alan Turing Institute, the UK’s national institute for artificial intelligence, says the first sign that the Minecraft videos are deepfakes is, of course, “the ridiculousness of the situation”. But another sign that media has been AI-generated or manipulated is imperfect synchronization between the voice and the mouth.

“This is particularly clear in the segment where Farage speaks,” says Aitken.

Another clue, Aitken says, is whether shadows fall in the right places and whether the lines and wrinkles on the face move when you would expect them to.

Ardi Janjeva, a research associate at the institute, adds that a video’s low resolution is another obvious sign to look out for because “it seems somewhat improvised”. He says people are familiar with this amateurish quality from the prevalence of “low-resolution, rudimentary, fraudulent email attempts”.

This low-fidelity approach then manifests itself in obvious areas like the mouth and jaw, he says. “It’s noticeable in facial features like the mouth, where viewers tend to focus their attention, where there is excess blurring and smearing.”
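
For readers who want to go beyond eyeballing, the blurring Janjeva describes can be roughly quantified. Below is a minimal sketch (not a tool used by the researchers quoted here) that compares local sharpness in a mouth/chin crop against the whole frame using the variance of the Laplacian, a common blur heuristic; the filename, crop coordinates and threshold are placeholder assumptions, and a real tool would locate the mouth with a face-landmark detector.

```python
# A rough sketch, not the researchers' tooling: compare sharpness of a
# mouth/chin crop with the whole frame. Heavily smeared regions score low
# on the variance of the Laplacian. Filename, crop coordinates and the
# threshold are hypothetical placeholders.
import cv2

def laplacian_sharpness(gray_region):
    # Variance of the Laplacian is a standard proxy for local sharpness.
    return cv2.Laplacian(gray_region, cv2.CV_64F).var()

frame = cv2.imread("frame.png")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

mouth = gray[300:380, 220:420]  # hypothetical mouth/chin crop; a real tool
                                # would use a face-landmark detector here
ratio = laplacian_sharpness(mouth) / laplacian_sharpness(gray)
if ratio < 0.5:  # arbitrary threshold for illustration
    print("Mouth/chin region is markedly blurrier than the rest of the frame")
```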

Strange elements of speech

Another deepfake video, of Keir Starmer appearing to sell an investment scheme, was made by editing new audio over video of his 2023 new year address.

If you listen carefully, you’ll notice that the sentence structure is odd, and Starmer says “pounds” before the number several times, for example “pounds 35,000 a month.”

Aitken says that, again, the voice and mouth are not synchronized and the lower part of the face is blurred. The use of “pounds” before a number indicates that a text-to-audio tool was likely used to recreate Starmer’s voice, she adds.

“This is probably an indication that a tool has been used to convert written words into speech, without checking that the output reflects typical spoken-word patterns,” she says. “There are also some clues in the intonation, which maintains a fairly monotonous rhythm and pattern throughout. To check the authenticity of a video, it is a good idea to compare the voice, gestures and expressions with genuine recordings of the individual to see if they are consistent.”
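
To illustrate why a text-to-speech pipeline might say “pounds 35,000”, here is a toy sketch (not any specific TTS product) contrasting a naive front end that expands the “£” symbol in place, left to right, with one that reorders the currency word the way a speaker would:

```python
# A toy illustration (not any specific TTS product) of how the
# "pounds 35,000" artifact can arise: a naive front end expands the "£"
# symbol in place, while a careful one reorders it as a speaker would.
import re

def naive_normalize(text: str) -> str:
    # Left-to-right expansion: "£35,000" becomes "pounds 35,000".
    return text.replace("£", "pounds ")

def spoken_normalize(text: str) -> str:
    # Move the currency word after the amount: "£35,000" -> "35,000 pounds".
    return re.sub(r"£([\d,]+)", r"\1 pounds", text)

script = "You could earn £35,000 a month."
print(naive_normalize(script))   # You could earn pounds 35,000 a month.
print(spoken_normalize(script))  # You could earn 35,000 pounds a month.
```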

Consistency between face and body

This deepfake video of the Ukrainian president, Volodymyr Zelenskiy, calling on civilians to lay down their arms before the Russian military circulated in March 2022. The head is disproportionately large for the body, and there is a difference in skin tone between the neck and the face.

Hany Farid, a professor at the University of California, Berkeley, and a specialist in deepfake detection, says this is an “old-school deepfake”. The motionless body is a giveaway, he says. “The telltale sign of this so-called puppeteer deepfake is that the body below the neck does not move.”
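
As an illustration of the cue Farid describes, the sketch below (a rough heuristic, not his method) compares frame-to-frame motion in the head region against the region below the neck using simple frame differencing; the filename, region split and threshold are placeholder assumptions.

```python
# A rough heuristic sketch of the "puppeteer" cue: measure frame-to-frame
# change in the head band of the frame versus everything below it.
# "clip.mp4", the row split and the threshold are hypothetical placeholders.
import cv2
import numpy as np

cap = cv2.VideoCapture("clip.mp4")
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

head_motion = body_motion = 0.0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)
    head_motion += float(np.mean(diff[:250, :]))   # assumed head region
    body_motion += float(np.mean(diff[250:, :]))   # everything below the neck
    prev = gray

cap.release()
if body_motion < 0.1 * head_motion:  # arbitrary ratio for illustration
    print("Almost no motion below the neck: consistent with a puppeteer fake")
```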

Discontinuity throughout the video clip

This video, which circulated in May 2024, falsely shows the US state department spokesperson Matthew Miller justifying Ukrainian military strikes on the Russian city of Belgorod by telling a journalist that “there are virtually no civilians left in Belgorod”. The video was tweeted by the Russian embassy in South Africa and later deleted, according to a BBC journalist.

The fake video shows the spokesperson’s tie and shirt changing color partway through the clip.

While this is a relatively noticeable change, Farid notes that the generative AI landscape is changing rapidly and, therefore, so are deepfake indicators. “We also need to always practice good information consumption habits that include a combination of good common sense and a healthy dose of skepticism when presented with particularly outrageous or improbable claims,” he says.
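
One crude way to surface a discontinuity like the changing tie is to track the average color of a fixed patch across frames and flag abrupt jumps. The sketch below is illustrative only; the filename, patch coordinates and threshold are assumptions, and a real check would track the region as the subject moves.

```python
# An illustrative check, not a production detector: sample the average
# color of a fixed patch (say, over the tie) in every frame and flag
# abrupt jumps. Filename, patch coordinates and threshold are placeholders.
import cv2
import numpy as np

cap = cv2.VideoCapture("clip.mp4")
prev_color = None
frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    patch = frame[400:450, 300:350]            # hypothetical tie region
    color = patch.reshape(-1, 3).mean(axis=0)  # mean BGR of the patch
    if prev_color is not None and np.linalg.norm(color - prev_color) > 40:
        print(f"Sudden color change at frame {frame_idx}")
    prev_color = color
    frame_idx += 1
cap.release()
```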

Extra fingers, hands and limbs

Look out for an excess of strange-looking fingers, legs, arms, and hands in AI-generated still images.

In April 2023, a photograph circulated on Twitter purportedly showing the US president, Joe Biden, and vice-president, Kamala Harris, celebrating Donald Trump’s indictment.

A doctored photo of Kamala Harris and Joe Biden hugging in the Oval Office.

Signs that it could have been AI-generated: Kamala Harris’s right hand has six fingers, the top of the flag is distorted, and the pattern on the ground is crooked.

The team at Reality Defender, a deepfake detection company, says that prompts entered into image-generation tools often focus on people (frequently the names of well-known figures), producing results that emphasize faces. As a result, the artifice is often revealed in other details, such as hands or the physical background, as in the Biden-Harris image.

“Typically, prompts for creating these kinds of images place greater emphasis on the people in them, particularly the faces,” explains the Reality Defender team. “As a result, the outputs often render believable human faces with high-frequency detail, while deprioritizing the physical consistency of the background (or, in many cases, other body parts, such as hands).”

However, Reality Defender, which builds deepfake detection tools, says the increasing sophistication of generative AI programs means manual scrutiny of deepfakes is becoming “decidedly less reliable”.

Garbled letters and numbers

AI image generators have difficulty reproducing numbers and text. This fake mugshot of Trump, published in April 2023, was created with such a tool. You can see that in the background, instead of a height chart, there is a meaningless jumble of numbers and letters.

“Numbers and text in the background are a giveaway,” says Aitken. “AI image generators have a very difficult time producing text or numbers. They don’t understand the meaning of the symbols they produce, so they often produce garbled or illegible text and numbers. If there is text or numbers in an image, zooming in on them can be a really good way to identify whether they may be AI-generated.”

A doctored photograph of Donald Trump standing in front of a height chart, with yellow boxes highlighting the garbled text.
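
Aitken’s zooming advice is easy to follow with a few lines of code: crop the suspect region and upscale it for inspection. The filename and crop box below are placeholders for whatever image you are checking.

```python
# A minimal sketch: crop a suspect text/number region and enlarge it so
# garbled glyphs are easier to spot. "suspect.png" and the crop box are
# hypothetical placeholders.
from PIL import Image

img = Image.open("suspect.png")
crop = img.crop((500, 100, 700, 400))  # (left, upper, right, lower), placeholder values
zoomed = crop.resize((crop.width * 4, crop.height * 4), Image.LANCZOS)
zoomed.save("zoomed_text.png")  # open and inspect for jumbled characters
```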

Choppy video edits

Some manipulated images are put together so amateurishly that they are easy to spot. Known as “cheapfakes,” these typically use simple video editing software and other low-fidelity techniques.

Just before the Mexican elections, a video of the then presidential candidate Claudia Sheinbaum was edited to show her saying she would close churches if elected. The clip was deceptively spliced together from a video in which she actually said: “They’re saying, imagine the lie, that we’re going to close churches.” A different background featuring satanic symbols was also added in an attempt to make the clip even more damaging.
