Wednesday, September 27, 2023

Humans Struggle to Detect AI-Generated Speech, Even with Training, Scientists Discover


1. Unnatural eye movement. Eye movements that don’t seem natural, or a lack of eye movement, such as no eye blinking, are big red flags. It’s a challenge to replicate the act of blinking in a way that seems natural. It’s also a challenge to replicate the eye movements of a real person. That’s because someone’s eyes usually follow the person they’re talking to.

2. Unnatural facial expressions. When something doesn’t look right on a face, it could indicate a face swap, where one image has been stitched over another.

3. Awkward positioning of facial features. If someone’s face is pointing to one side and their nose is pointing the other way, you should be skeptical about the authenticity of the video.

4. Lack of emotion. Another sign of what’s known as ‘facial morphing’, or an image splice, is when someone’s face doesn’t show the emotion that should accompany what they’re supposedly saying.

5. Uncomfortable looking body or posture. Another sign is if a person’s body shape does not look natural, or if there is an awkward or inconsistent position of the head and body. This may be one of the easiest inconsistencies to spot, because deepfake technology typically targets facial features rather than the entire body.

6. Unnatural body movement. If someone looks distorted or off when turning to the side or moving their head, or their movements are jerky and disjointed from one frame to the next, you should suspect that the video is fake.

7. Unnatural coloring. Unusual skin tone, discoloration, strange lighting, and out-of-place shadows are all signs that what you’re seeing is probably fake.

8. Hair that doesn’t look real. You won’t see frizzy or flyaway hair. Why? Fake images can’t generate these individual characteristics.

9. Teeth that don’t look real. Algorithms may not be able to generate individual teeth, so the absence of individual tooth outlines could be a clue.

10. Blurring or misalignment. If the edges of the images are blurry or the images are misaligned, for example where someone’s face and neck meet their body, you know something is wrong.

11. Noise or inconsistent audio. Deepfake creators often spend more time on video footage than on audio. The result can be poor lip syncing, robotic-sounding voices, strange pronunciation of words, digital background noise, or even no audio at all.

12. Images that look unnatural when slowed down. If you’re watching a video on a screen that’s larger than your smartphone or have video editing software that can slow down video playback, you can zoom in and examine the images more closely. Zooming in on the lips, for example, will help you see if they’re really talking or if the lip sync is bad.

13. Label discrepancies. There are cryptographic algorithms that help video creators prove their videos are authentic. The algorithm embeds hashes at set points throughout a video; if those hashes have changed, you should suspect that the video has been manipulated.
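The idea behind embedded hashes can be sketched in a few lines. This is a minimal illustration, not any specific product’s algorithm: the file is split into fixed-size segments, each segment is hashed, and any edit changes the hash of the segment it touches.

```python
import hashlib

def segment_hashes(video_bytes: bytes, segment_size: int = 4096) -> list[str]:
    """Hash fixed-size segments of a file. Editing any byte changes
    the hash of the segment containing it, revealing where the
    manipulation occurred."""
    return [
        hashlib.sha256(video_bytes[i:i + segment_size]).hexdigest()
        for i in range(0, len(video_bytes), segment_size)
    ]

# Simulated video data (real tools would hash actual frame data).
original = b"frame data " * 1000
tampered = bytearray(original)
tampered[5000] ^= 0xFF  # flip one bit, simulating manipulation

a = segment_hashes(original)
b = segment_hashes(bytes(tampered))
changed = [i for i, (x, y) in enumerate(zip(a, b)) if x != y]
print(changed)  # → [1]: only the segment containing the edited byte differs
```

Comparing the stored hashes against freshly computed ones pinpoints which segment, if any, was altered after publication.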

14. Digital fingerprints. Blockchain technology can also create a fingerprint for videos. While not foolproof, this blockchain-based verification can help establish a video’s authenticity. Here’s how it works: when a video is created, its content is recorded in a ledger that cannot be changed, which can later be used to prove the video is authentic.
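The “ledger that cannot be changed” works by chaining hashes: each entry’s hash covers the previous entry’s hash, so altering any recorded video invalidates every entry after it. A minimal sketch of that idea, with made-up clip data for illustration:

```python
import hashlib

def chain_entry(prev_hash: str, content: bytes) -> str:
    """Hash an entry together with the previous entry's hash,
    linking the ledger into a tamper-evident chain."""
    return hashlib.sha256(prev_hash.encode() + content).hexdigest()

# Build a ledger from three hypothetical video clips.
ledger = ["0" * 64]  # genesis entry
for clip in [b"video-1", b"video-2", b"video-3"]:
    ledger.append(chain_entry(ledger[-1], clip))

def verify(ledger: list[str], clips: list[bytes]) -> bool:
    """Recompute the chain from the clips; any altered clip breaks it."""
    h = "0" * 64
    for entry, clip in zip(ledger[1:], clips):
        h = chain_entry(h, clip)
        if h != entry:
            return False
    return True

print(verify(ledger, [b"video-1", b"video-2", b"video-3"]))  # → True
print(verify(ledger, [b"video-1", b"ALTERED", b"video-3"]))  # → False
```

Because each entry depends on everything before it, a forger would have to rewrite the entire chain from the altered clip onward, which a distributed ledger is designed to prevent.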

15. Reverse image searches. A reverse image search can uncover similar images and videos online, helping you determine whether an image, audio clip, or video has been altered in any way. While reverse video lookup technology is not publicly available yet, investing in a tool like this could come in handy.
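Reverse image search engines typically rely on perceptual hashing: similar images produce similar hashes, so a small Hamming distance between hashes suggests one image derives from the other. A toy “average hash” sketch on a 4×4 grayscale grid (real systems use larger images and more robust hashes):

```python
def average_hash(pixels: list[list[int]]) -> list[int]:
    """Perceptual 'average hash': one bit per pixel, set when the
    pixel is brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a: list[int], b: list[int]) -> int:
    """Count differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

original = [[10, 200, 30, 220],
            [15, 210, 25, 230],
            [12, 205, 28, 225],
            [11, 215, 27, 235]]
# A lightly altered copy (e.g. brightness-shifted on re-upload).
altered = [[p + 5 for p in row] for row in original]
# An unrelated image.
other = [[(r * 4 + c) * 16 for c in range(4)] for r in range(4)]

print(hamming(average_hash(original), average_hash(altered)))  # → 0
print(hamming(average_hash(original), average_hash(other)))    # → 8
```

The altered copy hashes identically because every pixel shifted by the same amount, while the unrelated image lands far away, which is what lets a search engine match a manipulated repost back to its source.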
