
Deepfakes are evolving. This company wants to catch them all


Some Fortune 500 companies have begun testing software that can detect a deepfake of a real person on a live video call, following a series of scams involving fraudulent job applicants who take a signing bonus and disappear.

The detection technology comes courtesy of Get Real Labs, a startup founded by Hany Farid, a professor at UC Berkeley and a recognized authority on deepfakes and image and video manipulation.

Get Real Labs has developed a suite of tools to detect images, audio, and video generated or manipulated with artificial intelligence or by manual methods. The company’s software can analyze the face in a video call and spot clues that indicate it has been artificially generated and swapped onto the body of a real person.

“These are not hypothetical attacks; we are hearing about them more and more,” says Farid. “In some cases, it seems they are trying to obtain intellectual property by infiltrating the company. In other cases, it seems purely financial: they just take the signing bonus.”

The FBI issued a warning in 2022 about fake job seekers who assume the identity of a real person during video calls. Arup, a UK-based design and engineering company, lost $25 million after scammers used a deepfake to pose as the company’s CFO on a video call. Romance scammers have also embraced the technology, cheating unsuspecting victims out of their savings.

Impersonating a real person in a live video is just one example of the kind of deception that AI now makes possible. Large language models can convincingly imitate a real person in an online chat, while tools like OpenAI’s Sora can generate short videos. Impressive AI advances in recent years have made deepfakery more convincing and more accessible. Free software makes it easy to hone deepfake skills, and readily available AI tools can turn text prompts into realistic-looking photos and videos.

But impersonating a person on a live video call is a relatively new frontier. Creating this type of deepfake typically involves combining machine learning with facial-tracking algorithms to seamlessly graft a fake face onto a real one, allowing an intruder to control what their illicit likeness appears to say and do on screen.
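The face-grafting step can be pictured with a deliberately naive, single-frame sketch. The Python snippet below uses OpenCV’s stock face detector and Poisson blending to paste one face over another; it is only an illustration of the basic mechanics, not how commercial deepfake tools or Get Real Labs’ detector work, and the file names are hypothetical placeholders.

```python
# Illustrative only: a crude single-frame face swap (detect a face, fit a
# replacement over it, blend the seam). Real live-call deepfakes rely on
# learned face generators and per-frame landmark tracking.
import cv2
import numpy as np

target = cv2.imread("real_frame.jpg")     # a frame from the genuine video feed (hypothetical file)
donor = cv2.imread("impostor_face.jpg")   # the face to superimpose (hypothetical file)

# Classic Haar-cascade face detector shipped with OpenCV
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
gray = cv2.cvtColor(target, cv2.COLOR_BGR2GRAY)
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

if len(faces) > 0:
    x, y, w, h = faces[0]
    patch = cv2.resize(donor, (w, h))                # crude "alignment" by resizing
    mask = 255 * np.ones(patch.shape, patch.dtype)   # blend the entire patch
    center = (x + w // 2, y + h // 2)
    # Poisson blending hides the seam between the pasted face and the real frame
    swapped = cv2.seamlessClone(patch, target, mask, center, cv2.NORMAL_CLONE)
    cv2.imwrite("swapped_frame.jpg", swapped)
```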

Farid showed WIRED a demonstration of Get Real Labs’ technology. When a photograph of a corporate boardroom is loaded, the software analyzes the metadata associated with the image for signs that it has been modified. Several major AI companies, including OpenAI, Google, and Meta, now add digital signatures to AI-generated images, which provides a reliable way to confirm that an image is not authentic. However, not all tools add such markers, and open-source image generators can be configured to omit them. Metadata can also be easily manipulated.
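The metadata check is the simplest of these analyses to picture. A minimal sketch, assuming the Pillow library and a hypothetical local file named "boardroom.jpg", just dumps the EXIF tags and other embedded fields so an analyst can look for a generator’s software tag or a missing camera signature; verifying cryptographic provenance marks such as C2PA Content Credentials requires dedicated tooling beyond this.

```python
# A minimal sketch of a metadata inspection. It only surfaces EXIF tags and
# embedded text fields for human review; it does not verify cryptographic
# provenance signatures.
from PIL import Image
from PIL.ExifTags import TAGS

img = Image.open("boardroom.jpg")  # hypothetical input image

# EXIF tags: camera make/model, editing software, timestamps, and so on
for tag_id, value in img.getexif().items():
    print(f"{TAGS.get(tag_id, tag_id)}: {value}")

# Other embedded metadata (e.g. PNG text chunks) sometimes names the generator outright
for key, value in img.info.items():
    print(f"{key}: {value}")
```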

GIF: Will Knight
