In 1996, when I was 10, I played a computer game at a friend's house called Spycraft: The Great Game. In the game, you play as a CIA agent investigating a murder plot; to deceive a suspect during an interrogation, you have the option of doctoring a photo. The process blew my 10-year-old mind, so much so that I've remembered how powerful that mini-game felt all these years later. Though blurry and grainy, the photo editor that appeared in Spycraft looked a bit like what Adobe Photoshop would one day become. In 1996, it felt like the stuff of high-tech espionage and deceit. In 2023, it's completely commonplace. Changing a photo isn't difficult or expensive anymore. Anyone can do it, and as a result we've all come to accept that we can't trust any image we see.
Deepfake technology has already proven that we cannot trust video or audio recordings either. And the prevalence of generative artificial intelligence has made creating such deepfakes even easier. We all need to get used to this new reality – and fast.
Genna Bain, the wife of the late YouTuber John "TotalBiscuit" Bain, posted on Twitter last week about a new concern she faces due to the advancement of AI technology: "Today was fun. Facing making a choice to scrub all of my late husband's life worth of content from the internet. Apparently people think it's okay to use his library to train voice AIs to promote their social commentary and political views." In response, she received sympathy, along with pleas from her husband's fans to preserve his online legacy.
But here's the problem: there is no practical way that Genna Bain, or anyone in her position, can prevent people from making a deepfake video or audio clip of John Bain. Only a few minutes of audio are needed to train an AI to mimic a voice; for a video deepfake, you mainly need footage showing a variety of facial expressions and angles. So if you want to avoid ever appearing in a deepfake, you would have to delete every visual and auditory record of your existence, which is practically impossible for anyone who has ever used a smartphone. That's even more true for a public figure like Bain, who appeared on shows and podcasts his wife can't necessarily delete, and whose face and voice are forever stored on the hard drives of his fans around the world.
In the 1990s and 2000s, Photoshop made it possible for people to paste celebrity faces onto other people's naked bodies, and in 2018, the public learned how AI technology could be used to create video pornography that appeared to depict celebrities. Since then, the technology has only become more accessible: googling "free deepfake app" yields plenty of options for editing software. In 2023, this technology is likely still being used to make celebrity porn, just as it was in 2018, but these days people are also using it to make celebrities appear to say outrageous things. The internet has always run on porn, but it also runs on memes, so this tracks.
If you become famous enough, you will be dehumanized and objectified in this way, and your own fans will be surprised and confused if you object. You can't stop it, either. But this is not an article in which I try to convince people to feel sorry for famous people. (That's a losing battle too, though I do fight it at times.) Instead, this is an article in which I try to convince people not to trust the video and audio they see and hear.
We've all had a long time to get used to Photoshop's existence. Even so, a cleverly faked image can still mislead an intelligent, reasonable person into believing something that isn't true. It's human nature to want to believe in something that looks real; after all, seeing is believing, right? That said, we lived through the rise of Photoshop, so I think we'll adapt to this as well.
I don't know what the future holds, or what kinds of regulation we will need to address this situation. But one thing I do know: it's already here. We live in a world where this kind of counterfeiting is not only possible but ridiculously easy. We must accept that it's here and move on to a reality in which our skepticism extends to even more kinds of deceit.
But hey, at least the memes will be awesome.