By Taruni Kumar
“Computers driving our cars, beating humans at Go. Nonsense! We all know what they are for. A.I. is for PORN!” This is the only explanation available on the home page of deepmindy.com, a website to which you can upload the image of a person and find similar faces in pornographic photos. You can upload your own (if that’s what you’re into), a friend’s, a celebrity’s or just about anybody’s at all, and artificial intelligence (AI) will find you the one to fap to. Fun fact: Deep Mindy is a play on Google’s DeepMind AI company, which describes itself on its website as “the world leader in artificial intelligence research and its application for positive impact.”
Well, positive impact is definitely not what AI is being used for at the moment. As creepy as it is that Deep Mindy lets you find a porn star who looks like your desired sexual partner, these are still photos of women who consented to being photographed (unless their photos were leaked online, which is a whole other vile revenge-porn genre). But what’s creepier than Deep Mindy is the new genre of AI-made fake porn videos doing the rounds of the interwebz, which have the faces of celebrities morphed onto the bodies of actual porn actors. It’s better than an average Photoshop job, and it plays out in actual video, not just static images. Sure, there’s the occasional glitch and a weird uncanny-valley effect, but for the most part, it’s scarily accurate.
Let’s go back to the very beginning of this phenomenon. In December 2017, tech website Motherboard posted an article about the first AI-assisted porno, made by a Reddit user who called himself ‘deepfakes’. The video was an incest-themed porno starring a woman with Wonder Woman star Gal Gadot’s face. Aside from some glitches here and there, it was a scarily convincing video. As is the way of the Internet, deepfakes’ Reddit handle became a noun in its own right, and as more people began to use AI to morph porn, the videos came to be called deepfakes.
And before anybody says, “yeh western phenomenon hai, hamara Indian culture nahin” (“this is a western phenomenon, not our Indian culture”), there are deepfakes of Deepika Padukone, Priyanka Chopra and Shraddha Kapoor on PornHub already. And I assure you, they’re incredibly disturbing.
What’s even more disturbing is that the technology behind these deepfakes isn’t coming from a sophisticated tech lab with insane hardware. Deepfakes are being made by regular people with slightly-more-powerful-than-average computers, using open source (i.e., freely available) tools from the Internet, including Google’s own TensorFlow machine learning framework. In fact, a Reddit user called deepfakeapp even built, as his handle suggests, a deepfake app that anybody can use without the technical know-how of TensorFlow or Python or any of the other tools required to create deepfake videos.
As a fantastic headline from Quartz sums it up, “Google gave the world powerful AI tools and the world made porn with them”. Then again, this ain’t exactly new: the internet’s most cutting-edge innovations, from payment gateways to streaming video, have all come out of the porn industry.
In keeping with how the world usually works, most of these deepfake porn videos are of famous women. There’s at least one deepfake of a man, Nicolas Cage, but that’s not a porn video; it’s a compilation of movie scenes with his face morphed into them. Really, deepfakes are just another way for men to fulfil their fantasies of taking control of women’s agency.
Let’s be clear: I am not doing haaw, sex! I am not anti-porn. I just like my pro-porn and pro-pleasure positions to come with a pro-consent one. And deepfakes have zero consensual element whatsoever. The celebrity women whose faces are being used for these videos did not consent to them, and the actors in the videos who did consent are being erased. This hasn’t gone unnoticed: in February, Reddit banned deepfakes and PornHub started taking down videos flagged as deepfakes. But once something is on the World Wide Web, it’s nearly impossible to get rid of it.
But here’s the thing. While deepfakes are awful and really need to just go away, the technology behind the videos may have some interesting implications for the world of media. Here’s an example:
In this scene from Star Wars: Rogue One, the top frame has the original CGI version of a young Princess Leia, while the bottom frame is an almost-as-good but much cheaper and faster version made using deepfake tech.
So, the question is: is all AI-assisted video bad, or just the deepfake pornos? The answer isn’t so simple, because in these times of fake news, morphed videos that are harder to spot don’t bode well, especially when we’re still struggling with how to deal with modified static images and false text. There is already a video doing the rounds of an AI-generated visual of former US President Barack Obama lip-syncing to an audio track of him speaking, which looks surprisingly real. And Face2Face, a real-time face-tracking system, can make its output look like anybody you desire, from George Bush to Vladimir Putin.
But perhaps deepfakes will begin a conversation, and a counter-tech movement to build tools that make it easier to spot modified videos: a development we would all welcome, and one that might prevent incidents like the doctored JNU videos released by Zee News in early 2016, which led to sedition charges and the arrest of JNU students Kanhaiya Kumar, Anirban Bhattacharya and Umar Khalid.
It does make one wonder, though. When Elon Musk talked about AI’s potential to become a digital dictator, I don’t think he was talking about revolutionising the porn industry. But then again, this is the same man who “unwittingly” attended a sex party last summer, at which he claimed he saw no sex.
Co-published with Firstpost.