
The Days of Bad Photoshop Are Over, AI Fake Porn is Here and We’re Hoping it Goes Away

April 14, 2018

By Taruni Kumar

Photo Courtesy: Twitter via @optimisticrazen

“Computers driving our cars, beating humans at Go. Nonsense! We all know what they are for. A.I. is for PORN!” This is the only explanation available on the home page of deepmindy.com, a website to which you can upload the image of a person and find similar faces in pornographic photos. You can upload your own (if that’s what you’re into), a friend’s, a celebrity’s or just about anybody’s at all, and artificial intelligence (AI) will find you the one to fap to. Fun fact: Deep Mindy is a play on Google’s DeepMind AI company, which describes itself on its website as “the world leader in artificial intelligence research and its application for positive impact.”

Well, positive impact is definitely not what AI is being used for at the moment. As creepy as it is that Deep Mindy lets you find a porn star who looks like your desired sexual partner, those are still photos of women who consented to being photographed (unless their photos were leaked online, which is a whole other vile revenge-porn genre). But what’s creepier than Deep Mindy is the new genre of AI-made fake porn video doing the rounds of the interwebz, with the faces of celebrities morphed onto the bodies of actual porn actors. It’s better than the average Photoshop job and it happens in actual video, not just static images. Sure, there’s the occasional glitch and a weird uncanny-valley effect does play out, but for the most part, it’s scarily accurate.

Let’s go back to the very beginning of this phenomenon. In December 2017, tech website Motherboard posted an article about the first AI-assisted porno, made by a Reddit user who called himself ‘deepfakes’. The video was an incest-themed porno starring a woman with Wonder Woman star Gal Gadot’s face. Aside from some glitches here and there, it was a scarily convincing video. As is the way of the Internet, deepfakes’ Reddit handle took on a noun value of its own, and as more people began to use AI to morph porn, the videos came to be called deepfakes.

And before anybody says, “yeh western phenomenon hai, hamara Indian culture nahin” (this is a western phenomenon, not our Indian culture), there are deepfakes of Deepika Padukone, Priyanka Chopra and Shraddha Kapoor on PornHub already. And I assure you, they’re incredibly disturbing.

What’s even more disturbing is that the technology behind these deepfakes isn’t coming out of a sophisticated tech lab with insane hardware. Deepfakes are being made by regular people with slightly-more-powerful-than-average computers, using open-source (i.e., freely available) tools from the Internet, including Google’s own TensorFlow machine learning framework. In fact, a Reddit user called deepfakeapp even built, as his handle suggests, a deepfake app that anybody can use without the technical know-how of TensorFlow or Python or any of the other tools required to create deepfake videos.
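For the curious (and the wary), here is a minimal, purely illustrative sketch of the kind of face-swap setup these tools are reported to build on: one shared encoder learns facial features from two people, and a separate decoder per person reconstructs each face, so running person A’s face through person B’s decoder produces the swap. The layer sizes, names and training details below are assumptions for illustration only, not the actual deepfakes or FakeApp code.

from tensorflow.keras import layers, Model

def build_encoder():
    # Maps a small aligned face crop to a shared 256-dimensional face representation.
    face = layers.Input(shape=(64, 64, 3))
    x = layers.Conv2D(64, 5, strides=2, padding="same", activation="relu")(face)
    x = layers.Conv2D(128, 5, strides=2, padding="same", activation="relu")(x)
    x = layers.Flatten()(x)
    code = layers.Dense(256, activation="relu")(x)
    return Model(face, code, name="shared_encoder")

def build_decoder(name):
    # Each person gets their own decoder that turns the shared code back into a face.
    code = layers.Input(shape=(256,))
    x = layers.Dense(16 * 16 * 128, activation="relu")(code)
    x = layers.Reshape((16, 16, 128))(x)
    x = layers.Conv2DTranspose(64, 5, strides=2, padding="same", activation="relu")(x)
    face = layers.Conv2DTranspose(3, 5, strides=2, padding="same", activation="sigmoid")(x)
    return Model(code, face, name=name)

encoder = build_encoder()
decoder_a = build_decoder("decoder_person_a")  # trained only on person A's face crops
decoder_b = build_decoder("decoder_person_b")  # trained only on person B's face crops

inp = layers.Input(shape=(64, 64, 3))
autoencoder_a = Model(inp, decoder_a(encoder(inp)))
autoencoder_b = Model(inp, decoder_b(encoder(inp)))
autoencoder_a.compile(optimizer="adam", loss="mae")
autoencoder_b.compile(optimizer="adam", loss="mae")

# After training each autoencoder on its own person's faces, the "swap" is just
# running person A's face through person B's decoder:
# fake_b = decoder_b.predict(encoder.predict(real_a_face_batch))

The unsettling point is how little this takes: a mid-range consumer GPU, a folder of face crops scraped off the Internet, and freely available libraries.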

As a fantastic headline from Quartz sums it up, “Google gave the world powerful AI tools and the world made porn with them”. On the other hand, this ain’t exactly new, because many of the internet’s most cutting-edge innovations, from payment gateways to streaming video, have come out of the porn industry.

Keeping in line with how the world usually works, most of these deepfake porn videos are of famous women. There’s at least one deepfake of a man, Nicolas Cage, but that’s not a porn video; it’s a compilation of movie scenes with his face morphed into them. No, really, deepfakes are just another way for men to fulfil their fantasies of taking control of women’s agency.

Let’s be clear. I am not doing haaw, sex! I am not anti-porn. I just like my pro-porn and pro-pleasure positions to come with a pro-consent position. And deepfakes have zero consensual element whatsoever. The celebrity women whose faces are being used for these videos did not consent to them, and the actors in the videos who did consent are being erased. This hasn’t gone unnoticed. In February, Reddit banned deepfakes and PornHub started taking down videos flagged as deepfakes. But once something is on the World Wide Web, it’s nearly impossible to get rid of it.

But here’s the thing. While deepfakes are awful and really need to just go away, the technology behind the videos may have some interesting implications for the world of media. Here’s an example:

In a side-by-side comparison of a scene from Star Wars: Rogue One, the top frame shows the original CGI version of a young Princess Leia, while the bottom frame shows a version that is almost as good but much cheaper and faster to make, created using the deepfakes tech.

So, the question is: is all AI-assisted video bad, or is it just the deepfake pornos? The answer isn’t so simple, because in these times of fake news, morphed videos that are harder to spot don’t bode well, especially when we’re still struggling with how to deal with modified static images and false text. There is already a video doing the rounds of an AI-generated visual of former US President Barack Obama lip-syncing to an audio track of himself speaking, and it looks surprisingly real. And Face2Face, a real-time face tracker, can make its output look like anybody you desire, from George Bush to Vladimir Putin.

But perhaps deepfakes will begin a conversation and a counter-tech movement to build tools that make it easier to spot modified videos: a development we would all welcome, and one that may prevent incidents like the doctored JNU videos released by Zee News in early 2016, which led to sedition charges and the arrest of JNU students Kanhaiya Kumar, Anirban Bhattacharya and Umar Khalid.

It does make one wonder though. When Elon Musk talked about AI’s potential to become a digital dictator, I don’t think he was talking about revolutionising the porn industry. But then again, this is the same man who “unwittingly” attended a sex party last summer where he claimed he saw no sex.

Co-published with Firstpost.

Tags: ai, ai porn, artificial intelligence, deepfakes, deepika padukone, Gal Gadot, porn, pornhub, priyanka chopra, reddit
