As the ‘face swapping’ technology behind deepfakes continues to improve rapidly, the media moral panic over its use in fake porn appears to be renewing itself. Two major media reports this week asked the same question – ‘Are deepfakes the new revenge porn?’
The feminist-minded culture magazine VOX published an interesting if very one-sided article that firmly answered yes to its own question. Predictably, it also had a pop at VR porn, somewhat bizarrely through the mouth of a Swedish feminist porn producer named Emma Lust:
She continues, “This is another example of how women are objectified and reduced to sexual objects to be the butt of a joke or used for the pleasure of a male viewer.” She notes that Wonder Woman star Gal Gadot was one of the first celebrities to be used in a deepfake. “The fact that she featured in the highest grossing female-led film of all time and was chosen to be used in a deepfake was, in my opinion, a way to put her ‘back in her place’ as a sexual object and not a powerful woman in control of her own agency.”
Lust feels the free tube sites, such as PornHub and RedTube, have not taken adequate responsibility for policing and removing deepfake material. “They are still so easy to find,” she says. “Instead of waiting for the users searching for the videos to report them, these sites need to take action and crack down on the users uploading these videos.”
On the future of porn, Lust doesn’t see us spending our time with VR headsets on or obsessing over ‘virtual’ porn stars. For her, humanity is the quality that makes the best porn.
“Porn has always been responsible for driving technology adoption in big ways,” she says, “So it’s not surprising that the adult industry is using new mediums like VR. But all the VR films I’ve seen so far completely miss the point! They are producing the same old silicone fantasy. Current VR porn is completely crowded with male-centric point-of-view clichés — mechanical sex, fake orgasms, no passion, no context and, of course, no intimacy.”
Contrary to the hopes of feminist smut producers such as Emma Lust, the futurologist Ian Pearson has apparently argued that the economics of porn will lead the adult industry to embrace A.I. and face swapping technology: digitally created photorealistic faces, more beautiful than any pornstar, superimposed on real actresses having sex. The author of the VOX article makes an interesting point in an attempted rebuttal of Pearson – porn viewers still demand real people, and that is why deepfakes are so popular. For her, deepfakes combine the two biggest porn trends of the age of free tubes – high-production porn and amateur voyeuristic porn.
Whether or not the point about (‘illicit’) deepfakes satisfying a voyeuristic urge is valid, there is no doubt that deepfakes can be made in an entirely legitimate way, as described by Ian Pearson. At the moment it is easier, especially for amateur redditors, to superimpose real photos (of celebrities) onto other (pornstars’) bodies, but porn companies will soon be pouring millions into this technology. It will save them money in the long run, allowing them to reuse existing content over and over again and to personalize it to the specific desires of individual potential customers.
In the same article, Kate Devlin, who has written an impressive defence of sex robots, also voices her fears over deepfakes and similarly tries to make the connection with revenge porn. However, she does at least express some positive feelings about A.I. and the future of porn, even if it is just to regulate what you will see:
Dr Kate Devlin does, however, see an upside to the use of machine learning techniques in the porn world: “Using machine learning to classify pornographic material could make it easier to apply filters, identify exploitation and authenticate sources. PornHub is already tagging material using machine learning and I’d love to see better tagged porn so it delivers exactly what you need at the right time. Machine learning could reveal aspects of your fantasies that you didn’t even realise you had.”
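As a purely illustrative aside, the kind of automated tagging Devlin describes is bread-and-butter computer vision. The sketch below is not PornHub’s system, nor anything like it – it simply runs a generic pretrained classifier (torchvision’s ResNet-18 with its stock ImageNet labels) over an image and returns the top predictions as candidate tags; the model, the labels and the file name are all stand-ins.

```python
# Illustrative only: a generic pretrained image classifier used as a crude
# auto-tagger. The model (ResNet-18) and its ImageNet labels are stand-ins;
# a real content-tagging pipeline would use purpose-trained models and labels.
import torch
from torchvision import models
from PIL import Image

weights = models.ResNet18_Weights.DEFAULT   # pretrained ImageNet weights
model = models.resnet18(weights=weights)
model.eval()
preprocess = weights.transforms()           # matching resize/crop/normalise steps

def tag_image(path, top_k=5):
    """Return the top-k predicted labels for an image as (tag, probability) pairs."""
    image = Image.open(path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)  # add a batch dimension
    with torch.no_grad():
        probs = model(batch).softmax(dim=1)
    values, indices = probs.topk(top_k)
    labels = weights.meta["categories"]     # the 1,000 ImageNet class names
    return [(labels[int(i)], float(v)) for v, i in zip(values[0], indices[0])]

if __name__ == "__main__":
    print(tag_image("example.jpg"))         # 'example.jpg' is a hypothetical file
```

In practice, of course, the interesting (and contentious) part is the label set and the training data, not the few lines of inference code.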
Meanwhile, the BBC published an article in the same week with exactly the same title (https://www.bbc.co.uk/bbcthree/article/779c940c-c6c3-4d6b-9104-bef9459cc8bd). It tries to make the same connection between face swapping porn and revenge porn, and asks whether forthcoming legislation will address the alleged problem:
While celebrities can call on expensive lawyers, and can potentially use defamation law to prosecute deepfakers (providing they can prove the image has caused, or is likely to cause, serious harm to their reputation), it can be a lot harder for ordinary people to take action.
In January, an Australian man was sentenced to 12 months in jail after Photoshopping his teenage stepdaughter’s face onto women engaged in sex acts, including bestiality, and there have been other similar cases in the UK.
But, says Luke Patel, a specialist in privacy law at Blacks Solicitors, “The influx of fast-paced developments in technology makes it very difficult for laws to keep up and adequately support victims.”
The law currently makes no explicit reference to deepfakes, though the General Data Protection Regulation (GDPR) comes into force on 25 May. It includes two new tools, under the ‘Right of Erasure’ and the ‘Right to Be Forgotten’, which Luke believes could help “enable an individual to request the permanent removal or deletion of their personal data (including images) when there is no good reason for its continued publication” – though each case will be decided on an individual basis.
“It’s not an absolute right, but the case is stronger if the image is unwarranted and causes substantial distress. Although,” he continues, “they are still only tools that can be deployed when damage has already occurred.” They won’t stop it from happening in the first place.
The BBC does, however, rightly concede that there is no stopping this technology, and seems to hint at the impossibility and even dangers of the law trying to keep pace with it.
Creating deepfakes is becoming so easy it could become a party game: bring photos, booze, and, instead of watching YouTube videos, sit around and create stolen-identity porn.
“In a couple of years, you could be able to go to a porn app store and buy VR sex experiences with anyone you want,” says Evgeny Chereshnev, of BiolinkTech.
You can already buy incredibly realistic sex dolls. “Soon,” says Evgeny, “technology could allow someone to steal your identity and order a sex doll with your face on it.”
The thought of a future where ‘you’ could exist on someone’s laptop, or where images of ‘you’ could be created solely to be maliciously circulated – or where ‘you’ could even be sitting in sex doll form in the corner of someone’s bedroom – is deeply disturbing. It may already be here.
You can also read a brief discussion of the legal and moral aspects of face swapping technology, as part of a series of tutorials explaining how deepfakes are created: https://www.alanzucconi.com/2018/03/14/the-ethics-of-deepfakes/