In the last week, OpenAI revealed some more jaw-dropping examples of their Sora video generator in action. But that wasn’t even the biggest development that had AI porn fans drooling with anticipation. No, that was the reported statement from a senior OpenAI executive that nude videos might actually be permissible with Sora.
Despite these tantalizing glimpses of what might be possible very soon, my pessimistic predictions remain unchanged: I still expect a coming sledgehammer of legislation worldwide. I just can’t believe that governments are going to allow us to play with these toys. And today, my fears may already be coming true with the announcement that the likely next government of the UK is proposing not only a complete ban on nudify apps, but also a requirement forcing the big tech companies behind the likes of Sora and Stable Diffusion to ensure their tools cannot be used for deepfake porn.
Labour Together’s proposals recognize that the majority of nudified images are created without their subjects’ consent and are therefore illegal in the UK.
But the paper doesn’t just target app users. It also recommends introducing new obligations for AI developers to help prevent general-purpose computer vision models from being abused. In addition, it proposes measures web hosting companies could be required to take to help ensure they aren’t involved in the production or distribution of harmful deepfakes.
As far as I understand it, even in the UK, only the non-consensual sharing of nude deepfake images is currently illegal (although I am not a legal expert, and laws in the UK change regularly). If carried through, this proposed law would likely be the first in the Western world to explicitly make deepfake porn illegal, even when created purely for private use. In the USA, the DEFIANCE Act currently going through the Senate potentially leaves open the possibility that AI porn generators, or even the tech firms behind the original software such as Stable Diffusion, could be sued by victims of deepfake porn.
Now you may be thinking that deepfake porn is bad, so this is good news, and what has it got to do with Sora and AI porn image and video generation? Well, in truth, it may be difficult or close to impossible to guarantee that an AI porn generator cannot produce deepfake images. Furthermore, the proposed requirement for the original source software, such as Stable Diffusion, to be made incapable of producing deepfake porn will likely mean that the tech companies simply take an even stronger anti-porn and anti-nudity approach when building their generators. There is also the question of whether the definition of a deepfake porn image will be interpreted to cover not just images of identifiable individuals, but any image generated from real people included in the AI’s training data. Even the left-wing Guardian newspaper asks whether the proposed law may have a chilling effect on AI image generation tech.
The policy paper also proposes a softer set of regulations for the wider tech sector that supports AI. Web hosts, search engines and payment platforms would be obliged to ensure their clients aren’t facilitating the creation of “harmful deepfakes”, backed up by fines from Ofcom. Critics, in turn, might object that such a policy could have a chilling effect: if “harmful” is in the eye of the beholder, then it may be easier for a platform to ban all deepfake tools entirely.
Although the policy paper, compiled by the British left-wing think tank Labour Together, appears to focus on deepfake porn, it’s unlikely that left-wing feminists and Conservatives will stop there. Not when tools like Sora will allow anybody to turn their sexual fantasies into photo-realistic videos within the next couple of years.
The UK, long famous for its Victorian ‘no sex please, we’re British’ attitudes, is arguably the most puritanical nation in the Western world. A recent survey by ControlAI found that UK support for laws against deepfake porn (82%) was the highest of any country polled. The opposition Labour Party, which is behind the proposed law, is expected to win the next election (to be held later this year) by a landslide. In London, led by a Muslim Labour mayor, posters on the Underground showing bikini-clad women were ordered to be torn down, while in their place are warnings that you can be arrested for ‘staring’ at female passengers.
The usual legislative response in the UK is to criminalize not only the production or sharing of porn deemed ‘harmful’, but the downloading or viewing of it as well. In this case, it is evident that the focus is on stopping the source of deepfake porn – the tools themselves. This is probably because prisons in the UK are already full, and jailing potentially tens of thousands of men for using a nudify app or creating private deepfake porn images is an impossibility.
Featured image generated with PornJoy.ai