South Korea Makes Deepfake Porn A Sex Crime

May 26, 2020

Kpop celebs will in future be protected by law from the algorithmic designs of deepfake porn nerds

According to a report in the Korea JoongAng Daily, South Korea is set to become the first country in the world to turn ‘deepfake porn’ into a sexual offence. Although a number of American states have already introduced legislation, and China has passed a law against the technology, South Korea is believed to be the first nation to specifically make the production of deepfake porn a sex crime.

A revision of the Act on Special Cases Concerning the Punishment, Etc. of Sex Crimes passed the National Assembly and goes into effect on June 25. Under the new law, those who produce or distribute videos that fabricate, manipulate, edit or copy a person’s face, body or voice against their will can be punished with up to five years in prison or fined up to 50 million won ($40,500). If the act was done for profit, the prison sentence can be extended to up to seven years.

Until the amendment, deepfake pornography was not treated as a sex crime in the eyes of the law, and those behind producing the content were instead charged either with breaching the Act on Promotion of Information and Communications Network Utilization and Information Protection, Etc. for displaying or distributing “obscene content”, or with libel. The amendment does not, however, punish anyone for owning or watching deepfake pornography. And if the video was produced overseas, there is little that law enforcement officers can do.

“The amendment only punishes the act of producing and spreading deepfakes,” the Korean Bar Association said in a statement.

So it appears that viewing deepfake porn is still safe in South Korea, although that will likely be different in other countries considering similar legislation, including the UK, where viewing or possessing illicit pornography is usually a crime in itself, and where a radical feminist professor of law is leading calls for laws regulating all kinds of digital porn.

“We must overhaul our out-of-date and piecemeal laws, including criminalising the paralysing and life-threatening impact of threats, and recognising the significant harms of fake porn,” said Prof Clare McGlynn, one of the authors of a report on the issue.

The report said the lack of specific laws covering the issue in England, Wales and Northern Ireland meant the police were often only able to give an informal warning. However, it noted that the law in Scotland was better at covering cases involving altered imagery.

Last month, the Ministry of Justice asked the Law Commission to review the issue in England and Wales. However, the independent body is not due to report back until the summer of 2021.

“While I welcome the Government’s recognition of the need for a comprehensive review of the law, we know that deepfake pornography is harming people right now and any delay means justice delayed,” commented Prof McGlynn.

Of course, the more digital technology pushes porn forward, the more moral issues arise, and the greater the risks of harm (usually to women). That much is beyond dispute. The problem, as I see it, is that with the increasing merging of the digital and the physical, unless these new laws are very carefully thought out, there is a real danger of legislative overreach, and of turning ever greater numbers of people (usually men) into thought criminals.

For example, this new South Korean law has largely been a response to deepfake porn involving K-Pop celebrities. Virtually every man alive (and likely every woman) must have fantasized and masturbated at the thought of a celebrity at some point, especially in their youth. Why is this not equally a ‘manipulation of an image’, even if that image exists only in the individual’s head? The point is that this South Korean law criminalizes even production for (presumably) private use, as well as the sharing of the fake images. And as stated above, countries like the UK will likely make the mere ownership or viewing of such images illegal. What happens when the ‘neural links’ and brain/computer interfaces imagined by the likes of Elon Musk become common or even ubiquitous?

The same blurring of the physical and digital, private and public, will happen when augmented reality becomes ubiquitous. Already young women are ‘manipulating’ their own image through AR filters on Snapchat and Instagram. When we are all wearing AR glasses or AR contact lenses, or augmenting our external reality in all manner of weird and wonderful ways, directly through those neural links, such laws could become a license to effectively make sexual thoughts illegal.

I’m currently reading 1984, so perhaps I’m in a paranoid frame of mind. In one part of the book, the main character – ‘Winston’ – speaks of how the automatic responses of one’s own nervous system could be enough to ‘betray you’ to the thought police. He was referring to the omnipresent eyes of Big Brother, and the State watching for signs (through telescreens in the home and other means of constant surveillance) of ‘thought crime’, which could be betrayed by something as small as a nervous twitch. When the nervous system is wired up to a computer, including, I assume, one’s sexual responses to stimulation, George Orwell’s fears would take on a whole new and literal meaning.

Sex tech writers have often repeated the cliché that porn drives forward mainstream technology. Perhaps they don’t state often enough the equally true observation that it also drives forward badly thought out and even Orwellian regulation of both technology and sexual desire.

Further reading :

What Are Deepfakes and How Are They Created?

World’s First Deepfake Audit Counts Videos and Tools on the Open Web
