The Singularity, the idea that AI will progress exponentially until it rapidly dwarfs human intelligence, has long been mocked as a kind of ‘rapture for the nerds’: just another quasi-religious millennial myth, given a ‘secular’ high-tech 21st-century dressing. But if the Singularity does occur, it seems the Super AI that emerges might be as puritanical as the most sex-hating Christian. This is because the White House this week pressed a number of big tech AI companies, including OpenAI, to promise to remove nude images from the training datasets that their AI systems learn from.
The ostensible motivation for this pledge is the US government’s desire to clamp down on non-consensual deepfake porn and other forms of what campaigners have deemed ‘image-based abuse’. It will, however, obviously impact the future development of NSFW image and video generators. Whilst only a select group of AI companies have agreed to the pledge, and the likes of OpenAI already forbid NSFW erotic uses of their AI tools, it is a worrying development that the US is now actively seeking to prevent nudity from even being used in the training of AI systems.
As reported in the Independent:
Several leading artificial intelligence companies pledged Thursday to remove nude images from the data sources they use to train their AI products, and committed to other safeguards to curb the spread of harmful sexual deepfake imagery.
In a deal brokered by the Biden administration, tech companies Adobe, Anthropic, Cohere, Microsoft and OpenAI said they would voluntarily commit to removing nude images from AI training datasets “when appropriate and depending on the purpose of the model.”
The White House announcement was part of a broader campaign against image-based sexual abuse of children as well as the creation of intimate AI deepfake images of adults without their consent.
Can an AI, no matter how intelligent, be regarded as human if it has no ‘experience’ of a naked man or woman? The most threatening thing about the possibility of an AI becoming more intelligent than us is surely the fear that it will not be human. It will be different from us, and so we will not be able to understand its motives, its thinking, and the like. To prevent such an inhuman super AI from emerging, many have stressed the importance of embedding ourselves, our humanity, into the building of AI. Keeping the eyes of AI off the naked human form seems very much at odds with that understandable, worthwhile, and possibly imperative goal.
From Greek sculpture to Renaissance art, the beauty of the naked human form, both male and female, has not only inspired genius but, through that genius, lifted humanity to the next level of civilizational development. The AI that ends up controlling humanity might consider an uncovered table leg to be sinful.
16-Year-Old South Korean Boy Attempts Suicide After Arrest For Deepfake Porn
Meanwhile, a sixteen-year-old boy in South Korea had to be talked out of jumping from a tall building after being questioned by the police for watching and sharing deepfake porn (not creating it). South Korea is in the midst of an almighty moral panic over AI-generated deepfake images. Harsh new laws promising years in prison for the creation of such material have been announced. South Korean women and girls have been rushing to take their selfies down from social media accounts, for fear that perverts will use them to generate deepfake porn. Reportedly, 93% of deepfake porn suspects in the country are teenage boys.
On the morning of September 6, authorities received an emergency call from a concerned citizen reporting that A was preparing to jump from the roof of a high-rise building in Andong City. When police and firefighters arrived at the scene, A was standing on top of a water tank on the roof, repeatedly expressing his deep guilt and hopelessness, saying, “I’ve committed a grave sin against society. I don’t want to live anymore.”
South Korea has one of the highest suicide rates in the world, with young males killing themselves at over twice the rate of young females. There have been several high-profile suicides of K-Pop stars in the last couple of years. South Korea has also long been home to the world’s most thriving female ‘fan fiction’ communities, in which tens of thousands of young women create and share erotic and obscene fantasies involving real male celebrities, often including graphic depictions of rape.