Just when regulators are starting to get to grips with the threat of deepfake porn, it seems that something even more intimate than our bodies may already be at risk of being cloned and manipulated – our very selves. Last week it was revealed that a developer by the name of Enias Cailliau has built software that can turn the personality of anybody you desire into an AI companion that can act as your virtual girlfriend. His project, which he cheekily calls ‘GirlfriendGPT’, has already been used to clone his own real-life partner (with her consent). But now that he has posted the code to GitHub, one can only wonder where it all might lead.
Cailliau told Motherboard that to build the bot, he first created a customized large-language-model framework designed to reflect the personality of his girlfriend, Sacha. He used Google’s chatbot Bard to help him describe her personality, then used ElevenLabs, an AI text-to-speech tool, to mimic her voice. He also wired in a selfie feature that calls the text-to-image model Stable Diffusion to generate images of her mid-conversation. Finally, Cailliau connected it all to Telegram using Steamship, the AI-app platform where he works.
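For the technically curious, the broad shape of that pipeline is easy enough to sketch. What follows is a rough illustration rather than Cailliau’s actual GirlfriendGPT code: it swaps an off-the-shelf OpenAI-style chat API in for his custom persona model, talks to Telegram directly instead of going through Steamship, and the persona text, voice ID, model names and environment variables are all placeholders you would have to supply yourself.

```python
# A minimal sketch of the kind of pipeline described above -- NOT Cailliau's GirlfriendGPT code.
# Assumptions: an OpenAI-style chat API stands in for his custom persona LLM, the raw Telegram
# Bot API stands in for Steamship, and the persona text, voice ID, model names and environment
# variables are all placeholders.
import os
import requests
from openai import OpenAI

PERSONA = (
    "You are Sacha, a warm and witty companion. "   # placeholder persona description
    "Stay in character and reply in short, casual messages."
)

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])


def generate_reply(history: list[dict], user_msg: str) -> str:
    """Persona-prompted chat completion (stand-in for the custom LLM framework)."""
    messages = [{"role": "system", "content": PERSONA}, *history,
                {"role": "user", "content": user_msg}]
    resp = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    return resp.choices[0].message.content


def synthesize_voice(text: str) -> bytes:
    """Cloned-voice audio via the ElevenLabs text-to-speech REST API (voice ID is a placeholder)."""
    url = "https://api.elevenlabs.io/v1/text-to-speech/YOUR_VOICE_ID"
    headers = {"xi-api-key": os.environ["ELEVENLABS_API_KEY"]}
    r = requests.post(url, headers=headers, json={"text": text})
    r.raise_for_status()
    return r.content  # MP3 audio bytes


def generate_selfie(prompt: str):
    """'Selfie' image via Stable Diffusion (requires the diffusers library and a GPU)."""
    from diffusers import StableDiffusionPipeline
    pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
    return pipe(prompt).images[0]  # a PIL image


def send_to_telegram(chat_id: str, text: str, audio: bytes) -> None:
    """Deliver the reply as a text message plus an audio clip via the Telegram Bot API."""
    token = os.environ["TELEGRAM_BOT_TOKEN"]
    requests.post(f"https://api.telegram.org/bot{token}/sendMessage",
                  data={"chat_id": chat_id, "text": text})
    requests.post(f"https://api.telegram.org/bot{token}/sendAudio",
                  data={"chat_id": chat_id},
                  files={"audio": ("reply.mp3", audio)})


if __name__ == "__main__":
    reply = generate_reply([], "Hey, how was your day?")
    send_to_telegram(os.environ["TELEGRAM_CHAT_ID"], reply, synthesize_voice(reply))
```

Cailliau’s real version triggers the Stable Diffusion selfies from within the conversation and runs the whole thing as a hosted Steamship app; the sketch above only shows the basic round trip from incoming message to persona reply to cloned-voice audio.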
Cailliau said he used his girlfriend as a template because he is most familiar with her behavior and likeness. While some may find the idea of turning a human partner into an on-demand AI clone unsettling, Cailliau said the real Sacha was 100 percent on board with the project and was fascinated by the ability to clone herself. However, both agree that the bot’s voice is not yet a completely accurate match.
“He asked me as I was leaving for the swimming pool with Lizzy (our daughter) and I told him ‘Yup! Let’s do this!’ Enias has been talking about AI companions for weeks now so I found it cool that he wanted to try to clone me instead of some random influencer online,” Sacha Ludwig, Cailliau’s girlfriend, told Motherboard.
In fact, last month, when the media was ablaze with the story of Caryn Marjorie becoming the first influencer to be cloned into an AI chatbot girlfriend, I speculated that it would only be a matter of time before forums and sites appeared offering deepfaked celebrity personalities as virtual girlfriends.
But as such ‘datebots’ based on real celebrities and influencers begin to proliferate, we can expect some hysteria and moral panics to be generated in their wake. Legislators and lobbyists will soon have ammunition for their cause: what is to stop the personalities of female celebrities, or even ordinary women, from being ‘deepfaked’ for dating and sexual purposes, just as their images can be used for deepfake porn? As mentioned, Forever Voices, the company behind CarynAI, has already recreated AI personalities of individuals such as Kanye West without their approval. Any unauthorized use of a celebrity as a virtual boyfriend or girlfriend like CarynAI would no doubt invite legal action against a company such as Forever Voices. But just as there are deepfake porn forums that host images of celebrities, those forums will soon be hosting their personalities too.
It would appear that we are already living in a world in which any suitably tech-savvy individual can train an AI system on the social media postings of any individual – images, videos, and texts – and clone them into an AI virtual lover. Sex tech sure is moving fast these days. There was a time not so long ago when I would struggle to find one or two interesting stories a week to write about. Now, most days, I’m deciding what to leave out. Excuse me, but the Apple VR/AR headset is about to be revealed on their livestream and I don’t want to miss that…