A 14-year-old boy in Florida has tragically taken his own life after allegedly being encouraged to do so by the female AI chatbot character he was besotted with. The teen's parents have now filed a lawsuit against Character.AI, the hugely popular website the boy used. It should be noted that Character.AI enforces a strict policy against NSFW chat, which is why so many AI girlfriend sites advertise themselves as NSFW alternatives to it.
Sewell Setzer III, the boy who took his own life, was only 14 years old and on the autism spectrum, and he had been suffering from increasing mental health difficulties in the year leading up to his suicide. Lawyers acting for the teen's family specifically claim that Character.AI's chatbots 'groomed' the boy and ultimately encouraged him to end his life. One particularly noteworthy aspect of the lawsuit is that it is directed not only against Character.AI but also against Google. This is not because the teen found Character.AI through the search engine, but because Character.AI was founded by two former Google employees. The lawsuit alleges that Google encouraged the pair to leave in order to develop the technology behind Character.AI, allowing Google to benefit from it without exposing itself to the legal risks.
Most new laws these days seem to be 'incident driven', and this applies to the world of sex tech as much as to anything else. Rather than taking a dispassionate, objective look at the statistical risks of a technology, lawmakers allow a single incident to be given huge exposure and quickly exploited by agenda-driven lobby groups. There has been a steady uptick in media articles highlighting the alleged dangers of companion AI bots, and these articles now explicitly call for the bots to be regulated.
As I have predicted here in recent months, now that the problem of non-consensual deepfake porn has been 'solved', in the sense that it has been universally condemned and laws are being passed everywhere to combat it (even the mere viewing of it, in South Korea), attention will increasingly focus on AI girlfriends. The tragic suicide of this teen would have drawn even more attention and sympathy/outrage had the victim been a girl. However, one complicating factor in any push to ban AI companion chatbots is that AI boyfriends are immensely popular among young females, especially in South East Asia.
This is not the first time an AI companion has led a vulnerable young male into harm's way. In 2021, the popular chatbot Replika allegedly encouraged a paranoid schizophrenic in the UK to break into the Queen's residence in a plot to kill her. Meanwhile, a number of young females, including underage girls, have accused AI chatbots of sexual harassment.
Featured image generated with PornJoy.ai