As AI ‘deep fake’ technology grows, a smartphone app called ‘DeepNude’, which allowed users to upload any picture of a clothed woman and turn it into a realistic nude, has been pulled by its creator after media outrage.
Deep fake technology uses machine learning algorithms to create fake visual imagery. It attracted attention last year with a subreddit that put the tech to use creating fake porn, transposing the faces of celebrities into porn scenes. The real faces of the celebs would appear to move and share the expressions of the original performers onto whose bodies they were transposed. After online outrage, including demands to make the tech illegal, the subreddit was shut down, although the technology progresses ‘legitimately’, albeit still controversially, by inserting celebs into non-pornographic scenes or, for example, putting words into the mouths of politicians to create realistic fake speeches.
Of course, deep fake porn tech works just as well with non-celebrity women, and the ‘DeepNude’ app allowed men to upload a pic of their female work colleagues (the software didn’t work on men), neighbours, sisters, or ex-girlfriends. Predictably, outrage ensued and the app quickly went offline. At first, the creators claimed it was because their servers were overwhelmed by demand (likely true), but as the controversy grew, they quickly decided that ‘the world is not ready for DeepNude’. Now they are even claiming that the whole thing was a joke.
The world is certainly not ready for DeepNude, it seems, but it is equally difficult to see how this tech can be prevented from being distributed and used by millions in the long run.