If you thought deepfakes were bad, here's something worse: The Washington Post reports that users of a new online service "can anonymously submit a photo of a clothed woman and receive an altered version with the clothing removed. The AI technology, trained on large databases of actual nude photographs, can generate fakes with seemingly lifelike accuracy, matching skin tone and swapping in breasts and genitalia where clothes once were. The women's faces remain clearly visible, and no labels are appended to the images to mark them as fake. Some of the original images show girls younger than 18."
Before you ask: no, the AI can't do the same with men. It's only set up to undress women, and if you feed it a male image it will add female body parts. The article says that's partly because AI research is so male-dominated that projects like this have no women on them who can say "What the hell are you thinking?" But I also think misogyny plays a part: "The bot's administrator, speaking in Russian, told The Post in a private chat on Monday that they didn't take responsibility for how requesters used the software, which they argued was freely available, anyway. 'If a person wants to poison another, he'll do this without us, and he'll be the one responsible for his actions,' the administrator wrote." A) This is not poison or a gun or a car, something that's freely available with legitimate uses; B) what people are doing with this service is exactly what it's designed to do: strip women naked.
The administrator also resorted to that old Undead Sexist Cliche: why was she wearing those clothes? "A girl who puts a photo in a swimsuit on the Internet for everyone to see — for what purpose does (she do) this?" Hmm, possibly because she was at the beach and wanted to share the day with her friends? And even if she posted it because she likes how sexy she looks, so what? That does not translate into "since she likes to look sexy, it's acceptable to make fake naked photos of her," any more than it's an excuse for rape.
To their credit, some AI researchers have called out this kind of shit. One developer took down an app they'd made with similar capabilities because the potential for abuse was too high. Other people, however, adopt the administrator's laissez-faire attitude: hey, this is cool tech, who cares what happens with it in the real world?
My opinion of such people is not high.