AI Transforms Pictures Of Clothed Women Into Nudes That Are Realistic. Nope, The Women Don’t Know Their Pictures Are Being Used
Not to sound like I am throwing a pity party for myself and all the women out there, but chances are that if you are a woman, life hasn’t been easy for you at all. Yes, struggle is a part of everyone’s life, even for the ones born with a silver spoon in their mouths, but when you face discrimination since before you were even born, you know that not much will change after you’re born either. So you learn to fight, against norms, society, archaic traditions. But how do you even battle an unseen enemy? An artificial intelligence that robs you of your dignity without your permission. We’re talking about an AI that exists for no better purpose than to serve the crappy intentions of perverts.
Yes guys, you heard that right, even technology has turned hostile now. This AI has already taken about 100,000 women’s images and turned them into nudes on the internet. This “service”, freely available on the internet, picks pictures of innocent, unaware women without their consent and digitally strips the clothing from them. And right now, this is the latest threat to women’s safety and privacy.
The deepfakes it creates morph uploaded images of clothed women into convincing nude pictures. This is worrying not just because it is a gross invasion of privacy and a violation of consent, but because these pictures can be used as ammunition to blackmail women, who may see no option but to succumb to the threats with their self-respect on the line.
The way it works, a user can anonymously submit a photo of a clothed woman to the software and receive back an altered, edited version of the picture, one without clothes and with rather alarming accuracy. The software takes into account nude images from around the world and uses them to make the picture lifelike, from the hair to the skin tone to the body type.
In fact, the AI works really hard to build the authenticity factor in, so the women’s pictures look neither edited nor retouched, as close to reality as possible, apparently, except that the woman has no idea. The makers don’t even hide the women’s faces to protect their identities. Not to forget, some of the girls in the photographs are younger than 18. Minors or not, this is deeply worrying.
The AI’s biggest subscriber base is in Russia, and when interviewed by The Post, those behind it dismissed it as ‘harmless sexual voyeurism.’ Oh phew, not a worry then, we guess.
Of course, this too had to have been designed by some sort of desperate degenerate out there. The AI was spotted by researchers at Sensity, an Amsterdam-based cybersecurity start-up, which later shared its findings with others. They revealed that up until then, almost 100,000 users had accessed the app, and almost 63% of the images sent in for editing were of women the users knew in real life. And if that doesn’t scare the living daylights out of you, I don’t know what will.
Giorgio Patrini, Sensity’s chief executive, commented on the matter and said, “The fact is that now every one of us, just by having a social media account and posting photos of ourselves and our lives publicly, we are under threat. Simply having an online persona makes us vulnerable to this kind of attack.” He was right: all it takes to be under attack now is an online presence, which lets sickos use our own harmless pictures against us.
It’s just entertainment! – More than 100,000 of faked nude images of women have been created from social media pictures and shared online. Clothes digitally removed from pictures of women by Artificial Intelligence (AI) platform Telegram #deepfakes #tech https://t.co/EMS089rRqU
— F Anderson (@FLAnderson) October 21, 2020
Really strange how digital avatars & deepfakes and the laws around licensing the personal digital representations will be largely driven by the visceral reaction stemming from toxic male behavior and “adult” industry use cases | https://t.co/38smnorTCg
— COPE (@Cope84) October 20, 2020
It is disheartening to see perpetrators try so hard to bring women down and harass us every step of the way. We hope that such technology is not just scrapped, but that the makers behind such disastrous AI are not let off the hook. Hiding behind anonymity, refusing to take responsibility for the harassment their AI enables, their desire to serve men a digital “X-ray” vision is everything that is wrong with society.
In fact, the AI’s very logo, which shows a smiling man ogling a woman through X-ray glasses, is pretty telling of exactly how problematic these services really are.
I guess what we are asking is, are we women really safe anywhere at all?