DeepNude Was An App That Created Naked Pictures Of Fully-Clothed Women. Wow! Respect Just Flew Out Of The Window
The thing about sexism is that while we take 10 steps forward to eradicate it, there are several ridiculous phenomena around the world that take us 20 steps backwards. Now, while the internet has been a dangerous place, more so for women, since possibly its inception, a particular app called DeepNude went ahead and made the worst use of technology. This app used Artificial Intelligence to strip women of their clothing. If you are appalled that that’s a thing, thank you!
You could basically upload a picture of any woman, and the app would undress her. The app didn’t work on men; it was designed to undress only women. Okay, we’re not saying men should go through this too, but we are definitely saying this is a “privilege” we women don’t want. This is clearly a case of objectification of women and violation of consent, because anybody could pick a picture of yours from social media and create strikingly realistic-looking nudes. It is disturbing that there are so many people who would indulge in creating fake nudes of women. These are the same men who would want to rip the clothes off real women, but since they can’t do that without the risk of getting caught, here came a safer alternative. The app went viral before it got taken down because of all the flak it received.
View this post on Instagram
An AI app that “undressed” women shows how deepfakes harm the most vulnerable. What are your views on it? Good news that it has been taken off now.
It is possibly the most devastating use of AI, and the app clearly propagated non-consensual pornography, including revenge porn, on the internet. Deepfakes were first reported on in 2017 by Motherboard/Vice’s Samantha Cole. And we are glad that, because of the intense criticism that followed Motherboard’s reporting, the app is finally down. “This is absolutely terrifying,” Katelyn Bowden, founder and CEO of revenge porn activism organization Badass, told Motherboard. “Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo. This tech should not be available to the public.”
Which is why we are left dumbfounded. How can someone put time, money, and effort into creating such an ethically disgusting app? What are the ethics of the employees who worked on building it? And of the people who used it? Honestly, we feel the users of this app were already normalising rape culture and violating consent, even if virtually. Such a shameful use of tech. With this, we’ll be back to square one, with threats to women’s safety – just that the dangers will be more technologically advanced. Ugh.