AI Uses Tags Like ‘Official’ To Describe Men, Whereas Women Are Being Recognised For Their ‘Skin’ And ‘Smile’. What Even?

Because I am a woman, there are several adjectives that society has attributed to my personality, most of them not entirely true and/or simply irrelevant. Reduced to a bunch of words that seldom go beyond our looks, women are many a time judged only by their appearance. And while up until now we only had the patriarchal society to blame, it turns out computers too are joining the club in making rather prejudiced conclusions about women. We say this after learning how artificial intelligence has categorised pictures of men under labels like “official” and pictures of women under labels such as “smile” and “chin.”

As if the stereotyping by half the world and the people living in it weren’t enough, now artificial intelligence is chiming in with the stereotypes too, and this development has been rather unsettling. We imagine it’s got a lot to do with men doing the coding of these systems, but we are going to let that slide.

This came to light after US and European researchers submitted pictures of members of Congress to Google’s cloud image recognition service, only to receive results that were quite alarming. Carsten Schwemmer, a postdoctoral researcher at the GESIS Leibniz Institute for the Social Sciences in Köln, Germany, spoke on the matter and said, “It results in women receiving a lower status stereotype: that women are there to look pretty and men are business leaders.” So artificial intelligence is basically being a sexist asshole? Yes!
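For readers curious about the mechanics, here is a minimal sketch of how a photo can be submitted to Google’s Cloud Vision label-detection service, the kind of image recognition tool the researchers used. This is not the researchers’ actual code; the file name and helper function below are illustrative assumptions.

```python
# Minimal sketch (assumed, not the study's code) of asking Google Cloud Vision
# for the labels it attaches to a single portrait photo.
# Requires `pip install google-cloud-vision` and configured credentials.
from google.cloud import vision


def get_labels(image_path):
    """Return (label, confidence) pairs that Cloud Vision assigns to one image."""
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    # Each annotation carries a descriptive label (e.g. "smile", "official")
    # and a confidence score between 0 and 1.
    return [(a.description, round(a.score, 2)) for a in response.label_annotations]


if __name__ == "__main__":
    # "portrait.jpg" is a placeholder path, not a file from the study.
    for label, score in get_labels("portrait.jpg"):
        print(label, score)
```

Comparing the labels returned for men’s and women’s portraits, as the researchers did, is what exposes the skew described below.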


Even though the AI service saw in the pictures what any other person would, the characteristics it picked up on revolved solely around appearance for women and around capability for men. It was almost as if the patriarchal society had cloned its mindset into the system. Tags used for women included “girl” and “beauty.”

This further implied that the algorithms used by AI services do not see the world with mathematical detachment after all, but are instead influenced by historical cultural biases. The study was inspired by a 2018 project called Gender Shades, which showed that AI services were spot on at identifying the gender of white men but very inaccurate at identifying the gender of Black women. So there is racism and sexism? Wow.

Of the 20 lawmakers whose photos were fed to the system (all of them smiling), only one man was recognised for his smile, compared to all of the women, who were tagged for theirs. Men were instead put under labels like “businessperson,” often along with “official” or “white collar worker,” while women were still being tagged for their “neck,” “skin,” “hairstyle,” and so on.

It is more important now than ever that such problems in algorithms, which are ultimately man-made, be tweaked and fixed at the earliest, so as not to pass their sexist and stereotypical consequences down to the next generation. After all, it is more than evident that women are much more than just their faces or bodies. They are neck and neck with men in the world, and it’s time people and AI alike addressed them for their potential, not their appearance.


Sadhika Sehgal
