“AI Learns Sexism Just by Studying Photographs”

Two articles, one from Wired and one from the MIT Technology Review, on bias in software. The quotes below are on gender bias.

As sophisticated machine-learning programs proliferate, such distortions matter. In the researchers’ tests, people pictured in kitchens, for example, became even more likely to be labeled “woman” than reflected the training data. The researchers’ paper includes a photo of a man at a stove labeled “woman.”

Machine-learning software trained on the datasets didn’t just mirror those biases, it amplified them. If a photo set generally associated women with cooking, software trained by studying those photos and their labels created an even stronger association.
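
To make "amplified" concrete for myself, here is a toy sketch in Python (my own illustration with made-up numbers, not the researchers' method): a model that simply predicts the most common gender for each image context turns a 66/34 skew in its training labels into a 100% association in its output.

    # Toy illustration of "bias amplification" (hypothetical numbers, not the
    # researchers' code): if 66% of the "cooking" images in the training data
    # are labelled "woman", a model that just predicts the most common gender
    # for each context will label 100% of cooking images "woman".
    from collections import Counter

    # Hypothetical training labels: a 66/34 skew for the "cooking" context.
    train = [("cooking", "woman")] * 66 + [("cooking", "man")] * 34

    # "Training": for each context, remember the gender it co-occurs with most.
    counts = Counter(train)
    contexts = {context for context, _ in counts}
    predict = {
        c: max(("woman", "man"), key=lambda g: counts[(c, g)])
        for c in contexts
    }

    # Prediction: every cooking image is now labelled "woman", so a 66%
    # association in the data becomes a 100% association in the output.
    test_images = ["cooking"] * 100
    labels = [predict[c] for c in test_images]
    print(f"training data: 66% 'woman'; model output: "
          f"{labels.count('woman')}% 'woman'")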

These are interesting to me for several reasons.

First, it assumes that there is a bias about the kitchen, or about women in the kitchen. But when one considers that most top chefs (Michelin-starred) are men, the bias isn’t simply about women in the kitchen; it operates on a different level, and perhaps with a nuance the machines don’t yet grasp.

Second, the headlines frame this as the AI learning sexism. I would be more inclined to suggest that the AI learned American categorization and labeling systems; the humans added the value judgement. I also wonder why the machine was looking at images and labeling gender in the first place. How does a computer understand man/woman? By what criteria is it taught to differentiate? That inevitably brings us to the discussion of who creates the algorithms and programs, chooses the data sets, and teaches the labels to the machines.

It feels like fixing the ‘distortion’ in the way a machine learns isn’t really a fix if it happens only at the machine level, when the machine is reflecting both its programmers and the society around it. This is, perhaps, not the entire conversation, or it may even be the wrong conversation.

It makes me think we need a deeper discussion on what the AI sees, how it applies labels, and how humans both interpret and understand them. It reminds me of Lombroso and Lacassagne. Are we on our way to recreating the past, with different parameters?
