Obfuscating Language

Google announced that, instead of solving the bias issues in its algorithms, it will simply stop using the words that make that bias explicit: its Cloud Vision API will no longer label people in photos as 'man' or 'woman', only as 'person'.

It is unclear how declining to label an image with a word that denotes gender solves any problem currently caused by assuming gender from visuals.
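To make concrete how little the change actually changes, here is a minimal sketch of what it amounts to in a hypothetical labeling pipeline. The label names and the degender_labels helper are illustrative assumptions, not Google's actual API:

```python
NEUTRAL = "person"
GENDERED = {"man", "woman", "boy", "girl"}

def degender_labels(labels: list[str]) -> list[str]:
    """Replace any gendered people-label with 'person', keeping order, no duplicates."""
    out: list[str] = []
    for label in labels:
        neutral = NEUTRAL if label in GENDERED else label
        if neutral not in out:
            out.append(neutral)
    return out

# The model's raw prediction is unchanged; only the word that exposed
# it is swapped out before anyone sees it.
print(degender_labels(["woman", "bicycle", "street"]))
# -> ['person', 'bicycle', 'street']
```

Whatever bias produced the original guess is still in the model; the output vocabulary is all that moved.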

So many forms in the modern world require us to tick a gender box, and we know that it has impact. Think of the to-do over the Apple Card, and the men who received better terms and higher credit limits based on being male. It was revealed that this was embedded in the algorithm. So sure, we can go back and remove gender markers from algorithms, but the question is: what are we trying to do here?
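For what it's worth, 'removing gender markers from an algorithm' is usually the trivial part, as this sketch suggests. The applicant table and column names here are hypothetical, purely for illustration:

```python
import pandas as pd

# A hypothetical applicant table of the kind a credit model might train on.
applicants = pd.DataFrame([
    {"name": "A", "gender": "M", "income": 90_000, "limit_requested": 20_000},
    {"name": "B", "gender": "F", "income": 90_000, "limit_requested": 20_000},
])

# What "removing gender markers" usually means in practice: drop the column.
features = applicants.drop(columns=["gender"])
print(features.columns.tolist())  # ['name', 'income', 'limit_requested']

# The model never sees 'gender' directly -- but correlated fields
# (occupation, shopping history, even first names) can still act as
# proxies for it, so the disparity can survive the deletion.
```

Dropping the column answers the letter of the complaint; it says nothing about whether the outcomes change.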

Men tend to have higher insurance rates, as they have more car accidents and deaths-by-misadventure, which makes one wonder to what end we've all been reduced to maths, and where that is valuable and where it is not.

Either way, short of never recording gender at all, and allowing it to influence no calculation of any sort, refusing to allow an AI to guess the gender of a human in a photo seems like a stop-gap measure.

I am reminded of a linguistics paper I was reading on the gender of animals in nature, and the tendency, in English, to call most animals 'he' unless we have clear evidence to the contrary. An interesting bias, given that English is not a gendered language like the Romance languages, where such a default is built into the grammar.

Duolingo, for example, defaults to the masculine in every sentence in the early levels of its Romance-language courses, and never explains that. If you see a cat and type gatta instead of gatto, it is marked as an error, which is a pretty hefty bias, in my opinion. It would be lovely if they swapped the default sentences, and not just the ones about cats, to the feminine.

