Machine translation, especially free, publicly available software like Google Translate, can make it easier to communicate – but not without a lot of potential mistakes and misunderstandings. One issue that goes beyond individual words is gender bias.
This can mean anything from generating a majority of sentences with male subjects to defaulting to the gender most commonly associated with certain job titles.
The result not only has problematic social implications, but also heightens the risk of translation inaccuracies.
Google Translate found a way to work around this challenge by displaying both the masculine and feminine forms of a noun or adjective when translating into a gendered language. Now, researchers at Cambridge University are exploring additional solutions for other machine translation tools.
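To make the idea concrete, here is a minimal sketch of the "show both genders" approach, in Python. The word list and the `translate_both` function are hypothetical illustrations, not code from any real translation system: instead of silently picking the statistically most common gender for a job title, the tool returns every distinct gendered rendering and lets the user choose.

```python
# Illustrative sketch only: surfacing both gendered translations instead
# of defaulting to one. The English-to-French word list below is a tiny,
# hypothetical example, not real translation-system data.

# English job titles mapped to their (masculine, feminine) French forms.
GENDERED_FR = {
    "doctor": ("médecin", "médecin"),       # same form for both genders
    "nurse": ("infirmier", "infirmière"),
    "teacher": ("enseignant", "enseignante"),
}

def translate_both(title: str) -> list[str]:
    """Return every distinct gendered rendering of an English job title,
    rather than silently choosing the most common one."""
    masc, fem = GENDERED_FR[title]
    if masc == fem:
        # Only one surface form exists, so no disambiguation is needed.
        return [masc]
    return [f"{masc} (masculine)", f"{fem} (feminine)"]

print(translate_both("nurse"))
# ['infirmier (masculine)', 'infirmière (feminine)']
```

A real system would, of course, detect ambiguity at the sentence level rather than rely on a lookup table, but the user-facing behavior is the same: present both options rather than guess.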
Read on to learn more about teaching machines to give up their gender bias. It may be even harder to teach certain humans, but for machines it remains a formidable challenge.
Contact Our Writer – Alysa Salzberg