
Gender Bias in Machine Translation Systems


If you have a question about this talk, please contact nb677.

You may have read headlines such as “Google Translate is sexist”. While machine translation software has become a frequently used tool for many of us and can indeed be very useful for communicating in multilingual settings, these systems also carry inherent risks and imbalances. In this talk, I will examine encoded biases in automated translation between languages with different grammatical gender systems (specifically English and German). I will address the following questions: What is grammatical gender, and why can it pose difficulties in translation? How does gender bias manifest in machine translation? What can be done to mitigate this form of bias? I will present the results of a corpus analysis of representative training data and, finally, discuss different algorithmic approaches to addressing the problem.

About the speaker: Dr Stefanie Ullmann is a Research Associate at the Centre for Research in the Arts, Social Sciences and Humanities as well as Darwin College. She is a linguist and her research focuses on the use of language in socio-political settings and the social implications of communications technologies.

This talk is part of the Darwin College Humanities and Social Sciences Seminars series.


