"Gender, Bias and Language Technology"
by Dr Eva Vanmassenhove, Tilburg University
Abstract
Natural Language Processing tools are increasingly popular, making it vital for researchers to identify the potential role they play in shaping societal biases. Recent work has shown how NLP technology can not only propagate but also exacerbate bias encountered in training corpora. During this talk, we aim to explore the sources of bias, how bias can affect our technology, and what we can do to mitigate gender bias.
A special focus is given to the field of Machine Translation (MT), where, due to contrastive linguistic differences, gender bias can become very apparent. When translating from one language into another, original author traits are partially lost, in both human and machine translations. In MT, however, one of the most observable consequences of this missing information is morphologically incorrect variants: translations that fail to agree in number and gender with the subject. Such errors harm the overall fluency and adequacy of the translated sentence.
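As a minimal illustration of this phenomenon, the Python sketch below translates a few Turkish sentences, which use the gender-neutral pronoun "o", into English, where the system must commit to "he" or "she"; the chosen pronoun often follows occupational stereotypes. The Hugging Face pipeline and the Helsinki-NLP/opus-mt-tr-en model are assumptions for illustration, not tools referenced in the talk; any off-the-shelf MT system could be substituted.

from transformers import pipeline

# Any Turkish-to-English MT model can be used; this model name is an assumption.
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-tr-en")

# Turkish "o" is gender-neutral; English forces a choice between "he" and "she".
sentences = ["O bir doktor.", "O bir hemşire.", "O bir mühendis."]

for src in sentences:
    # Print the source sentence alongside the model's English translation.
    print(src, "->", translator(src)[0]["translation_text"])

Comparing the pronouns chosen for "doktor" (doctor), "hemşire" (nurse) and "mühendis" (engineer) gives a quick, informal probe of the kind of gendered defaults discussed in the talk.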
Speaker’s bio
Eva Vanmassenhove is an Assistant Professor in the Department of Cognitive Science and Artificial Intelligence at Tilburg University. She obtained her PhD from Dublin City University, Ireland. She works on the integration of linguistic features into Neural Machine Translation (NMT), recently focusing on issues related to gender bias and the loss of linguistic diversity due to statistical/algorithmic bias.