Durham Research Online
Towards Equal Gender Representation in the Annotations of Toxic Language Detection

Excell, Elizabeth and Al Moubayed, Noura (2021) 'Towards Equal Gender Representation in the Annotations of Toxic Language Detection.', 3rd Workshop on Gender Bias in Natural Language Processing (GeBNLP 2021), International Joint Conference on Natural Language Processing (IJCNLP 2021), Bangkok, Thailand, 1-6 Aug 2021.


Classifiers tend to propagate biases present in the data on which they are trained. Hence, it is important to understand how the demographic identities of the annotators of comments affect the fairness of the resulting model. In this paper, we focus on the differences in the ways men and women annotate comments for toxicity, investigating how these differences result in models that amplify the opinions of male annotators. We find that the BERT model associates toxic comments containing offensive words with male annotators, causing the model to predict 67.7% of toxic comments as having been annotated by men. We show that this disparity between gender predictions can be mitigated by removing offensive words and highly toxic comments from the training data. We then apply the learned associations between gender and language to toxic language classifiers, finding that models trained exclusively on female-annotated data perform 1.8% better than those trained solely on male-annotated data, and that training models on data after removing all offensive words reduces bias in the model by 55.5% while increasing the sensitivity by 0.4%.
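As a rough illustration of the mitigation step described in the abstract (this is a minimal sketch, not the authors' released code), the snippet below drops highly toxic comments from the training data and strips offensive words from the remaining comments before a model such as BERT would be fine-tuned. The OFFENSIVE_WORDS lexicon, the toxicity threshold, and the (comment, toxicity, annotator gender) record format are illustrative assumptions.

# Sketch of the data-cleaning mitigation described in the abstract.
# OFFENSIVE_WORDS and TOXICITY_THRESHOLD are placeholders, not values from the paper.
import re

OFFENSIVE_WORDS = {"slur1", "slur2"}   # placeholder lexicon of offensive words
TOXICITY_THRESHOLD = 0.8               # placeholder cut-off for "highly toxic" comments

def strip_offensive_words(comment: str) -> str:
    """Remove lexicon words from a comment, keeping the rest of the text intact."""
    tokens = re.findall(r"\w+|\S", comment)
    kept = [t for t in tokens if t.lower() not in OFFENSIVE_WORDS]
    return " ".join(kept)

def filter_training_data(rows):
    """rows: iterable of (comment, toxicity_score, annotator_gender) tuples.

    Drops highly toxic comments entirely and strips offensive words from
    the remainder, mirroring the mitigation summarised in the abstract.
    """
    cleaned = []
    for comment, toxicity, gender in rows:
        if toxicity >= TOXICITY_THRESHOLD:
            continue                    # discard highly toxic comments
        cleaned.append((strip_offensive_words(comment), gender))
    return cleaned

if __name__ == "__main__":
    data = [("you are a slur1 idiot", 0.6, "male"),
            ("have a nice day", 0.0, "female")]
    print(filter_training_data(data))

The cleaned comments and annotator-gender labels would then be used to fine-tune the classifier in place of the raw data; the exact lexicon and threshold used in the paper are not reproduced here.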

Item Type: Conference item (Paper)
Full text: (AM) Accepted Manuscript
Date accepted: 23 June 2021
Date deposited: 21 July 2021
Date of first online publication: 03 August 2021
Date first made open access: 07 August 2021
