Boosting gender identification using author preference

Kucukyilmaz T., Deniz A., Kiziloz H. E.

PATTERN RECOGNITION LETTERS, vol.140, pp.245-251, 2020 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 140
  • Publication Date: 2020
  • DOI: 10.1016/j.patrec.2020.10.002
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Social Sciences Citation Index (SSCI), Scopus, Academic Search Premier, Applied Science & Technology Source, Compendex, Computer & Applied Sciences, INSPEC, zbMATH
  • Page Numbers: pp.245-251
  • Keywords: Gender identification, Text classification, Authorship attribution, Machine learning, Gender-swapping, Virtual gender
  • TED University Affiliated: Yes


Predicting the gender of a text document's author, also known as gender identification, is a well-studied authorship categorization task. A common theme in gender identification studies is that gender is treated as a binary attribute. However, digital communications give users the ability to select virtual genders by leveraging physical anonymity. In this study, the additional duality in gender arising from author preference is examined alongside biological gender. Formally, the objective of this paper is to investigate whether the gender preference of an author carries additional linguistic information. Furthermore, we explore whether this information can be exploited to improve the author characterization task. In particular, the self-assigned gender, i.e., virtual gender, of users in text-based real-time online messaging services is evaluated quantitatively, alongside biological sex, by comparing gender prediction performance under various settings. Experimental results show that by integrating virtual gender into the binary classification problem of predicting an author's gender, prediction performance improves by a further 2.6%, up to 85.4%. (c) 2020 Elsevier B.V. All rights reserved.
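The abstract's core idea can be sketched as follows: train a classifier over the finer-grained (biological gender, virtual gender) label pairs, then collapse its prediction back to the binary biological gender. The sketch below is a minimal illustration, not the authors' actual method; the toy chat-style samples, the use of a multinomial Naive Bayes with Laplace smoothing, and all names are assumptions for demonstration.

```python
from collections import Counter, defaultdict
import math

# Hypothetical toy data: (text, biological_gender, virtual_gender).
# Gender-swapped samples carry a virtual gender that differs from
# the biological one, as discussed in the abstract.
samples = [
    ("lol gg ez win bro", "M", "M"),
    ("heading to the gym then game night", "M", "M"),
    ("omg so cute love this", "F", "F"),
    ("just did my nails sharing pics later", "F", "F"),
    ("roleplaying as a princess today hehe", "M", "F"),  # gender-swapped
    ("playing my knight character tonight", "F", "M"),   # gender-swapped
]

def train(samples):
    """Count words per (biological, virtual) class pair."""
    word_counts = defaultdict(Counter)
    class_counts = Counter()
    for text, bio, virt in samples:
        label = (bio, virt)
        class_counts[label] += 1
        word_counts[label].update(text.split())
    return word_counts, class_counts

def predict_bio_gender(text, word_counts, class_counts):
    """Score each (bio, virtual) pair with Naive Bayes, then
    collapse the 4-way prediction to binary biological gender."""
    vocab = {w for c in word_counts.values() for w in c}
    total = sum(class_counts.values())
    scores = {}
    for label, counts in word_counts.items():
        logp = math.log(class_counts[label] / total)  # class prior
        denom = sum(counts.values()) + len(vocab)
        for w in text.split():
            logp += math.log((counts[w] + 1) / denom)  # Laplace smoothing
        scores[label] = logp
    best = max(scores, key=scores.get)
    return best[0]  # keep only the biological-gender component

wc, cc = train(samples)
print(predict_bio_gender("gg bro ez", wc, cc))  # prints M
```

The collapse step in `predict_bio_gender` is where the extra label could pay off: documents from gender-swapped authors get their own word distributions instead of diluting the two binary classes.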