Date of Award

9-6-2024

Publication Type

Thesis

Degree Name

M.Sc.

Department

Computer Science

Supervisor

Hossein Fani

Abstract

Neural team recommendation has achieved state-of-the-art efficacy, and improved efficiency, in forming teams of experts who are highly likely to complete complex tasks successfully. Yet proposed methods overlook diversity; that is, predicted teams are male-dominated and female participation is scarce. To mitigate this bias, pre- and post-processing debiasing techniques were initially proposed, mainly because they are model-agnostic and require little to no modification to the model's architecture. However, their mitigation performance is limited, especially in the presence of extreme bias, e.g., 5% female experts in the training datasets, urging further development of in-process debiasing techniques. In this work, we are the first to propose an in-process gender debiasing method for neural team recommendation via a novel modification to the models' conventional cross-entropy loss function. Specifically, (1) we heavily penalize the model (i.e., increase the loss) for false negative female experts, and (2) we randomly sample from female experts and reinforce the likelihood of female participation in the predicted teams, even at the cost of increasing false positive females. Our experiments on a benchmark dataset exhibiting extreme gender bias demonstrate our method's ability to mitigate neural models' gender bias while maintaining accuracy, resulting in diverse yet successful teams.
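The two loss modifications described above can be sketched as follows. This is a minimal illustrative implementation, not the thesis's actual code: the function name, the `fn_penalty` and `sample_rate` parameters, and the NumPy-based binary cross-entropy formulation are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility (assumption)

def debiased_ce_loss(probs, labels, is_female, fn_penalty=5.0, sample_rate=0.1):
    """Hypothetical sketch of the gender-debiased cross-entropy loss.

    probs     : predicted team-membership probabilities per expert, shape (n,)
    labels    : 1 if the expert is in the ground-truth team, else 0
    is_female : boolean mask marking female experts
    fn_penalty: extra weight on female positives, so misses (false negatives)
                on female experts cost more (value is an assumption)
    sample_rate: fraction of out-of-team female experts relabelled as
                 positives (value is an assumption)
    """
    labels = labels.astype(float).copy()
    # (2) Randomly sample female experts outside the ground-truth team and
    # treat them as positives, reinforcing predicted female participation
    # even at the cost of false positives.
    candidates = np.where(is_female & (labels == 0))[0]
    k = int(np.ceil(sample_rate * len(candidates)))
    if k > 0:
        labels[rng.choice(candidates, size=k, replace=False)] = 1.0
    # (1) Weight the loss so that a low probability on a female positive
    # (a false negative female expert) is penalized heavily.
    weights = np.ones_like(probs)
    weights[is_female & (labels == 1)] = fn_penalty
    eps = 1e-12  # numerical guard for log(0)
    bce = -(labels * np.log(probs + eps) + (1 - labels) * np.log(1 - probs + eps))
    return float(np.mean(weights * bce))
```

With `fn_penalty=1.0` and `sample_rate=0.0` the function reduces to plain mean binary cross-entropy, so the two mechanisms can be ablated independently.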
