Conference Paper
Fairness and Privacy in Voice Biometrics: A Study of Gender Influences Using wav2vec 2.0
Document Type
Text/Conference Paper
Date
2023
Publisher
Gesellschaft für Informatik e.V.
Abstract
This study investigates the impact of gender information on utility, privacy, and fairness in voice biometric systems, guided by the mandates of the General Data Protection Regulation (GDPR), which underscore the need to minimize the processing and storage of private and sensitive data and to ensure fairness in automated decision-making systems. We adopt an approach that fine-tunes the wav2vec 2.0 model for speaker verification and evaluates potential gender-related privacy vulnerabilities in the process. An adversarial technique is applied during fine-tuning to obscure gender information within the speaker embeddings, thus bolstering privacy. Results on the VoxCeleb datasets indicate that our adversarial model increases privacy against uninformed attacks (AUC of 46.80%), yet slightly diminishes speaker verification performance (EER of 3.89%) compared to the non-adversarial model (EER of 2.37%). The model's efficacy is reduced against informed attacks (AUC of 96.27%). A preliminary analysis of system performance is conducted to identify potential gender bias, highlighting the need for continued research to understand and enhance fairness, and to navigate the delicate interplay between utility, privacy, and fairness in voice biometric systems.
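The abstract describes an adversarial technique applied during fine-tuning to push gender information out of the speaker embeddings. The paper's exact architecture and hyperparameters are not given here; the following is a minimal, hypothetical PyTorch sketch of one common way to realize such adversarial fine-tuning, using a gradient-reversal adversary on top of a pooled wav2vec 2.0 embedding. The class name `AdversarialSpeakerModel`, the head sizes, and the equal weighting of the two losses are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of adversarial gender obfuscation during fine-tuning.
# Assumes a pooled wav2vec 2.0 embedding (e.g. 768-dim) as input; the paper's
# actual model, heads, and loss weighting may differ.
import torch
import torch.nn as nn


class GradientReversal(torch.autograd.Function):
    """Identity in the forward pass; negates (and scales) gradients backward."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Flip the sign so the encoder is trained to *hurt* the gender adversary.
        return -ctx.lambd * grad_output, None


class AdversarialSpeakerModel(nn.Module):
    def __init__(self, embed_dim=768, num_speakers=1000, lambd=1.0):
        super().__init__()
        self.lambd = lambd
        # Speaker classification head driving the verification objective.
        self.speaker_head = nn.Linear(embed_dim, num_speakers)
        # Adversary that tries to predict gender from the same embedding.
        self.gender_head = nn.Sequential(
            nn.Linear(embed_dim, 128), nn.ReLU(), nn.Linear(128, 2)
        )

    def forward(self, embedding):
        speaker_logits = self.speaker_head(embedding)
        # Gradient reversal: forward pass is unchanged, but gradients from the
        # gender loss arrive negated at the embedding, discouraging the encoder
        # from retaining gender information.
        reversed_emb = GradientReversal.apply(embedding, self.lambd)
        gender_logits = self.gender_head(reversed_emb)
        return speaker_logits, gender_logits


if __name__ == "__main__":
    model = AdversarialSpeakerModel()
    ce = nn.CrossEntropyLoss()
    emb = torch.randn(8, 768, requires_grad=True)  # stand-in for pooled wav2vec 2.0 output
    spk = torch.randint(0, 1000, (8,))
    sex = torch.randint(0, 2, (8,))
    spk_logits, sex_logits = model(emb)
    loss = ce(spk_logits, spk) + ce(sex_logits, sex)
    loss.backward()  # encoder-side gradients from the gender loss are reversed
```

In a setup like this, the "uninformed" attack of the abstract corresponds to probing the released embeddings with a gender classifier trained without knowledge of the obfuscation, while an "informed" attacker retrains their classifier directly on the adversarially fine-tuned embeddings, which explains why the reported AUC rebounds to 96.27% in that case.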