Listing by keyword "Demographic bias"
1 - 3 of 3
- Conference paper: Fairness and Privacy in Voice Biometrics: A Study of Gender Influences Using wav2vec 2.0 (BIOSIG 2023, 2023) Oubaida Chouchane, Michele Panariello. This study investigates the impact of gender information on utility, privacy, and fairness in voice biometric systems, guided by the mandates of the General Data Protection Regulation (GDPR), which underscore the need to minimize the processing and storage of private and sensitive data and to ensure fairness in automated decision-making systems. We adopt an approach that fine-tunes the wav2vec 2.0 model for speaker verification, evaluating potential gender-related privacy vulnerabilities in the process. An adversarial technique is applied during fine-tuning to obscure gender information within the speaker embeddings, thus bolstering privacy. Results on the VoxCeleb datasets indicate that our adversarial model increases privacy against uninformed attacks (AUC of 46.80%), yet slightly diminishes speaker verification performance (EER of 3.89%) compared to the non-adversarial model (EER of 2.37%). The model's efficacy is reduced against informed attacks (AUC of 96.27%). A preliminary analysis of system performance is conducted to identify potential gender bias, highlighting the need for continued research into fairness and the delicate interplay between utility, privacy, and fairness in voice biometric systems.
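The abstract above does not specify how the adversarial obscuring of gender is implemented; a common mechanism for this kind of adversarial debiasing is a gradient reversal layer (GRL), which passes features through unchanged but flips the sign of the adversary's gradient so the encoder learns to hide the attribute. The sketch below is purely illustrative; the class name and `lam` parameter are assumptions, not the paper's API:

```python
# Minimal sketch of a gradient reversal layer (GRL), a common building block
# for adversarial attribute removal. Forward pass is the identity; backward
# pass returns the negated, scaled gradient, so the upstream encoder is
# pushed to *worsen* the adversary's (e.g. gender) prediction.
class GradientReversal:
    def __init__(self, lam=1.0):
        # lam trades off verification utility vs. attribute concealment
        self.lam = lam

    def forward(self, x):
        return x  # identity: embeddings are unchanged in the forward pass

    def backward(self, grad_output):
        # flipped, scaled gradient flows back to the encoder
        return -self.lam * grad_output


grl = GradientReversal(lam=0.5)
print(grl.forward(3.0))   # 3.0
print(grl.backward(2.0))  # -1.0
```

In a full pipeline the GRL would sit between the speaker embedding and a gender classifier head, so that minimizing the classifier's loss simultaneously trains the encoder to remove gender cues.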
- Conference paper: Generalizability and Application of the Skin Reflectance Estimate Based on Dichromatic Separation (SREDS) (BIOSIG 2023, 2023) Joseph A Drahos, Richard Plesh. Face recognition (FR) systems have become widely used and readily available in recent years. However, differential performance across certain demographics has been identified in popular FR models. Skin tone differences between demographics can be one of the factors contributing to this differential performance. Skin tone metrics provide an alternative to self-reported race labels when such labels are lacking or entirely unavailable, e.g., in large-scale face recognition datasets. In this work, we provide a further analysis of the generalizability of the Skin Reflectance Estimate based on Dichromatic Separation (SREDS) against other skin tone metrics and present a use case for substituting race labels with SREDS scores in a privacy-preserving learning solution. Our findings suggest that SREDS consistently produces a skin tone metric with lower variability within each subject, and that SREDS values can be used as an alternative to self-reported race labels with a minimal drop in performance. Finally, we provide a publicly available, open-source implementation of SREDS to help the research community. Available at https://github.com/JosephDrahos/SREDS
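The abstract compares metrics by their "variability within each subject" but does not state the exact statistic. One plausible way to evaluate this (a sketch, not the paper's method; all names are hypothetical) is the mean per-subject standard deviation of a skin tone metric across each subject's images:

```python
# Hedged sketch: compare skin tone metrics (e.g. SREDS vs. alternatives) by
# within-subject variability. A metric that is stable for a given person
# across images yields a lower score here.
def mean_within_subject_std(scores_by_subject):
    """scores_by_subject: dict mapping subject id -> list of metric scores
    computed on that subject's images."""
    stds = []
    for scores in scores_by_subject.values():
        n = len(scores)
        mu = sum(scores) / n
        stds.append((sum((s - mu) ** 2 for s in scores) / n) ** 0.5)
    return sum(stds) / len(stds)


# Subject "a" is perfectly stable, subject "b" varies slightly:
print(mean_within_subject_std({"a": [0.5, 0.5], "b": [0.2, 0.4]}))  # 0.05
```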
- Conference paper: Robust Sclera Segmentation for Skin-tone Agnostic Face Image Quality Assessment (BIOSIG 2023, 2023) Wassim Kabbani, Christoph Busch. Face image quality assessment (FIQA) is crucial for obtaining good face recognition performance. FIQA algorithms should be robust and insensitive to demographic factors. The eye sclera has a consistent whitish color in all humans, regardless of age, ethnicity, and skin tone. This work proposes a robust sclera segmentation method suitable for face images in enrolment and border-control face recognition scenarios. It shows how statistical analysis of the sclera pixels produces features that are invariant to skin tone, age, and ethnicity, and can thus be incorporated into FIQA algorithms to make them agnostic to demographic factors.
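The segmentation method itself is not reproduced here; the sketch below only illustrates the idea stated in the abstract, that simple statistics over segmented sclera pixels can serve as skin-tone-invariant features, since the sclera is whitish across all demographics. The function name and input format are assumptions for illustration:

```python
# Hedged sketch: per-channel statistics of segmented sclera pixels as
# demographic-invariant illumination/quality features for a FIQA pipeline.
def sclera_features(pixels):
    """pixels: list of (r, g, b) tuples sampled from the segmented
    sclera region of a face image.

    Returns per-channel means and standard deviations; because the sclera
    is whitish in all humans, these depend on illumination and capture
    quality rather than on the subject's skin tone."""
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    stds = [(sum((p[c] - means[c]) ** 2 for p in pixels) / n) ** 0.5
            for c in range(3)]
    return means, stds


means, stds = sclera_features([(200, 200, 210), (210, 210, 220)])
print(means)  # [205.0, 205.0, 215.0]
print(stds)   # [5.0, 5.0, 5.0]
```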