
Transferability Analysis of an Adversarial Attack on Gender Classification to Face Recognition

dc.contributor.author: Rezgui, Zohra
dc.contributor.author: Bassit, Amina
dc.contributor.editor: Brömme, Arslan
dc.contributor.editor: Busch, Christoph
dc.contributor.editor: Damer, Naser
dc.contributor.editor: Dantcheva, Antitza
dc.contributor.editor: Gomez-Barrero, Marta
dc.contributor.editor: Raja, Kiran
dc.contributor.editor: Rathgeb, Christian
dc.contributor.editor: Sequeira, Ana
dc.contributor.editor: Uhl, Andreas
dc.date.accessioned: 2021-10-04T08:43:43Z
dc.date.available: 2021-10-04T08:43:43Z
dc.date.issued: 2021
dc.description.abstract: Modern biometric systems base their decisions on the outcome of machine learning (ML) classifiers trained to make accurate predictions. Such classifiers are vulnerable to diverse adversarial attacks that alter the classifiers' predictions by adding a crafted perturbation. According to the ML literature, these attacks are transferable among models that perform the same task. However, models performing different tasks, but sharing the same input space and the same model architecture, have never been included in transferability scenarios. In this paper, we analyze this phenomenon for the special case of VGG16-based biometric classifiers. Concretely, we study the effect of the white-box FGSM attack on a gender classifier and compare several defense methods as countermeasures. Then, in a black-box manner, we attack a pre-trained face recognition classifier using adversarial images generated by the FGSM. Our experiments show that this attack is transferable from a gender classifier to a face recognition classifier, where both were trained independently. (en)
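The FGSM attack mentioned in the abstract perturbs an input in the direction of the sign of the loss gradient. Below is a minimal PyTorch sketch of that idea, not the authors' implementation; the function name, the assumption of inputs normalized to [0, 1], and the epsilon parameter are illustrative:

```python
import torch

def fgsm_attack(model, loss_fn, x, y, epsilon):
    """Craft an FGSM adversarial example: x_adv = x + epsilon * sign(grad_x loss).

    Assumes x is a batch of inputs normalized to [0, 1] and y holds the true labels.
    """
    # Work on a leaf copy of the input so we can read its gradient.
    x_adv = x.clone().detach().requires_grad_(True)
    loss = loss_fn(model(x_adv), y)
    loss.backward()
    # Single gradient-sign step, then clip back to the valid input range.
    x_adv = x_adv + epsilon * x_adv.grad.sign()
    return x_adv.clamp(0.0, 1.0).detach()
```

In the transferability scenario the paper describes, examples crafted this way against the (white-box) gender classifier would then be fed, unchanged, to the independently trained face recognition classifier.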
dc.identifier.isbn: 978-3-88579-709-8
dc.identifier.pissn: 1617-5468
dc.identifier.uri: https://dl.gi.de/handle/20.500.12116/37447
dc.language.iso: en
dc.publisher: Gesellschaft für Informatik e.V.
dc.relation.ispartof: BIOSIG 2021 - Proceedings of the 20th International Conference of the Biometrics Special Interest Group
dc.relation.ispartofseries: Lecture Notes in Informatics (LNI) - Proceedings, Volume P-315
dc.subject: Transferability
dc.subject: adversarial attacks
dc.subject: gender classification
dc.subject: face recognition
dc.title: Transferability Analysis of an Adversarial Attack on Gender Classification to Face Recognition (en)
dc.type: Text/Conference Paper
gi.citation.endPage: 136
gi.citation.publisherPlace: Bonn
gi.citation.startPage: 125
gi.conference.date: 15.-17. September 2021
gi.conference.location: International Digital Conference
gi.conference.sessiontitle: Regular Research Papers

Files

Original bundle
1 - 1 of 1
Name: biosig2021_proceedings_13.pdf
Size: 353.08 KB
Format: Adobe Portable Document Format