Transferability Analysis of an Adversarial Attack on Gender Classification to Face Recognition
dc.contributor.author | Rezgui, Zohra | |
dc.contributor.author | Bassit, Amina | |
dc.contributor.editor | Brömme, Arslan | |
dc.contributor.editor | Busch, Christoph | |
dc.contributor.editor | Damer, Naser | |
dc.contributor.editor | Dantcheva, Antitza | |
dc.contributor.editor | Gomez-Barrero, Marta | |
dc.contributor.editor | Raja, Kiran | |
dc.contributor.editor | Rathgeb, Christian | |
dc.contributor.editor | Sequeira, Ana | |
dc.contributor.editor | Uhl, Andreas | |
dc.date.accessioned | 2021-10-04T08:43:43Z | |
dc.date.available | 2021-10-04T08:43:43Z | |
dc.date.issued | 2021 | |
dc.description.abstract | Modern biometric systems base their decisions on the outcome of machine learning (ML) classifiers trained to make accurate predictions. Such classifiers are vulnerable to diverse adversarial attacks that alter the classifier's predictions by adding a crafted perturbation to the input. According to the ML literature, these attacks are transferable among models that perform the same task. However, models that perform different tasks but share the same input space and model architecture have not previously been considered in transferability scenarios. In this paper, we analyze this phenomenon for the special case of VGG16-based biometric classifiers. Concretely, we study the effect of the white-box FGSM attack on a gender classifier and compare several defense methods as countermeasures. Then, in a black-box manner, we attack a pre-trained face recognition classifier using adversarial images generated by the FGSM. Our experiments show that this attack is transferable from a gender classifier to a face recognition classifier, where both were independently trained. | en |
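The FGSM attack named in the abstract perturbs an input in the direction of the sign of the loss gradient, x_adv = x + eps * sign(∇x L). As a minimal sketch, the snippet below applies FGSM to a toy logistic-regression classifier with an analytically computed gradient; the paper itself attacks a VGG16 network, and all names here (`fgsm_perturb`, the toy weights) are illustrative, not from the paper.

```python
import numpy as np

def fgsm_perturb(x, w, b, y, eps):
    """FGSM for a binary logistic-regression stand-in classifier.

    Returns x + eps * sign(dL/dx), clipped to the valid pixel range
    [0, 1], where L is the binary cross-entropy loss.  Toy sketch
    only -- the paper's target is a VGG16 model, not a linear one.
    """
    z = x @ w + b
    p = 1.0 / (1.0 + np.exp(-z))        # sigmoid prediction
    # For BCE through a linear layer, dL/dx = (p - y) * w.
    grad_x = (p - y) * w
    return np.clip(x + eps * np.sign(grad_x), 0.0, 1.0)

# Toy demo with hypothetical weights: craft an adversarial version
# of a mid-gray input against its currently predicted label.
rng = np.random.default_rng(0)
w = rng.normal(size=8)
b = 0.0
x = np.full(8, 0.5)
y = 1.0 if (x @ w + b) > 0 else 0.0     # label the model assigns now
x_adv = fgsm_perturb(x, w, b, y, eps=0.3)
print(np.max(np.abs(x_adv - x)))        # per-feature change is <= eps
```

Whether the perturbation actually flips the decision depends on eps and the model's margin; in the paper's white-box setting the same perturbation, generated against the gender classifier, is then replayed in a black-box manner against the face recognition classifier.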
dc.identifier.isbn | 978-3-88579-709-8 | |
dc.identifier.pissn | 1617-5468 | |
dc.identifier.uri | https://dl.gi.de/handle/20.500.12116/37447 | |
dc.language.iso | en | |
dc.publisher | Gesellschaft für Informatik e.V. | |
dc.relation.ispartof | BIOSIG 2021 - Proceedings of the 20th International Conference of the Biometrics Special Interest Group | |
dc.relation.ispartofseries | Lecture Notes in Informatics (LNI) - Proceedings, Volume P-315 | |
dc.subject | Transferability | |
dc.subject | adversarial attacks | |
dc.subject | gender classification | |
dc.subject | face recognition | |
dc.title | Transferability Analysis of an Adversarial Attack on Gender Classification to Face Recognition | en |
dc.type | Text/Conference Paper | |
gi.citation.endPage | 136 | |
gi.citation.publisherPlace | Bonn | |
gi.citation.startPage | 125 | |
gi.conference.date | 15.-17. September 2021 | |
gi.conference.location | International Digital Conference | |
gi.conference.sessiontitle | Regular Research Papers |
Files
Original bundle
- Name: biosig2021_proceedings_13.pdf
- Size: 353.08 KB
- Format: Adobe Portable Document Format