Compact Models for Periocular Verification Through Knowledge Distillation
dc.contributor.author | Boutros, Fadi | |
dc.contributor.author | Damer, Naser | |
dc.contributor.author | Fang, Meiling | |
dc.contributor.author | Raja, Kiran | |
dc.contributor.author | Kirchbuchner, Florian | |
dc.contributor.author | Kuijper, Arjan | |
dc.contributor.editor | Brömme, Arslan | |
dc.contributor.editor | Busch, Christoph | |
dc.contributor.editor | Dantcheva, Antitza | |
dc.contributor.editor | Raja, Kiran | |
dc.contributor.editor | Rathgeb, Christian | |
dc.contributor.editor | Uhl, Andreas | |
dc.date.accessioned | 2020-09-16T08:25:48Z | |
dc.date.available | 2020-09-16T08:25:48Z | |
dc.date.issued | 2020 | |
dc.description.abstract | Despite the wide use of deep neural networks for periocular verification, achieving small deep learning models with high performance that can be deployed on devices with low computational power remains a challenge. To address the computation cost, we present in this paper a lightweight deep learning model, DenseNet-20, based on the DenseNet architecture and containing only 1.1 million trainable parameters. Further, we present an approach to enhance the verification performance of DenseNet-20 via knowledge distillation. With experiments on the VISPI dataset, captured with two different smartphones (iPhone and Nokia), we show that introducing knowledge distillation into the DenseNet-20 training phase outperforms the same model trained without knowledge distillation, reducing the Equal Error Rate (EER) from 8.36% to 4.56% on iPhone data, from 5.33% to 4.64% on Nokia data, and from 20.98% to 15.54% on cross-smartphone data. | en |
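The abstract describes training the compact DenseNet-20 student with knowledge distillation. As a rough, hedged illustration only (not the paper's exact formulation), a common response-based distillation loss combines hard-label cross-entropy with a KL-divergence term on temperature-softened teacher and student logits; the temperature, weighting alpha, and PyTorch framing below are assumptions, not details taken from the paper.

```python
# Minimal sketch of response-based knowledge distillation (assumed setup,
# not the authors' exact training recipe). Temperature and alpha are
# illustrative hyperparameters.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Combine hard-label cross-entropy with soft-label KL divergence."""
    # Softened teacher distribution and student log-distribution.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=1)
    # KL term is scaled by T^2, as is conventional in distillation.
    kd_term = F.kl_div(log_soft_student, soft_teacher,
                       reduction="batchmean") * (temperature ** 2)
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1.0 - alpha) * ce_term
```

In such a setup, the larger teacher network is frozen and the compact student (here, a DenseNet-20-style model) is optimized with this combined loss instead of cross-entropy alone.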
dc.identifier.isbn | 978-3-88579-700-5 | |
dc.identifier.pissn | 1617-5468 | |
dc.identifier.uri | https://dl.gi.de/handle/20.500.12116/34340 | |
dc.language.iso | en | |
dc.publisher | Gesellschaft für Informatik e.V. | |
dc.relation.ispartof | BIOSIG 2020 - Proceedings of the 19th International Conference of the Biometrics Special Interest Group | |
dc.relation.ispartofseries | Lecture Notes in Informatics (LNI) - Proceedings, Volume P-306 | |
dc.subject | Periocular recognition | |
dc.subject | Smartphone biometric verification | |
dc.subject | Knowledge distillation | |
dc.title | Compact Models for Periocular Verification Through Knowledge Distillation | en |
dc.type | Text/Conference Paper | |
gi.citation.endPage | 298 | |
gi.citation.publisherPlace | Bonn | |
gi.citation.startPage | 291 | |
gi.conference.date | 16.-18. September 2020 | |
gi.conference.location | International Digital Conference | |
gi.conference.sessiontitle | Further Conference Contributions |