
Compact Models for Periocular Verification Through Knowledge Distillation

dc.contributor.authorBoutros, Fadi
dc.contributor.authorDamer, Naser
dc.contributor.authorFang, Meiling
dc.contributor.authorRaja, Kiran
dc.contributor.authorKirchbuchner, Florian
dc.contributor.authorKuijper, Arjan
dc.contributor.editorBrömme, Arslan
dc.contributor.editorBusch, Christoph
dc.contributor.editorDantcheva, Antitza
dc.contributor.editorRaja, Kiran
dc.contributor.editorRathgeb, Christian
dc.contributor.editorUhl, Andreas
dc.date.accessioned2020-09-16T08:25:48Z
dc.date.available2020-09-16T08:25:48Z
dc.date.issued2020
dc.description.abstractDespite the wide use of deep neural networks for periocular verification, building small deep learning models that achieve high performance and can be deployed on devices with low computational power remains a challenge. To reduce computational cost, we present in this paper a lightweight deep learning model, DenseNet-20, based on the DenseNet architecture, with only 1.1M trainable parameters. Further, we present an approach to enhance the verification performance of DenseNet-20 via knowledge distillation. In experiments on the VISPI dataset, captured with two different smartphones (iPhone and Nokia), we show that introducing knowledge distillation into the DenseNet-20 training phase outperforms the same model trained without knowledge distillation: the Equal Error Rate (EER) is reduced from 8.36% to 4.56% on iPhone data, from 5.33% to 4.64% on Nokia data, and from 20.98% to 15.54% on cross-smartphone data.en
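The abstract does not spell out the distillation objective. A common formulation (following Hinton et al.'s original knowledge-distillation recipe, not necessarily the exact loss used in this paper) combines a cross-entropy term on the ground-truth label with a temperature-softened KL-divergence term between teacher and student logits. The temperature `T` and mixing weight `alpha` below are illustrative assumptions:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T yields softer distributions."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, label, T=4.0, alpha=0.5):
    """Weighted sum of soft-target KL divergence and hard-label cross-entropy.

    T and alpha are hyperparameters chosen for illustration only.
    """
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL(teacher || student) on softened distributions, scaled by T^2
    # to keep gradient magnitudes comparable across temperatures.
    kd = float(np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)))) * T * T
    # Standard cross-entropy against the ground-truth class.
    ce = float(-np.log(softmax(student_logits)[label]))
    return alpha * kd + (1 - alpha) * ce
```

With identical teacher and student logits the KL term vanishes and only the weighted cross-entropy remains, which makes the function easy to sanity-check.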
dc.identifier.isbn978-3-88579-700-5
dc.identifier.pissn1617-5468
dc.identifier.urihttps://dl.gi.de/handle/20.500.12116/34340
dc.language.isoen
dc.publisherGesellschaft für Informatik e.V.
dc.relation.ispartofBIOSIG 2020 - Proceedings of the 19th International Conference of the Biometrics Special Interest Group
dc.relation.ispartofseriesLecture Notes in Informatics (LNI) - Proceedings, Volume P-306
dc.subjectPeriocular recognition
dc.subjectSmartphone biometric verification
dc.subjectKnowledge distillation
dc.titleCompact Models for Periocular Verification Through Knowledge Distillationen
dc.typeText/Conference Paper
gi.citation.endPage298
gi.citation.publisherPlaceBonn
gi.citation.startPage291
gi.conference.date16-18 September 2020
gi.conference.locationInternational Digital Conference
gi.conference.sessiontitleFurther Conference Contributions

Files

Original bundle
1 - 1 of 1
Name: BIOSIG_2020_paper_47.pdf
Size: 2.2 MB
Format: Adobe Portable Document Format