Conference Paper

Compact Models for Periocular Verification Through Knowledge Distillation

Document Type

Text/Conference Paper

Additional Information

Date

2020

Publisher

Gesellschaft für Informatik e.V.

Abstract

Despite the wide use of deep neural networks for periocular verification, achieving small deep learning models with high performance that can be deployed on devices with low computational power remains a challenge. Addressing computational cost, we present in this paper DenseNet-20, a lightweight deep learning model based on the DenseNet architecture with only 1.1M trainable parameters. Further, we present an approach to enhance the verification performance of DenseNet-20 via knowledge distillation. With experiments on the VISPI dataset, captured with two different smartphones (iPhone and Nokia), we show that introducing knowledge distillation to the DenseNet-20 training phase outperforms the same model trained without knowledge distillation: the Equal Error Rate (EER) is reduced from 8.36% to 4.56% on iPhone data, from 5.33% to 4.64% on Nokia data, and from 20.98% to 15.54% on cross-smartphone data.
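
The abstract does not spell out the distillation objective. A common formulation of knowledge distillation, and one plausible reading of the approach described here, is the Hinton-style soft-target loss, in which a larger teacher network's softened output distribution guides the compact student. The sketch below is a minimal PyTorch illustration only; the teacher network, temperature, and loss weight are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch of soft-target knowledge distillation (Hinton et al. style).
# ASSUMPTIONS: the paper's exact teacher model, temperature, and loss
# weighting are not stated in the abstract; the values below are illustrative.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      temperature: float = 4.0,  # assumed, not from the paper
                      alpha: float = 0.5) -> torch.Tensor:  # assumed weight
    """Blend teacher soft targets with ordinary hard-label cross-entropy."""
    # Soften both logit distributions and match them with KL divergence.
    soft_targets = F.softmax(teacher_logits / temperature, dim=1)
    log_student = F.log_softmax(student_logits / temperature, dim=1)
    kd = F.kl_div(log_student, soft_targets, reduction="batchmean")
    kd = kd * temperature ** 2  # rescale gradients to match the CE term
    # Supervised loss on the ground-truth identity labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Toy usage: a batch of 8 samples over 100 identity classes.
student_logits = torch.randn(8, 100, requires_grad=True)
teacher_logits = torch.randn(8, 100)  # teacher runs frozen, no gradient
labels = torch.randint(0, 100, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```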

Description

Boutros, Fadi; Damer, Naser; Fang, Meiling; Raja, Kiran; Kirchbuchner, Florian; Kuijper, Arjan (2020): Compact Models for Periocular Verification Through Knowledge Distillation. BIOSIG 2020 - Proceedings of the 19th International Conference of the Biometrics Special Interest Group. Bonn: Gesellschaft für Informatik e.V. PISSN: 1617-5468. ISBN: 978-3-88579-700-5. pp. 291-298. Further Conference Contributions. International Digital Conference, 16.-18. September 2020.
