Conference Paper

From attributes to faces: a conditional generative network for face generation

Document Type

Text/Conference Paper

Additional Information

Date

2018

Publisher

Köllen Druck+Verlag GmbH

Abstract

Recent advances in computer vision have aimed at extracting and classifying auxiliary biometric information such as age, gender, and health attributes, referred to as soft biometrics or attributes. We here explore the inverse problem, namely face generation based on attribute labels, which is of interest due to related applications in law enforcement and entertainment. Specifically, we propose a method based on a deep conditional generative adversarial network (DCGAN), which incorporates additional data (e.g., labels) to determine specific representations of the generated images. We present experimental results of the method, trained on the CelebA dataset, and validate them using two GAN quality metrics, as well as three face detectors and one commercial off-the-shelf (COTS) attribute classifier. While these are early results, our findings indicate the method's ability to generate realistic faces from attribute labels.
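The core idea of such a conditional generator is to concatenate a random noise vector with an attribute label vector and decode the combined vector into an image, so the labels steer what is generated. The following is a minimal NumPy sketch of that conditioning step, not the authors' implementation: the layer shapes, weight initialization, and attribute indices are invented for illustration (CelebA annotates 40 binary attributes, which motivates the label dimension used here).

```python
import numpy as np

def conditional_generator(z, labels, W1, W2):
    """Toy conditional generator: the condition is concat(noise, labels)."""
    x = np.concatenate([z, labels])  # inject attribute labels alongside noise
    h = np.tanh(W1 @ x)              # single hidden layer (stand-in for DCGAN deconvolutions)
    img = np.tanh(W2 @ h)            # flattened pixel values in [-1, 1]
    return img

rng = np.random.default_rng(0)
noise_dim, n_attrs, hidden, n_pixels = 100, 40, 128, 64 * 64
W1 = rng.normal(scale=0.02, size=(hidden, noise_dim + n_attrs))
W2 = rng.normal(scale=0.02, size=(n_pixels, hidden))

z = rng.normal(size=noise_dim)       # random noise vector
attrs = np.zeros(n_attrs)
attrs[[5, 20]] = 1.0                 # activate two attributes (hypothetical indices)
img = conditional_generator(z, attrs, W1, W2)
print(img.shape)                     # (4096,)
```

In a real DCGAN the two dense layers would be replaced by transposed convolutions and the weights would be learned adversarially against a discriminator that also sees the labels; the sketch only shows how label conditioning enters the generator's input.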

Description

Wang, Yaohui; Dantcheva, Antitza; Bremond, Francois (2018): From attributes to faces: a conditional generative network for face generation. BIOSIG 2018 - Proceedings of the 17th International Conference of the Biometrics Special Interest Group. Bonn: Köllen Druck+Verlag GmbH. PISSN: 1617-5469. ISBN: 978-3-88579-676-4. Darmstadt, 26.-28. September 2018.
