Authors: Bisztray, Tamás; Gruschka, Nils; Bourlai, Thirimachos; Fritsch, Lothar
Editors: Brömme, Arslan; Busch, Christoph; Damer, Naser; Dantcheva, Antitza; Gomez-Barrero, Marta; Raja, Kiran; Rathgeb, Christian; Sequeira, Ana; Uhl, Andreas
Date issued: 2021-10-04
Year: 2021
ISBN: 978-3-88579-709-8
URI: https://dl.gi.de/handle/20.500.12116/37475
Abstract: Technological advancements allow biometric applications to be more pervasive than ever before. This paper argues that under the current EU data protection regulation, classification applications using biometric data receive less protection than biometric recognition. We analyse preconditions in the regulatory language and explore how they can become the source of unique privacy risks for processing operations that classify individuals based on soft traits such as emotions. Such processing can have a high impact on personal freedoms and human rights and should therefore be subject to a data protection impact assessment.
Language: en
Keywords: biometric data; data protection impact assessment; GDPR; taxonomy; profiling; privacy; digital identity
Title: Emerging biometric modalities and their use: Loopholes in the terminology of the GDPR and resulting privacy risks
Type: Text/Conference Paper
ISSN: 1617-5468