Author: Czarnocki, Jan
Editors: Brömme, Arslan; Busch, Christoph; Damer, Naser; Dantcheva, Antitza; Gomez-Barrero, Marta; Raja, Kiran; Rathgeb, Christian; Sequeira, Ana; Uhl, Andreas
Date: 2021-10-04 (2021)
ISBN: 978-3-88579-709-8
URI: https://dl.gi.de/handle/20.500.12116/37452
Title: Will new definitions of emotion recognition and biometric data hamper the objectives of the proposed AI Act?
Type: Text/Conference Paper
Language: en
Keywords: biometric data; emotion recognition; AI Act; AI; data protection; privacy; biometrics
ISSN: 1617-5468

Abstract: The paper explains how the definition of biometric data copied from the GDPR may hamper the regulation of emotion recognition as defined in the proposed AI Act. A definition replicated from the GDPR suits biometric systems, but not emotion recognition technologies. Under the proposed AI Act, a system qualifies as an emotion recognition system only if it processes biometric data as defined in the GDPR. Yet the GDPR definition does not cover all data that are technically biometric and are processed in emotion recognition systems. Moreover, the proposed AI Act's definition of emotion recognition fails to capture systems that do not rely on biometric data processing. As a result, the Act's obligation for users to inform natural persons of their exposure to an emotion recognition system is inapplicable in most cases. The flawed definition may also put at risk the Act's assessment of whether AI systems should be prohibited. Therefore, new definitions of emotion recognition and biometric data are needed.