Browsing by author "Peter, Andreas"
Results 1 - 2 of 2
- Conference paper: "Bloom Filter vs Homomorphic Encryption: Which approach protects the biometric data and satisfies ISO/IEC 24745?" (BIOSIG 2021 - Proceedings of the 20th International Conference of the Biometrics Special Interest Group, 2021). Authors: Bassit, Amina; Hahn, Florian; Zeinstra, Chris; Veldhuis, Raymond; Peter, Andreas.
  Abstract: Bloom filters (BF) and homomorphic encryption (HE) are popular modern techniques used to design biometric template protection (BTP) schemes, which aim to protect sensitive biometric information during storage and during the comparison process. In practice, however, many BF- or HE-based BTP schemes violate at least one of the privacy requirements of the international standard ISO/IEC 24745: irreversibility, unlinkability, and confidentiality. In this paper, we investigate state-of-the-art BTP schemes based on these two approaches and assess their relative strengths and weaknesses with respect to the three requirements of ISO/IEC 24745. Our investigation shows that the choice between BF and HE depends on the setting in which the BTP scheme will be deployed and on the trustworthiness of the parties that process the protected template. In particular, HE enhanced with verifiable computation techniques can satisfy the privacy requirements of ISO/IEC 24745 in a trustless setting. (An illustrative Bloom-filter sketch follows this listing.)
- Conference paper: "Privacy-preserving verification of clinical research" (Sicherheit 2014 – Sicherheit, Schutz und Zuverlässigkeit, 2014). Authors: Makri, Eleftheria; Everts, Maarten H.; Hoogh, Sebastiaan De; Peter, Andreas; Akker, Harm Op Den; Hartel, Pieter; Jonker, Willem.
  Abstract: We treat the problem of privacy-preserving statistics verification in clinical research. We show that, given aggregated results from statistical calculations, we can verify their correctness efficiently without revealing any of the private inputs used in the calculation. Our construction is based on the primitive of secure multi-party computation from Shamir's secret sharing. Our setting involves three parties: a hospital, which owns the private inputs; a clinical researcher, who lawfully processes the sensitive data to produce an aggregated statistical result; and a third party (usually several verifiers) assigned to verify this result for reliability and transparency reasons. Our solution guarantees that these verifiers learn only the aggregated results (and what can be inferred from them about the underlying private data) and nothing more. By taking advantage of the particular scenario at hand (where certain intermediate results, e.g., the mean over the dataset, are available in the clear) and by utilizing secret-sharing primitives, our approach turns out to be practically efficient, which we underpin with several experiments on real patient data. Our results show that privacy-preserving verification of the most commonly used statistical operations in clinical research is an important use case in which secure multi-party computation becomes employable in practice. (A minimal Shamir secret-sharing sketch follows below.)
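
To make the Bloom-filter approach from the first entry more concrete, here is a minimal, illustrative Python sketch of a Bloom-filter-style protected template and its comparison. It is not the scheme evaluated in the paper: the function names (`protect_template`, `compare`), the parameters `m` and `k`, and the toy feature blocks are assumptions made purely for illustration.

```python
import hashlib

def protect_template(feature_blocks, m=1024, k=3):
    # Hypothetical sketch: map each biometric feature block into a Bloom
    # filter of m bits using k hash positions derived from SHA-256.
    # The resulting bit array stands in for the protected template.
    bf = [0] * m
    for block in feature_blocks:
        for i in range(k):
            digest = hashlib.sha256(f"{i}:{block}".encode()).hexdigest()
            bf[int(digest, 16) % m] = 1
    return bf

def compare(bf_a, bf_b):
    # Jaccard-style dissimilarity between two protected templates:
    # 0.0 for identical filters, approaching 1.0 for disjoint ones.
    inter = sum(a & b for a, b in zip(bf_a, bf_b))
    union = sum(a | b for a, b in zip(bf_a, bf_b))
    return 1 - inter / union if union else 0.0

# Enrolment and verification with toy feature blocks (made-up values).
enrolled = protect_template(["0110", "1010", "0011"])
probe = protect_template(["0110", "1010", "0111"])
print(f"dissimilarity: {compare(enrolled, probe):.3f}")
```

Because the Bloom filter is a lossy, many-to-one mapping, comparison can run directly on protected templates; whether a concrete instantiation actually meets the irreversibility and unlinkability requirements of ISO/IEC 24745 is exactly the question the paper examines.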
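The second entry builds its verification protocol on secure multi-party computation from Shamir's secret sharing. As a minimal sketch of that primitive only (not of the paper's verification protocol), the following Python snippet splits a value into shares over a prime field and reconstructs it via Lagrange interpolation; the field size, threshold, and party count are illustrative assumptions.

```python
import random

PRIME = 2**61 - 1  # a Mersenne prime; all arithmetic is in this prime field

def share(secret, n, t):
    # Split `secret` into n shares such that any t of them reconstruct it,
    # using a random polynomial of degree t-1 with the secret as constant term.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

# Example: share a value among 3 parties with threshold 2 (hypothetical parameters).
shares = share(42, n=3, t=2)
print(reconstruct(shares[:2]))  # -> 42
```

In a setting like the one described in the abstract, each verifier would hold only shares of the hospital's private inputs, so learning anything beyond the agreed aggregated result would require a qualified set of parties to cooperate.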