Privacy, Utility, Effort, Transparency and Fairness: Identifying and Swaying Trade-offs in Privacy Preserving Machine Learning through Hybrid Methods
dc.contributor.author | Eleks, Marian | |
dc.contributor.author | Ihler, Jakob | |
dc.contributor.author | Rebstadt, Jonas | |
dc.contributor.author | Kortum-Landwehr, Henrik | |
dc.contributor.author | Thomas, Oliver | |
dc.contributor.editor | Klein, Maike | |
dc.contributor.editor | Krupka, Daniel | |
dc.contributor.editor | Winter, Cornelia | |
dc.contributor.editor | Gergeleit, Martin | |
dc.contributor.editor | Martin, Ludger | |
dc.date.accessioned | 2024-10-21T18:24:23Z | |
dc.date.available | 2024-10-21T18:24:23Z | |
dc.date.issued | 2024 | |
dc.description.abstract | As Artificial Intelligence (AI) permeates most economic sectors, the discipline Privacy Preserving Machine Learning (PPML) gains increasing importance as a way to ensure appropriate handling of sensitive data in the machine learning process. Although PPML-methods stand to provide privacy protection in AI use cases, each one comes with a trade-off. Practitioners applying PPML-methods increasingly request an overview of the types and impacts of these trade-offs. To address this gap in knowledge, this article applies design science research to collect trade-off dimensions and method impacts in an extensive literature review. It then evaluates the specific trade-offs with a focus group of experts and finally constructs an overview of the impact of PPML-methods and their combinations. The final trade-off dimensions are privacy, utility, effort, transparency, and fairness. Seven PPML-methods and their combinations are evaluated according to their impact in these dimensions, resulting in a vast collection of design knowledge and identified research gaps. | en |
dc.identifier.doi | 10.18420/inf2024_02 | |
dc.identifier.isbn | 978-3-88579-746-3 | |
dc.identifier.pissn | 1617-5468 | |
dc.identifier.uri | https://dl.gi.de/handle/20.500.12116/45177 | |
dc.language.iso | en | |
dc.publisher | Gesellschaft für Informatik e.V. | |
dc.relation.ispartof | INFORMATIK 2024 | |
dc.relation.ispartofseries | Lecture Notes in Informatics (LNI) - Proceedings, Volume P-352 | |
dc.subject | Privacy Preserving Machine Learning | |
dc.subject | Trade-off | |
dc.subject | Hybrid Methods | |
dc.subject | Design Science | |
dc.title | Privacy, Utility, Effort, Transparency and Fairness: Identifying and Swaying Trade-offs in Privacy Preserving Machine Learning through Hybrid Methods | en |
dc.type | Text/Conference Paper | |
gi.citation.endPage | 57 | |
gi.citation.publisherPlace | Bonn | |
gi.citation.startPage | 43 | |
gi.conference.date | 24.–26. September 2024 | |
gi.conference.location | Wiesbaden | |
gi.conference.sessiontitle | 5. Privacy & Security at Large Workshop |
Files
Original bundle
- Name: Eleks_et_al_Privacy_Utility_Effort.pdf
- Size: 423.81 KB
- Format: Adobe Portable Document Format