Towards a Differential Privacy Theory for Edge-Labeled Directed Graphs

dc.contributor.authorReuben, Jenni
dc.contributor.editorLangweg, Hanno
dc.contributor.editorMeier, Michael
dc.contributor.editorWitt, Bernhard C.
dc.contributor.editorReinhardt, Delphine
dc.date.accessioned2018-03-22T12:40:42Z
dc.date.available2018-03-22T12:40:42Z
dc.date.issued2018
dc.description.abstractIncreasingly, information is represented as graphs, for example social network data, financial transactions, and semantic assertions in the Semantic Web. Mining such data about people for useful insights has enormous social and commercial benefits. However, the privacy of the individuals in the datasets is a major concern. The challenge is therefore to enable analyses over a dataset while preserving the privacy of the individuals in it. Differential privacy is a privacy model that offers a rigorous definition of privacy: from the released results of an analysis it is 'difficult' to determine whether or not an individual contributed to those results. The differential privacy model has been studied extensively in the context of relational databases, and there is growing interest in adapting it to graph data. Previous research on applying differential privacy to graphs focuses on unlabeled graphs. In many applications, however, graphs have labeled edges, and analyses that take these labels into account can be more expressive. It is therefore of interest to study the adaptation of differential privacy to edge-labeled directed graphs. In this paper, we present our foundational work towards that aim. First, we present three variant notions of an individual's information being/not being in the analyzed graph, which is the basis for formalizing the differential privacy guarantee. Next, we present our plan to study particular graph statistics under the differential privacy model, given the choice of the notion that represents the individual's information being/not being in the analyzed graph.en
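
For reference, a minimal sketch of the epsilon-differential privacy guarantee the abstract refers to, as it is commonly stated in the literature (the paper's own graph-specific neighboring notions for edge-labeled directed graphs are not reproduced here): a randomized mechanism \mathcal{M} is \varepsilon-differentially private if, for all neighboring inputs D and D' and every set S of possible outputs,

\Pr[\mathcal{M}(D) \in S] \le e^{\varepsilon} \cdot \Pr[\mathcal{M}(D') \in S].

For graph data, the choice of what counts as "neighboring" (e.g., graphs differing in one node or one edge) determines exactly which information the guarantee protects, which is why the three variant notions of an individual's information being/not being in the analyzed graph are foundational to the work described above.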
dc.identifier.doi10.18420/sicherheit2018_24
dc.identifier.isbn978-3-88579-675-6
dc.identifier.pissn1617-5468
dc.identifier.urihttps://dl.gi.de/handle/20.500.12116/16290
dc.language.isoen
dc.publisherGesellschaft für Informatik e.V.
dc.relation.ispartofSICHERHEIT 2018
dc.relation.ispartofseriesLecture Notes in Informatics (LNI) - Proceedings, Volume P-281
dc.subjectDifferential privacy
dc.subjectgraphs
dc.subjectlabels
dc.subjectanalyze
dc.subjectutility
dc.titleTowards a Differential Privacy Theory for Edge-Labeled Directed Graphsen
dc.typeText/Conference Paper
gi.citation.endPage278
gi.citation.publisherPlaceBonn
gi.citation.startPage273
gi.conference.date24. April 2018
gi.conference.locationKonstanz, Germany
gi.conference.sessiontitleDoktorandenforum

Files

Original bundle
1 - 1 of 1
Name: sicherheit2018-24.pdf
Size: 254 KB
Format: Adobe Portable Document Format