DIFFERENTIAL PRIVACY GUARANTEES FOR ANALYTICS AND MACHINE LEARNING ON GRAPHS: A SURVEY OF RESULTS

Tamara T. Mueller, Dmitrii Usynin, Johannes C. Paetzold, Rickmer Braren, Daniel Rueckert, Georgios Kaissis

Research output: Contribution to journal › Article › peer-review


Abstract

We study differential privacy (DP) in the context of graph-structured data and discuss its formulations and applications to the publication of graphs and their associated statistics, graph generation methods, and machine learning on graph-based data, including graph neural networks (GNNs). Interpreting DP guarantees in the context of graph-structured data can be challenging, as individual data points are interconnected (often non-linearly or sparsely). This differentiates graph databases from the tabular databases typically considered in DP, and complicates related concepts such as the derivation of per-sample gradients in GNNs. The problem is exacerbated by the absence of a single, well-established formulation of DP in graph settings. This lack of prior systematisation work motivated us to study graph-based learning from a privacy perspective. In this work, we systematise the different formulations of DP on graphs and discuss challenges and promising applications, including in the GNN domain. We compare works and separate them into methods that privately estimate graph data (either through statistical analysis or using GNNs) and methods that aim to generate new graph data. We conclude with a discussion of open questions and potential directions for further research in this area.
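To make the distinction between DP formulations on graphs concrete, the following is a minimal sketch (not taken from the surveyed paper) of the classic Laplace mechanism applied to an edge-count query under edge-level DP, where neighbouring graphs differ in exactly one edge, so the query's global sensitivity is 1. The function names are illustrative, not from any specific library.

```python
import math
import random


def laplace_noise(scale: float) -> float:
    """Draw one sample from a centred Laplace distribution via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))


def private_edge_count(num_edges: int, epsilon: float) -> float:
    """Release a graph's edge count under edge-level epsilon-DP.

    Under edge-level DP, neighbouring graphs differ in a single edge, so the
    edge-count query has global sensitivity 1, and Laplace noise with scale
    1/epsilon yields an epsilon-DP release.
    """
    return num_edges + laplace_noise(1.0 / epsilon)
```

Under node-level DP, by contrast, neighbouring graphs differ in one node and all of its incident edges, so the same query has sensitivity up to n - 1 in a graph with n nodes and requires correspondingly more noise; this gap is one reason the choice of DP formulation on graphs matters.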

Original language: English
Journal: Journal of Privacy and Confidentiality
Volume: 14
Issue number: 1
DOIs
State: Published - 2024

Keywords

  • differential privacy
  • graph analytics
  • graph neural networks
  • graph-structured data
