Search Results for author: Friederike Jungmann

Found 5 papers, 2 papers with code

How Do Input Attributes Impact the Privacy Loss in Differential Privacy?

no code implementations · 18 Nov 2022 · Tamara T. Mueller, Stefan Kolek, Friederike Jungmann, Alexander Ziller, Dmitrii Usynin, Moritz Knolle, Daniel Rueckert, Georgios Kaissis

Differential privacy (DP) is typically formulated as a worst-case privacy guarantee over all individuals in a database.
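The worst-case character of this guarantee can be illustrated with a small sketch: the global sensitivity of a query is the maximum change any single individual's record can induce, taken over all neighbouring databases. The brute-force example below (a hypothetical illustration, not code from the paper) computes it for a bounded mean query under the replace-one-record notion of neighbouring.

```python
def mean_query(db):
    return sum(db) / len(db)

def global_sensitivity(db, domain):
    """Worst-case change in the query output when any one record is
    replaced by any value in the domain (bounded-DP neighbourhood).
    The max over individuals is what makes the guarantee worst-case."""
    baseline = mean_query(db)
    worst = 0.0
    for i in range(len(db)):
        for v in domain:
            neighbour = list(db)
            neighbour[i] = v
            worst = max(worst, abs(baseline - mean_query(neighbour)))
    return worst

db = [0.2, 0.9, 0.4, 0.7]  # records clipped to [0, 1]
print(global_sensitivity(db, domain=[0.0, 1.0]))  # ≈ 0.225 for this dataset
```

Note that the individual holding the extreme record (0.9 here) determines the bound, even though most records would perturb the mean far less — which is exactly the gap between worst-case and individual privacy accounting that motivates the paper's question.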

Exploiting segmentation labels and representation learning to forecast therapy response of PDAC patients

no code implementations · 8 Nov 2022 · Alexander Ziller, Ayhan Can Erdur, Friederike Jungmann, Daniel Rueckert, Rickmer Braren, Georgios Kaissis

The prediction of pancreatic ductal adenocarcinoma therapy response is a clinically challenging and important task in this high-mortality tumour entity.

Representation Learning

A unified interpretation of the Gaussian mechanism for differential privacy through the sensitivity index

no code implementations · 22 Sep 2021 · Georgios Kaissis, Moritz Knolle, Friederike Jungmann, Alexander Ziller, Dmitrii Usynin, Daniel Rueckert

$\psi$ uniquely characterises the GM and its properties by encapsulating its two fundamental quantities: the sensitivity of the query and the magnitude of the noise perturbation.
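A minimal sketch of the two quantities $\psi$ combines, assuming $\psi$ is the ratio of the query's L2-sensitivity to the noise standard deviation (this reading of the paper's definition, and the classical calibration formula used below, are assumptions, not code from the paper):

```python
import math

def gaussian_sigma(sensitivity, epsilon, delta):
    """Classical Gaussian-mechanism calibration (valid for epsilon <= 1):
    sigma = Delta_2 * sqrt(2 * ln(1.25 / delta)) / epsilon."""
    return sensitivity * math.sqrt(2 * math.log(1.25 / delta)) / epsilon

def sensitivity_index(sensitivity, sigma):
    # Assumed definition: psi = Delta_2 / sigma, a signal-to-noise ratio
    # that jointly captures query sensitivity and noise magnitude.
    return sensitivity / sigma

sigma = gaussian_sigma(sensitivity=1.0, epsilon=0.5, delta=1e-5)
psi = sensitivity_index(1.0, sigma)
print(sigma, psi)
```

Because both the sensitivity and the noise scale enter only through their ratio, any two mechanisms with the same $\psi$ behave identically in this view, which is what "uniquely characterises" suggests.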

Partial sensitivity analysis in differential privacy

1 code implementation · 22 Sep 2021 · Tamara T. Mueller, Alexander Ziller, Dmitrii Usynin, Moritz Knolle, Friederike Jungmann, Daniel Rueckert, Georgios Kaissis

However, while techniques such as individual Rényi DP (RDP) allow for granular, per-person privacy accounting, few works have investigated the impact of each input feature on the individual's privacy loss.
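One way to make the per-feature question concrete: under RDP accounting, an individual's privacy loss grows with the norm of their per-example gradient, so a crude feature-level attribution is each feature's share of that squared norm. The sketch below is a hypothetical illustration of this idea, not the partial-sensitivity method the paper proposes.

```python
def feature_shares(grad):
    """Attribute an individual's squared per-example gradient norm
    (a proxy for their privacy loss under individual RDP accounting)
    to the input features that produced it. Hypothetical illustration."""
    total = sum(g * g for g in grad)
    return [g * g / total for g in grad]

# A 2-feature example: the second feature dominates the gradient,
# so it accounts for most of this individual's privacy loss.
shares = feature_shares([3.0, 4.0])
print(shares)  # [0.36, 0.64]
```

The paper's partial-sensitivity analysis is finer-grained than this proxy, but the example shows the shape of the question: which input attributes does an individual "pay" for.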

Image Classification
