Tuning the hyperparameters of differentially private stochastic gradient descent (DPSGD) is a fundamental challenge.
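To make the tuning problem concrete, the following is a minimal sketch of one DPSGD update; the function name, default values, and seed handling are illustrative assumptions, not any paper's exact method. The arguments `lr`, `clip_norm`, and `noise_multiplier` are the kind of hyperparameters whose tuning the sentence refers to.

```python
import numpy as np

def dpsgd_step(params, per_example_grads, lr=0.1, clip_norm=1.0,
               noise_multiplier=1.1, seed=0):
    # Clip each per-example gradient to L2 norm <= clip_norm.
    clipped = [g * min(1.0, clip_norm / (np.linalg.norm(g) + 1e-12))
               for g in per_example_grads]
    # Average the clipped gradients.
    mean_grad = np.mean(clipped, axis=0)
    # Add Gaussian noise calibrated to the clipping bound and batch size.
    noise = np.random.default_rng(seed).normal(
        0.0, noise_multiplier * clip_norm / len(clipped),
        size=mean_grad.shape)
    # Plain SGD step on the noisy, clipped gradient.
    return params - lr * (mean_grad + noise)

# One update on hypothetical per-example gradients.
p_new = dpsgd_step(np.zeros(4), [np.full(4, 10.0), np.ones(4)])
```

The privacy guarantee (and the utility) depends jointly on `clip_norm`, `noise_multiplier`, the batch size, and the number of steps, which is why these hyperparameters cannot be tuned independently.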
In this paper, we introduce theoretically-motivated measures to quantify information leakage in both attack-dependent and attack-independent manners.
It is known that deep neural networks, trained for the classification of non-sensitive target attributes, can reveal sensitive attributes of their input data through internal representations extracted by the classifier.
While rich medical datasets are hosted in hospitals distributed across the world, concerns about patients' privacy are a barrier to using such data to train deep neural networks (DNNs) for medical diagnostics.
Training deep neural networks via federated learning allows clients to share, instead of the original data, only the model trained on their data.
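The aggregation step behind this idea can be sketched as a FedAvg-style weighted average of client models; the function name and the two-client example below are illustrative assumptions, not a specific system's API.

```python
import numpy as np

def federated_average(client_weights, client_sizes):
    # Server-side aggregation: average client model weights, weighted by
    # local dataset size. Only the weights are shared; raw data stays local.
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Two clients train locally and share only their model parameters.
global_w = federated_average([np.array([1.0, 2.0]), np.array([3.0, 4.0])],
                             [10, 30])
# 0.25 * [1, 2] + 0.75 * [3, 4] = [2.5, 3.5]
```

Weighting by dataset size keeps the aggregate close to what centralized training on the pooled data would produce, without any client revealing its records.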
In this paper we show that the data plane of commodity programmable network interface cards (NICs) can run the neural network inference tasks required by packet-monitoring applications with low overhead.
We introduce a dimension-adaptive pooling (DAP) layer that makes DNNs flexible and more robust to changes in sensor availability and in sampling rate.
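A minimal NumPy sketch of the pooling idea, under the assumption (ours, not the source's) that DAP reduces a variable-length time axis to a fixed number of bins by average-pooling, so downstream layers always see a constant shape regardless of sampling rate:

```python
import numpy as np

def dimension_adaptive_pool(x, out_len=8):
    # x has shape (channels, time); time may vary with sampling rate.
    c, t = x.shape
    # Split the time axis into out_len roughly equal bins and average each,
    # yielding a fixed (channels, out_len) output.
    edges = np.linspace(0, t, out_len + 1).astype(int)
    return np.stack([x[:, edges[i]:edges[i + 1]].mean(axis=1)
                     for i in range(out_len)], axis=1)

# Windows captured at different sampling rates collapse to the same shape.
fast = dimension_adaptive_pool(np.random.randn(3, 100))  # e.g. 100 Hz window
slow = dimension_adaptive_pool(np.random.randn(3, 25))   # e.g. 25 Hz window
```

Because the output shape no longer depends on the input length, the same trained network can serve devices with different sensor configurations.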
Sensitive inferences and user re-identification are major threats to privacy when raw sensor data from wearable or portable devices are shared with cloud-assisted applications.
Motion sensors such as accelerometers and gyroscopes measure the instantaneous acceleration and rotation of a device, in three dimensions.
Results show that the proposed framework maintains the usefulness of the transformed data for activity recognition, with an average loss of only around three percentage points, while reducing the accuracy of gender classification from more than 90\% on raw sensor data to around 50\%, i.e., random guessing.
Though access to sensory data is critical to the success of many beneficial applications, such as health monitoring or activity recognition, it can also reveal a wide range of potentially sensitive information about individuals that cannot easily be protected using traditional privacy approaches.