Search Results for author: Mi Jung Park

Found 1 paper, 1 paper with code

Differentially Private Kernel Inducing Points using features from ScatterNets (DP-KIP-ScatterNet) for Privacy Preserving Data Distillation

2 code implementations • 31 Jan 2023 • Margarita Vinaroz, Mi Jung Park

Data distillation aims to generate a small data set that closely mimics the performance of a given learning algorithm on the original data set.
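To illustrate the idea of data distillation described above, here is a minimal, non-private sketch of Kernel Inducing Points (KIP)-style distillation in JAX: a small synthetic set is optimized so that kernel ridge regression fit on it predicts the original data well. This is not the paper's DP-KIP-ScatterNet method (no differential privacy, and an RBF kernel stands in for ScatterNet/NTK features); the function names, kernel choice, and hyperparameters are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

def rbf_kernel(A, B, lengthscale=1.0):
    # Pairwise squared distances -> RBF (Gaussian) kernel matrix.
    sq = jnp.sum(A**2, 1)[:, None] + jnp.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return jnp.exp(-sq / (2 * lengthscale**2))

def kip_loss(params, X, y, reg=1e-6):
    # Kernel ridge regression trained on the small distilled set (Xs, ys),
    # evaluated on the full original data (X, y).
    Xs, ys = params
    K_ss = rbf_kernel(Xs, Xs)
    K_ts = rbf_kernel(X, Xs)
    alpha = jnp.linalg.solve(K_ss + reg * jnp.eye(Xs.shape[0]), ys)
    preds = K_ts @ alpha
    return jnp.mean((preds - y) ** 2)

def distill(X, y, n_distilled=10, steps=500, lr=0.1, seed=0):
    # Gradient descent on the distilled inputs and targets themselves.
    key = jax.random.PRNGKey(seed)
    k1, k2 = jax.random.split(key)
    Xs = jax.random.normal(k1, (n_distilled, X.shape[1]))
    ys = jax.random.normal(k2, (n_distilled, y.shape[1]))
    params = (Xs, ys)
    grad_fn = jax.jit(jax.grad(kip_loss))
    for _ in range(steps):
        grads = grad_fn(params, X, y)
        params = tuple(p - lr * g for p, g in zip(params, grads))
    return params  # distilled inputs and targets
```

For example, `distill(X_train, y_train_onehot, n_distilled=10)` would return 10 synthetic input/target pairs whose kernel ridge regression fit approximates training on the full data set.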

Privacy Preserving • regression
