Search Results for author: Xin Lyu

Found 7 papers, 0 papers with code

Lower Bounds for Differential Privacy Under Continual Observation and Online Threshold Queries

no code implementations • 28 Feb 2024 • Edith Cohen, Xin Lyu, Jelani Nelson, Tamás Sarlós, Uri Stemmer

One of the most basic problems for studying the "price of privacy over time" is the so-called private counter problem, introduced by Dwork et al. (2010) and Chan et al. (2010).
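
For readers unfamiliar with the setting: the private counter problem asks for an approximate running count of a 0/1 stream that remains differentially private at every time step. Below is a minimal sketch of the classical binary-tree counter from those 2010 works; the function name, noise scale, and structure are illustrative assumptions, not taken from this paper (which proves lower bounds, not new algorithms).

import numpy as np

def binary_mechanism(stream, epsilon, rng=None):
    # Continual counting via the classic binary-tree ("p-sum") mechanism:
    # each dyadic interval gets Laplace noise of scale levels/epsilon, and
    # the count at time t is assembled from O(log t) noisy partial sums.
    # Illustrative sketch; names and constants are not from the paper.
    rng = rng or np.random.default_rng()
    T = len(stream)
    levels = int(np.ceil(np.log2(T + 1))) + 1
    alpha = [0.0] * levels   # exact partial sums, one per level
    noisy = [0.0] * levels   # their noise-protected versions
    outputs = []
    for t, x in enumerate(stream, start=1):
        i = 0
        while (t >> i) & 1 == 0:   # i = index of the lowest set bit of t
            i += 1
        alpha[i] = sum(alpha[:i]) + x
        for j in range(i):         # lower levels are merged into level i
            alpha[j] = 0.0
            noisy[j] = 0.0
        noisy[i] = alpha[i] + rng.laplace(scale=levels / epsilon)
        # running count = sum of noisy partial sums over the set bits of t
        outputs.append(sum(noisy[j] for j in range(levels) if (t >> j) & 1))
    return outputs

# toy usage: a length-1000 bit stream with epsilon = 1
counts = binary_mechanism([1, 0, 1, 1] * 250, epsilon=1.0)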

The Cost of Parallelizing Boosting

no code implementations • 23 Feb 2024 • Xin Lyu, Hongxun Wu, Junzhao Yang

Karbasi and Larsen showed that "significant" parallelization must incur exponential blow-up: Any boosting algorithm either interacts with the weak learner for $\Omega(1 / \gamma)$ rounds or incurs an $\exp(d / \gamma)$ blow-up in the complexity of training, where $d$ is the VC dimension of the hypothesis class.
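
For context on what a "round" means here: $\gamma$ is the weak learner's advantage, and a plain sequential booster such as AdaBoost calls the weak learner once per round, needing on the order of $\log(1/\epsilon)/\gamma^2$ rounds. The toy sketch below illustrates that round-by-round interaction; the weak-learner interface and all names are assumptions for illustration, not from the paper.

import numpy as np

def stump_learner(X, y, w):
    # Brute-force weak learner: best single-feature threshold under weights w.
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sgn in (1, -1):
                pred = sgn * np.where(X[:, j] <= thr, 1, -1)
                err = np.sum(w * (pred != y))
                if err < best_err:
                    best_err, best = err, (j, thr, sgn)
    j, thr, sgn = best
    return lambda Xq: sgn * np.where(Xq[:, j] <= thr, 1, -1)

def adaboost(X, y, weak_learner, rounds):
    # Plain sequential AdaBoost: one weak-learner interaction per round.
    # Labels are assumed to be in {-1, +1}.  Illustrative sketch only.
    n = len(y)
    w = np.full(n, 1.0 / n)            # distribution over examples
    ensemble = []                      # list of (weight, hypothesis)
    for _ in range(rounds):
        h = weak_learner(X, y, w)
        pred = h(X)
        err = np.clip(np.sum(w * (pred != y)), 1e-12, 1 - 1e-12)
        a = 0.5 * np.log((1 - err) / err)
        ensemble.append((a, h))
        w *= np.exp(-a * y * pred)     # upweight misclassified examples
        w /= w.sum()
    return lambda Xq: np.sign(sum(a * h(Xq) for a, h in ensemble))

# toy usage: labels determined by the sign of the first feature
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = np.sign(X[:, 0])
clf = adaboost(X, y, stump_learner, rounds=20)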

Hot PATE: Private Aggregation of Distributions for Diverse Tasks

no code implementations • 4 Dec 2023 • Edith Cohen, Xin Lyu, Jelani Nelson, Tamas Sarlos, Uri Stemmer

Until now, PATE has primarily been explored with classification-like tasks, where each example possesses a ground-truth label, and knowledge is transferred to the student by labeling public examples.

Privacy Preserving
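
The snippet above refers to the standard, classification-style PATE pipeline, in which each teacher votes on a label for a public example and the student trains on the noisy plurality. A minimal sketch of that baseline aggregation step is shown below; the names and the Laplace noise scale are illustrative assumptions, and this is not the distribution-aggregation mechanism proposed in Hot PATE.

import numpy as np

def noisy_label(teacher_votes, num_classes, epsilon, rng=None):
    # Classic PATE aggregation for one public example: the student receives
    # the argmax of the Laplace-noised histogram of teacher votes.
    # Illustrative sketch of the classification-style baseline only.
    rng = rng or np.random.default_rng()
    counts = np.bincount(teacher_votes, minlength=num_classes).astype(float)
    counts += rng.laplace(scale=1.0 / epsilon, size=num_classes)
    return int(np.argmax(counts))

# toy usage: 250 teachers voting over 10 classes
votes = np.random.default_rng(0).integers(0, 10, size=250)
student_label = noisy_label(votes, num_classes=10, epsilon=2.0)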

Tight Time-Space Lower Bounds for Constant-Pass Learning

no code implementations • 12 Oct 2023 • Xin Lyu, Avishay Tal, Hongxun Wu, Junzhao Yang

In this work, for any constant $q$, we prove tight memory-sample lower bounds for any parity learning algorithm that makes $q$ passes over the stream of samples.
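
For contrast with the bounded-memory regime of the lower bound: with unrestricted memory, parity learning is easy, since storing the samples and running Gaussian elimination over GF(2) recovers the hidden vector at a cost of roughly $n^2$ bits of memory. The toy baseline below illustrates that memory-hungry approach; it is not an algorithm from the paper.

import numpy as np

def solve_parity(A, b):
    # Recover x from samples (a, <a, x> mod 2) by Gaussian elimination over
    # GF(2).  Storing the whole system costs about n^2 bits of memory, the
    # regime that the streaming lower bounds rule out for small memory.
    # Toy baseline for illustration; not an algorithm from the paper.
    A, b = A.copy() % 2, b.copy() % 2
    m, n = A.shape
    row = 0
    for col in range(n):
        pivot = next((r for r in range(row, m) if A[r, col]), None)
        if pivot is None:
            continue
        A[[row, pivot]], b[[row, pivot]] = A[[pivot, row]], b[[pivot, row]]
        for r in range(m):
            if r != row and A[r, col]:
                A[r] ^= A[row]
                b[r] ^= b[row]
        row += 1
    x = np.zeros(n, dtype=np.uint8)
    for r in range(row):               # each pivot row fixes one coordinate
        x[np.argmax(A[r])] = b[r]
    return x

# toy usage: hidden parity over n = 32 bits, 2n random samples
rng = np.random.default_rng(1)
n = 32
x_true = rng.integers(0, 2, n, dtype=np.uint8)
A = rng.integers(0, 2, (2 * n, n), dtype=np.uint8)
b = ((A @ x_true) % 2).astype(np.uint8)
x_hat = solve_parity(A, b)   # equals x_true with overwhelming probability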

On the Robustness of CountSketch to Adaptive Inputs

no code implementations • 28 Feb 2022 • Edith Cohen, Xin Lyu, Jelani Nelson, Tamás Sarlós, Moshe Shechner, Uri Stemmer

CountSketch is a popular dimensionality reduction technique that maps vectors to a lower dimension using randomized linear measurements.

Dimensionality Reduction
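
The description above (randomized linear measurements with per-coordinate recovery) corresponds to the classic, non-adaptive CountSketch. A toy sketch of that data structure is given below; parameter choices and names are illustrative, and it provides none of the adaptive-robustness guarantees studied in the paper.

import numpy as np

class CountSketch:
    # Toy CountSketch: an n-dimensional vector is mapped to rows x buckets
    # counters via random bucket hashes and random signs (randomized linear
    # measurements); per-coordinate estimates take a median over the rows.
    # Illustrative only; no adaptive-robustness guarantees.

    def __init__(self, n, rows=5, buckets=256, seed=0):
        rng = np.random.default_rng(seed)
        self.bucket = rng.integers(0, buckets, size=(rows, n))   # h_i(j)
        self.sign = rng.choice([-1, 1], size=(rows, n))          # s_i(j)
        self.table = np.zeros((rows, buckets))

    def update(self, j, delta=1.0):
        # add `delta` to coordinate j of the underlying vector
        for i in range(self.table.shape[0]):
            self.table[i, self.bucket[i, j]] += self.sign[i, j] * delta

    def estimate(self, j):
        # median-of-rows estimate of coordinate j
        return float(np.median([self.sign[i, j] * self.table[i, self.bucket[i, j]]
                                for i in range(self.table.shape[0])]))

# toy usage: one heavy coordinate plus light noise
cs = CountSketch(n=10_000)
for _ in range(500):
    cs.update(42)
for j in range(1000):
    cs.update(j, 0.1)
print(cs.estimate(42))   # roughly 500, up to small collision error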
