Search Results for author: Junzhao Yang

Found 2 papers, 0 papers with code

The Cost of Parallelizing Boosting

no code implementations • 23 Feb 2024 • Xin Lyu, Hongxun Wu, Junzhao Yang

Karbasi and Larsen showed that "significant" parallelization must incur an exponential blow-up: any boosting algorithm either interacts with the weak learner for $\Omega(1/\gamma)$ rounds or incurs an $\exp(d/\gamma)$ blow-up in the complexity of training, where $\gamma$ is the weak learner's advantage and $d$ is the VC dimension of the hypothesis class.
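
A minimal sequential sketch helps show where the round complexity enters. The AdaBoost-style loop below is a generic illustration, not the paper's construction: each round's sample weights depend on the previous round's hypothesis, so the calls to the weak learner form a data-dependent chain that resists parallelization. The `weak_learner` callable and the $\{-1,+1\}$ label convention are assumptions made for the sketch.

```python
import numpy as np

def boost(weak_learner, X, y, rounds):
    """AdaBoost-style sequential boosting sketch.

    Assumes labels y in {-1, +1} and that weak_learner(X, y, w) returns a
    hypothesis h with h(X) in {-1, +1} and weighted error below 1/2.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)                # distribution over samples
    hypotheses, alphas = [], []
    for _ in range(rounds):
        h = weak_learner(X, y, w)          # round t sees weights from round t-1
        pred = h(X)
        eps = w[pred != y].sum()           # weighted error; edge gamma = 1/2 - eps
        alpha = 0.5 * np.log((1.0 - eps) / max(eps, 1e-12))
        w = w * np.exp(-alpha * y * pred)  # upweight mistakes, downweight hits
        w /= w.sum()
        hypotheses.append(h)
        alphas.append(alpha)
    # weighted-majority vote of all rounds' hypotheses
    return lambda X: np.sign(sum(a * h(X) for a, h in zip(alphas, hypotheses)))
```

The sequential dependence is the line updating `w` from the current round's predictions; any attempt to run rounds in parallel must somehow break this chain, which is exactly the regime the lower bound addresses.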

Tight Time-Space Lower Bounds for Constant-Pass Learning

no code implementations • 12 Oct 2023 • Xin Lyu, Avishay Tal, Hongxun Wu, Junzhao Yang

In this work, for any constant $q$, we prove tight memory-sample lower bounds for any parity learning algorithm that makes $q$ passes over the stream of samples.
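
For intuition about the memory side of the tradeoff, here is a rough baseline sketch (an illustration under assumed conventions, not the paper's algorithm): online Gaussian elimination over GF(2) learns a hidden parity $x \in \{0,1\}^n$ from $O(n)$ samples in one pass, but it stores up to $n$ reduced equations, i.e. about $n^2$ bits of memory; the paper's lower bounds show that, even with constantly many passes, using substantially less memory forces far more samples. The stream format `(a, b)` with $b = \langle a, x \rangle \bmod 2$ and the function name are assumptions.

```python
import numpy as np

def learn_parity(stream, n):
    """One-pass parity learning via online Gaussian elimination over GF(2).

    Keeps a fully reduced basis: pivot column j -> (row, bit), where row has
    a 1 at column j and 0 at every other pivot column. Storing up to n rows
    of n bits each is the ~n^2 memory footprint the lower bound targets.
    """
    basis = {}
    for a, b in stream:
        a = a.copy()
        for j, (row, bit) in basis.items():   # reduce sample by current basis
            if a[j]:
                a ^= row
                b ^= bit
        nz = np.flatnonzero(a)
        if nz.size == 0:                      # linearly dependent; learn nothing
            continue
        p = int(nz[0])                        # new pivot column
        for j, (row, bit) in list(basis.items()):
            if row[p]:                        # keep stored rows reduced at p
                basis[j] = (row ^ a, bit ^ b)
        basis[p] = (a, b)
        if len(basis) == n:                   # full rank: x is determined
            break
    x = np.zeros(n, dtype=np.uint8)
    for j, (row, bit) in basis.items():       # at full rank each row is e_j
        x[j] = bit
    return x

# hypothetical usage: recover a random hidden parity from random samples
rng = np.random.default_rng(0)
n = 16
x_true = rng.integers(0, 2, n, dtype=np.uint8)
stream = ((a, int(a @ x_true) % 2)
          for a in (rng.integers(0, 2, n, dtype=np.uint8) for _ in range(200)))
print(np.array_equal(learn_parity(stream, n), x_true))  # True w.h.p.
```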
