1 code implementation • ICLR 2022 • Ke Alexander Wang, Niladri S. Chatterji, Saminul Haque, Tatsunori Hashimoto
Importance weighting loses its effect on overparameterized interpolating models trained with exponentially-tailed losses such as the logistic loss. As a remedy, we show that polynomially-tailed losses restore the effects of importance reweighting in correcting distribution shift in overparameterized models.
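To make the idea concrete, below is a minimal PyTorch sketch of an importance-weighted empirical risk built on a polynomially-tailed margin loss. This is an illustrative assumption, not the paper's exact construction: the tail `1 / (beta + margin)**alpha`, the linear branch for negative margins, and the parameters `alpha` and `beta` are placeholders chosen only to show what "polynomially-tailed" means, and the importance weights are assumed to be given.

```python
import torch

def poly_tailed_loss(margins, alpha=1.0, beta=1.0):
    # Illustrative polynomially-tailed margin loss: it decays like
    # 1 / margin**alpha for large positive margins instead of exponentially,
    # so the importance weights are not washed out by the interpolating fit.
    # (Hypothetical form; the paper's exact loss may differ.)
    left = 1.0 / beta ** alpha - (alpha / beta ** (alpha + 1)) * margins  # linear penalty for negative margins
    right = 1.0 / (beta + margins.clamp(min=0.0)) ** alpha               # polynomial tail for non-negative margins
    return torch.where(margins < 0, left, right)

def importance_weighted_risk(logits, labels, weights, alpha=1.0, beta=1.0):
    # labels are in {-1, +1}; weights[i] is an importance ratio
    # p_target(x_i, y_i) / p_source(x_i, y_i), assumed to be supplied
    # (e.g. estimated from group frequencies).
    margins = labels * logits
    return (weights * poly_tailed_loss(margins, alpha, beta)).mean()

# Tiny usage example with random data.
logits = torch.randn(8, requires_grad=True)
labels = torch.sign(torch.randn(8))
weights = torch.ones(8)
importance_weighted_risk(logits, labels, weights).backward()
```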
1 code implementation • NeurIPS 2019 • Qiyang Li, Saminul Haque, Cem Anil, James Lucas, Roger Grosse, Jörn-Henrik Jacobsen
Our block convolution orthogonal parameterization (BCOP) allows us to train large convolutional networks with provable Lipschitz bounds.
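As a rough illustration of how composing layers with known Lipschitz constants yields a provable end-to-end bound, here is a sketch of a 1-Lipschitz fully connected layer whose weight is kept orthogonal via the Cayley transform. This is a hypothetical stand-in for intuition only, not the BCOP construction itself, which builds orthogonal convolutions; the class name and dimensions are made up for the example.

```python
import torch
import torch.nn as nn

class CayleyOrthogonalLinear(nn.Module):
    # The weight is the Cayley transform (I + A)^{-1}(I - A) of a
    # skew-symmetric matrix A, which is always orthogonal, so the layer's
    # spectral norm (and hence Lipschitz constant) is exactly 1.
    # (Illustrative stand-in; BCOP parameterizes orthogonal *convolutions*.)
    def __init__(self, dim):
        super().__init__()
        self.raw = nn.Parameter(0.01 * torch.randn(dim, dim))
        self.bias = nn.Parameter(torch.zeros(dim))

    def forward(self, x):
        a = self.raw - self.raw.T                                  # skew-symmetric
        eye = torch.eye(a.shape[0], device=a.device, dtype=a.dtype)
        w = torch.linalg.solve(eye + a, eye - a)                   # orthogonal weight
        return x @ w.T + self.bias

# Composing 1-Lipschitz layers with 1-Lipschitz activations gives a network
# whose end-to-end Lipschitz constant is provably at most 1. ReLU is used here
# only because it is 1-Lipschitz and built in; the paper pairs its orthogonal
# convolutions with gradient-norm-preserving activations such as GroupSort.
net = nn.Sequential(CayleyOrthogonalLinear(16), nn.ReLU(), CayleyOrthogonalLinear(16))
out = net(torch.randn(4, 16))
```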