no code implementations • ICML 2020 • Blair Bilodeau, Dylan Foster, Daniel Roy
We study the classical problem of forecasting under logarithmic loss while competing against an arbitrary class of experts.
no code implementations • ICML 2020 • Jeffrey Negrea, Daniel Roy, Gintare Karolina Dziugaite
At the same time, we bound the risk of ĥ in terms of a surrogate that is constructed by conditioning and shown to belong to a nonrandom class with uniformly small generalization error.
1 code implementation • NeurIPS 2020 • Fartash Faghri, Iman Tabrizian, Ilia Markov, Dan Alistarh, Daniel Roy, Ali Ramezani-Kebrya
Many communication-efficient variants of SGD use gradient quantization schemes.
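To illustrate the kind of scheme referred to here, below is a minimal sketch of stochastic uniform gradient quantization in the style of QSGD. All names (`quantize`, `levels`) are hypothetical; this is not the paper's method, only a generic example of an unbiased quantizer that reduces the bits needed to communicate a gradient.

```python
import numpy as np

def quantize(g, levels=4, rng=None):
    """Hypothetical QSGD-style quantizer: stochastically rounds each
    coordinate of g to one of `levels` uniform levels, scaled by ||g||.
    The rounding probabilities are chosen so that E[quantize(g)] = g."""
    rng = rng or np.random.default_rng(0)
    norm = np.linalg.norm(g)
    if norm == 0.0:
        return np.zeros_like(g)
    scaled = np.abs(g) / norm * levels       # map |g_i| into [0, levels]
    lower = np.floor(scaled)                 # nearest level below
    prob = scaled - lower                    # round up with this probability
    q = lower + (rng.random(g.shape) < prob) # stochastic rounding
    return np.sign(g) * q * norm / levels    # rescale back

g = np.array([0.5, -1.0, 0.25])
gq = quantize(g)  # a low-precision, unbiased estimate of g
```

Each worker would send only the norm, signs, and integer levels instead of full-precision floats; averaging many such quantized gradients recovers the true gradient in expectation.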