no code implementations • 20 Jun 2021 • Matteo Sordello, Zhiqi Bu, Jinshuo Dong
We then analyze the online setting and provide a faster decaying scheme for the magnitude of the injected noise that also guarantees the convergence of privacy loss.
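The idea of injecting noise whose magnitude decays over iterations can be sketched as follows. This is an illustrative sketch, not the paper's actual scheme: the polynomial decay `sigma0 / (t + 1)**alpha` and all parameter names here are assumptions made for the example.

```python
import numpy as np

def noisy_sgd(grad, x0, steps=200, lr=0.1, sigma0=1.0, alpha=0.6, seed=0):
    """Gradient descent with injected Gaussian noise of decaying magnitude.

    The schedule sigma_t = sigma0 / (t + 1)**alpha is hypothetical,
    chosen only to illustrate a faster-decaying noise magnitude.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for t in range(steps):
        sigma_t = sigma0 / (t + 1) ** alpha  # decaying noise magnitude
        noise = sigma_t * rng.standard_normal(x.shape)
        x = x - lr * (grad(x) + noise)
    return x
```

For example, on the quadratic objective `f(x) = ||x||^2 / 2` (gradient `x`), the iterates contract toward the minimizer while the shrinking noise keeps the late iterates from wandering.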
no code implementations • 18 Oct 2019 • Matteo Sordello, Niccolò Dalmasso, Hangfeng He, Weijie Su
This paper proposes SplitSGD, a new dynamic learning rate schedule for stochastic optimization.
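A simplified sketch of the splitting idea behind a SplitSGD-style schedule: run single-thread SGD for a while, then split into two short threads from the same iterate and compare the signs of their averaged stochastic gradients; frequent sign disagreement suggests the iterates are near stationarity, so the learning rate is decayed. All thresholds, lengths, and the "continue from the last thread" simplification below are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def split_sgd_sketch(sgrad, x0, rounds=4, t1=50, l=20,
                     lr=0.5, gamma=0.5, q=0.4, seed=0):
    """Illustrative SplitSGD-style loop (hypothetical parameters).

    sgrad(x, rng) returns a stochastic gradient at x.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(rounds):
        # Single-thread phase: plain SGD with the current learning rate.
        for _ in range(t1):
            x = x - lr * sgrad(x, rng)
        # Diagnostic phase: two short threads started from the same iterate.
        avgs = []
        for _ in range(2):
            y, g_sum = x.copy(), np.zeros_like(x)
            for _ in range(l):
                g = sgrad(y, rng)
                g_sum += g
                y = y - lr * g
            avgs.append(g_sum / l)
        # Fraction of coordinates where the thread-averaged gradients
        # disagree in sign; high disagreement means noise dominates signal.
        disagree = np.mean(np.sign(avgs[0]) != np.sign(avgs[1]))
        if disagree > q:
            lr *= gamma  # decay the learning rate near stationarity
        x = y  # continue from the last thread (a simplification)
    return x
```

On a noisy quadratic, e.g. `sgrad = lambda x, rng: x + 0.1 * rng.standard_normal(x.shape)`, the diagnostic fires once the iterates reach the noise floor, and the decayed learning rate tightens the oscillation around the minimizer.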