Search Results for author: Tomás González

Found 2 papers, 0 papers with code

Mirror Descent Algorithms with Nearly Dimension-Independent Rates for Differentially-Private Stochastic Saddle-Point Problems

no code implementations · 5 Mar 2024 · Tomás González, Cristóbal Guzmán, Courtney Paquette

For convex-concave and first-order-smooth stochastic objectives, our algorithms attain a rate of $\sqrt{\log(d)/n} + (\log(d)^{3/2}/[n\varepsilon])^{1/3}$, where $d$ is the dimension of the problem and $n$ the dataset size.
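The headline of this rate is its polylogarithmic dependence on the dimension $d$. A minimal sketch evaluating the stated expression numerically (illustrative only; the function name is ours, and the paper's bound hides constants and lower-order log factors):

```python
import math

def saddle_point_rate(d: int, n: int, eps: float) -> float:
    """Evaluate sqrt(log(d)/n) + (log(d)**1.5 / (n*eps))**(1/3),
    the rate stated in the abstract, with all constants set to 1."""
    ld = math.log(d)
    return math.sqrt(ld / n) + (ld ** 1.5 / (n * eps)) ** (1 / 3)

# Dimension enters only through log(d): squaring d (e.g. 10^6 -> 10^12)
# inflates the rate by at most a sqrt(2) factor.
print(saddle_point_rate(10**6, 10**5, 1.0))
print(saddle_point_rate(10**12, 10**5, 1.0))
```

Because both terms scale with powers of $\log(d)$ rather than of $d$, the bound degrades only mildly as the dimension grows, which is what "nearly dimension-independent" refers to.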

LEMMA

Faster Rates of Convergence to Stationary Points in Differentially Private Optimization

no code implementations · 2 Jun 2022 · Raman Arora, Raef Bassily, Tomás González, Cristóbal Guzmán, Michael Menart, Enayat Ullah

We provide a new efficient algorithm that finds an $\tilde{O}\big(\big[\frac{\sqrt{d}}{n\varepsilon}\big]^{2/3}\big)$-stationary point in the finite-sum setting, where $n$ is the number of samples.
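In contrast to the rate above, this bound has a genuine polynomial dependence on the dimension, $d^{1/3}$. A minimal sketch evaluating the stated expression (illustrative only; the function name is ours, and the $\tilde{O}$ hides constants and log factors):

```python
import math

def stationary_point_rate(d: int, n: int, eps: float) -> float:
    """Evaluate (sqrt(d) / (n*eps))**(2/3), the finite-sum
    stationarity bound stated in the abstract, with constants set to 1."""
    return (math.sqrt(d) / (n * eps)) ** (2 / 3)

# The rate scales as d**(1/3): multiplying d by 8 doubles the bound.
print(stationary_point_rate(100, 10**4, 1.0))
print(stationary_point_rate(800, 10**4, 1.0))
```

The exponent arithmetic: $(\sqrt{d})^{2/3} = d^{1/3}$, so growing $d$ by a factor of 8 grows the bound by exactly $8^{1/3} = 2$.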

Stochastic Optimization
