Search Results for author: Iosif Lytras

Found 3 papers, 0 papers with code

Taming under isoperimetry

no code implementations • 15 Nov 2023 • Iosif Lytras, Sotirios Sabanis

In this article, we propose a novel taming Langevin-based scheme called $\mathbf{sTULA}$ to sample from distributions with a superlinearly growing log-gradient which also satisfy a log-Sobolev inequality.
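For orientation, the sketch below shows a generic tamed unadjusted Langevin step: the drift is divided by a factor that grows with the gradient norm, so a superlinearly growing log-gradient cannot produce an unbounded single update. This is a minimal illustration under that generic taming choice, not the exact $\mathbf{sTULA}$ scheme, taming factor, or step-size conditions analysed in the paper.

```python
import numpy as np

def tamed_langevin_step(x, grad_U, step, rng):
    """One generic tamed unadjusted Langevin step.

    The raw drift grad_U(x) may grow superlinearly in x; dividing by
    (1 + step * ||grad_U(x)||) keeps each increment bounded.
    """
    g = grad_U(x)
    tamed_drift = g / (1.0 + step * np.linalg.norm(g))
    noise = np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    return x - step * tamed_drift + noise

# Toy example: target density proportional to exp(-x^4 / 4),
# whose log-gradient x^3 grows superlinearly.
rng = np.random.default_rng(0)
x = np.zeros(1)
for _ in range(10_000):
    x = tamed_langevin_step(x, lambda y: y**3, step=1e-2, rng=rng)
```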

Kinetic Langevin MCMC Sampling Without Gradient Lipschitz Continuity -- the Strongly Convex Case

no code implementations • 19 Jan 2023 • Tim Johnston, Iosif Lytras, Sotirios Sabanis

In this article, we consider sampling from log-concave distributions in the Hamiltonian setting, without assuming that the objective gradient is globally Lipschitz.
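As context, kinetic (underdamped) Langevin dynamics augments the position with a velocity variable driven by friction and the potential gradient. The sketch below is a plain explicit Euler-type discretization for a strongly convex quadratic potential; it is a generic illustration only, not the specific discretization or assumptions studied in the paper.

```python
import numpy as np

def kinetic_langevin_step(x, v, grad_U, step, friction, rng):
    """One explicit Euler-type step of kinetic (underdamped) Langevin dynamics:

    dX_t = V_t dt,
    dV_t = -(friction * V_t + grad U(X_t)) dt + sqrt(2 * friction) dW_t
    """
    noise = np.sqrt(2.0 * friction * step) * rng.standard_normal(v.shape)
    v_new = v - step * (friction * v + grad_U(x)) + noise
    x_new = x + step * v_new
    return x_new, v_new

# Toy example: strongly convex potential U(x) = ||x||^2 / 2 (standard Gaussian target).
rng = np.random.default_rng(0)
x, v = np.zeros(2), np.zeros(2)
for _ in range(10_000):
    x, v = kinetic_langevin_step(x, v, lambda y: y, step=1e-2, friction=2.0, rng=rng)
```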

Taming neural networks with TUSLA: Non-convex learning via adaptive stochastic gradient Langevin algorithms

no code implementations • 25 Jun 2020 • Attila Lovas, Iosif Lytras, Miklós Rásonyi, Sotirios Sabanis

We offer a new learning algorithm based on an appropriately constructed variant of the popular stochastic gradient Langevin dynamics (SGLD), which is called the tamed unadjusted stochastic Langevin algorithm (TUSLA).
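As a rough illustration, a tamed stochastic gradient Langevin update replaces the raw mini-batch gradient with a tamed one before adding Gaussian noise. The sketch below uses a generic taming factor based on the gradient norm; the precise TUSLA taming factor (which involves the parameter norm) and its step-size conditions are specified in the paper.

```python
import numpy as np

def tamed_sgld_step(theta, stoch_grad, step, beta, rng):
    """One generic tamed stochastic gradient Langevin step.

    stoch_grad: unbiased mini-batch estimate of the loss gradient at theta.
    beta: inverse temperature controlling the injected Gaussian noise.
    """
    g = stoch_grad(theta)
    tamed = g / (1.0 + np.sqrt(step) * np.linalg.norm(g))  # taming bounds each update
    noise = np.sqrt(2.0 * step / beta) * rng.standard_normal(theta.shape)
    return theta - step * tamed + noise

# Toy usage: quadratic loss with noisy gradients as a stand-in for a mini-batch gradient.
rng = np.random.default_rng(0)
theta = np.ones(3)
noisy_grad = lambda t: t + 0.1 * rng.standard_normal(t.shape)
for _ in range(5_000):
    theta = tamed_sgld_step(theta, noisy_grad, step=1e-2, beta=1e4, rng=rng)
```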
