Search Results for author: Lindon Roberts

Found 10 papers, 4 papers with code

Non-Uniform Smoothness for Gradient Descent

no code implementations • 15 Nov 2023 • Albert S. Berahas, Lindon Roberts, Fred Roosta

The analysis of gradient descent-type methods typically relies on the Lipschitz continuity of the objective gradient.

An adaptively inexact first-order method for bilevel optimization with application to hyperparameter learning

no code implementations • 19 Aug 2023 • Mohammad Sadegh Salehi, Subhadip Mukherjee, Lindon Roberts, Matthias J. Ehrhardt

In this work, we propose an algorithm with a backtracking line search that relies only on inexact function evaluations and hypergradients, and we show convergence to a stationary point.

Bilevel Optimization • Denoising

Analyzing Inexact Hypergradients for Bilevel Learning

no code implementations • 11 Jan 2023 • Matthias J. Ehrhardt, Lindon Roberts

Estimating hyperparameters has been a long-standing problem in machine learning.

Bilevel Optimization

Optimizing illumination patterns for classical ghost imaging

no code implementations • 7 Nov 2022 • Andrew M. Kingston, Lindon Roberts, Alaleh Aminzadeh, Daniele Pelliccia, Imants D. Svalbe, David M. Paganin

Classical ghost imaging is a new paradigm in imaging where the image of an object is not measured directly with a pixelated detector.

Object

A simplified convergence theory for Byzantine resilient stochastic gradient descent

no code implementations • 25 Aug 2022 • Lindon Roberts, Edward Smyth

In distributed learning, a central server trains a model according to updates provided by nodes holding local data samples.
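The setting above is vulnerable to "Byzantine" nodes that send arbitrary updates. As a rough illustration only (not this paper's specific algorithm or analysis), one standard Byzantine-resilient alternative to plain averaging is coordinate-wise median aggregation:

```python
# Illustrative sketch of robust aggregation for distributed learning
# (an assumption for illustration, not the paper's exact method):
# combining per-node updates by coordinate-wise median instead of mean.
from statistics import median

def aggregate_median(updates):
    """Combine per-node gradient updates coordinate-wise by median.

    updates: list of equal-length lists, one vector per node. A minority
    of Byzantine nodes sending arbitrary vectors cannot move the median
    far, whereas a single outlier can drag the mean arbitrarily.
    """
    dim = len(updates[0])
    return [median(u[i] for u in updates) for i in range(dim)]

honest = [[1.0, 2.0], [1.1, 1.9], [0.9, 2.1]]
byzantine = [[100.0, -100.0]]  # one adversarial update
agg = aggregate_median(honest + byzantine)
# agg stays close to the honest updates (roughly [1.05, 1.95]);
# the plain mean of the first coordinate would be about 25.8.
```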

Efficient Hyperparameter Tuning with Dynamic Accuracy Derivative-Free Optimization

1 code implementation • 6 Nov 2020 • Matthias J. Ehrhardt, Lindon Roberts

Here, we apply a recent dynamic-accuracy derivative-free optimization method to hyperparameter tuning; it allows inexact evaluations of the learning problem while retaining convergence guarantees.

Scalable Derivative-Free Optimization for Nonlinear Least-Squares Problems

no code implementations • 26 Jul 2020 • Coralia Cartis, Tyler Ferguson, Lindon Roberts

Derivative-free (or zeroth-order) optimization (DFO) has gained recent attention for its ability to solve problems across a variety of application areas, including machine learning, particularly for objectives that are stochastic and/or expensive to compute.

Dimensionality Reduction

Inexact Derivative-Free Optimization for Bilevel Learning

1 code implementation • 23 Jun 2020 • Matthias J. Ehrhardt, Lindon Roberts

A drawback of these techniques is that they are dependent on a number of parameters which have to be set by the user.

Bilevel Optimization • Denoising

Escaping local minima with derivative-free methods: a numerical investigation

1 code implementation • 29 Dec 2018 • Coralia Cartis, Lindon Roberts, Oliver Sheridan-Methven

We apply a state-of-the-art, local derivative-free solver, Py-BOBYQA, to global optimization problems, and propose an algorithmic improvement that is beneficial in this context.

Optimization and Control
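The core idea studied in this paper, re-running a local derivative-free solver from multiple starting points to escape poor local minima, can be sketched as follows. This is a minimal illustration with a toy compass search standing in for Py-BOBYQA (an assumption; the real solver is model-based and far more efficient), and a made-up multimodal test function:

```python
# Minimal sketch of the multiple-restart idea for global optimization.
# The toy compass search below is an illustrative stand-in for a real
# local derivative-free solver such as Py-BOBYQA.

def compass_search(f, x, step=0.5, tol=1e-6):
    """Simple local derivative-free solver: poll +/- step on each axis."""
    fx = f(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                y = x[:]
                y[i] += d
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            step /= 2  # no axis move helped; refine the poll size
    return x, fx

def restart_solve(f, starts):
    """Run the local solver from several starts; keep the best minimum."""
    best_x, best_f = None, float("inf")
    for x0 in starts:
        x, fx = compass_search(f, list(x0))
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Toy multimodal objective (hypothetical): global minimum f = 0 at (2, 2),
# with a spurious local basin near (-1, -1).
f = lambda x: sum((xi - 2.0) ** 2 for xi in x) * \
              sum((xi + 1.0) ** 2 + 0.1 for xi in x)

starts = [[a, b] for a in (-3.0, -1.0, 1.0, 3.0)
                 for b in (-3.0, -1.0, 1.0, 3.0)]
best_x, best_f = restart_solve(f, starts)
```

A single local run started near (-1, -1) would stall in the spurious basin; the restart wrapper recovers the global minimum from the starts that fall in the correct basin.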

Improving the Flexibility and Robustness of Model-Based Derivative-Free Optimization Solvers

3 code implementations • 31 Mar 2018 • Coralia Cartis, Jan Fiala, Benjamin Marteau, Lindon Roberts

Numerical results show that DFO-LS can make reasonable progress on some medium-scale problems using fewer objective evaluations than are needed for a single gradient evaluation.

Optimization and Control
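Solvers like DFO-LS are model-based: they use residual values alone to build a local linear model and take Gauss-Newton-type steps without derivatives. The stripped-down sketch below conveys that idea only; plain coordinate differences and a full step replace the interpolation sets and trust region of the real solver (both simplifications are assumptions), on a made-up linear least-squares problem:

```python
# Stripped-down sketch of model-based derivative-free least-squares
# (illustrative only; not the DFO-LS algorithm itself).

def residuals(x):
    # Hypothetical toy problem: residuals vanish at the solution (1, 2).
    return [x[0] - 1.0, x[1] - 2.0, (x[0] - 1.0) + (x[1] - 2.0)]

def dfo_gauss_newton(r, x, h=1e-3, iters=5):
    for _ in range(iters):
        r0 = r(x)
        # Approximate each Jacobian column from function values only
        # (a crude stand-in for DFO-LS's interpolation models).
        J = []
        for i in range(len(x)):
            y = x[:]
            y[i] += h
            J.append([(ri - r0i) / h for ri, r0i in zip(r(y), r0)])
        # Solve the 2x2 normal equations J^T J d = -J^T r0 by Cramer's
        # rule (assumes two variables and J^T J invertible).
        a = sum(v * v for v in J[0])
        b = sum(u * v for u, v in zip(J[0], J[1]))
        c = sum(v * v for v in J[1])
        g0 = -sum(u * v for u, v in zip(J[0], r0))
        g1 = -sum(u * v for u, v in zip(J[1], r0))
        det = a * c - b * b
        d = [(c * g0 - b * g1) / det, (a * g1 - b * g0) / det]
        x = [xi + di for xi, di in zip(x, d)]
    return x

x = dfo_gauss_newton(residuals, [0.0, 0.0])  # converges to about (1, 2)
```

Each iteration here costs n + 1 residual evaluations, which is why such methods can progress before a finite-difference gradient of a scalar objective would even be available.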
