no code implementations • 20 Sep 2021 • Conor Newton, Ayalvadi Ganesh, Henry W. J. Reeve
In fact, our regret guarantee matches the asymptotically optimal rate achievable in the full communication setting.
no code implementations • 2 Sep 2021 • Henry W. J. Reeve, Timothy I. Cannings, Richard J. Samworth
We formulate the problem as one of constrained optimisation: we seek a low-complexity, data-dependent selection set on which, with a guaranteed probability, the regression function is uniformly at least as large as the threshold. Subject to this constraint, we would like the selection set to contain as much mass under the marginal feature distribution as possible.
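As a toy illustration of the idea (not the paper's algorithm), the sketch below estimates a regression function with a simple k-NN rule and selects the region where the estimate clears the threshold by a conservative margin; the names (`knn_estimate`) and parameters (`k`, `tau`, `margin`) are hypothetical choices for this example.

```python
import numpy as np

# Toy sketch: select a data-dependent region on which an estimated
# regression function exceeds a threshold tau, hedging estimation
# error with a conservative margin. Subject to that constraint, the
# mass of the selected region is what one would like to maximise.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=500)         # 1-d features
eta = X                                  # true regression function E[Y|X] = x
Y = rng.binomial(1, eta)                 # binary responses

def knn_estimate(x, X, Y, k=25):
    """k-NN estimate of the regression function at point x."""
    idx = np.argsort(np.abs(X - x))[:k]
    return Y[idx].mean()

tau, margin = 0.7, 0.15                  # threshold and hedging margin
grid = np.linspace(0, 1, 101)
est = np.array([knn_estimate(x, X, Y) for x in grid])
selected = grid[est - margin >= tau]     # conservative selection set

# Up to boundary effects, the selected region sits inside the true
# super-level set {x : eta(x) >= tau}.
print(selected.min(), selected.max())
```

A larger `margin` makes the guarantee more conservative at the cost of selecting a smaller region; the paper's construction calibrates this trade-off with an explicit probability guarantee.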
no code implementations • 8 Jun 2021 • Henry W. J. Reeve, Timothy I. Cannings, Richard J. Samworth
In transfer learning, we wish to make inference about a target population when we have access to data both from the target distribution itself and from a different but related source distribution.
no code implementations • 2 Jun 2021 • Henry W. J. Reeve, Ata Kaban
For each of these tasks, our strategy is to develop a tight upper bound on the compressibility function, and by doing so we discover distributional conditions of a geometric nature under which the compressive algorithm attains minimax-optimal rates up to at most poly-logarithmic factors.
no code implementations • 11 Jun 2019 • Henry W. J. Reeve, Ata Kaban
We consider classification in the presence of class-dependent asymmetric label noise with unknown noise probabilities.
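A standard fact in this setting helps fix ideas: if labels flip with class-dependent rates `rho0` and `rho1`, the noisy regression function is an affine transform of the clean one, so the clean Bayes classifier can be recovered by thresholding the noisy regression function at a corrected level. The sketch below verifies this algebra on simulated data; note the paper treats the noise probabilities as unknown, whereas here the true values are plugged in purely for illustration.

```python
import numpy as np

# With flip rates rho0 = P(flip | Y=0) and rho1 = P(flip | Y=1), the noisy
# regression function is eta_tilde = (1 - rho0 - rho1) * eta + rho0, so
# {eta >= 1/2} coincides with {eta_tilde >= (1 + rho0 - rho1) / 2}.
rng = np.random.default_rng(1)
rho0, rho1 = 0.2, 0.4

X = rng.uniform(-1, 1, size=10_000)
eta = 1 / (1 + np.exp(-4 * X))               # clean regression function
Y = rng.binomial(1, eta)                     # clean labels
flip = rng.uniform(size=Y.shape) < np.where(Y == 1, rho1, rho0)
Y_noisy = np.where(flip, 1 - Y, Y)           # observed noisy labels

eta_tilde = (1 - rho0 - rho1) * eta + rho0   # noisy regression function
threshold = (1 + rho0 - rho1) / 2            # corrected decision threshold

clean_pred = (eta >= 0.5).astype(int)
corrected_pred = (eta_tilde >= threshold).astype(int)
assert np.array_equal(clean_pred, corrected_pred)
```

The difficulty addressed in the paper is that `rho0` and `rho1` (and hence the corrected threshold) are unknown and must be inferred from the noisy data alone.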
no code implementations • 14 Feb 2019 • Henry W. J. Reeve, Ata Kaban
In order to obtain finite sample rates, previous approaches to classification with unknown class-conditional label noise have required that the regression function be close to its extrema on sets of large measure.
no code implementations • 23 Nov 2015 • Henry W. J. Reeve, Gavin Brown
We introduce the concept of a Modular Autoencoder (MAE), capable of learning a set of diverse but complementary representations from unlabelled data that can later be used for supervised tasks.
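A minimal sketch of the idea (an illustration under stated assumptions, not the paper's implementation): several small tied-weight linear autoencoder modules are trained jointly on unlabelled data, with a negative-correlation-style term that rewards each module's reconstruction for deviating from the ensemble average, pushing the learned representations to be complementary rather than redundant. All hyperparameters (`M`, `k`, `lam`, `lr`) are arbitrary choices for this example.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(256, 8))                # unlabelled data
M, k, lam, lr = 2, 2, 0.1, 0.01              # modules, code size, diversity weight, step

# One tied-weight linear autoencoder per module: x -> x W (encode) -> x W W^T (decode).
W = [rng.normal(scale=0.1, size=(X.shape[1], k)) for _ in range(M)]

def total_error(W):
    return sum(np.linalg.norm(X @ w @ w.T - X) ** 2 for w in W)

err0 = total_error(W)
for _ in range(500):
    recons = [X @ w @ w.T for w in W]
    mean_recon = sum(recons) / M             # ensemble-average reconstruction
    for i in range(M):
        # Reconstruction error minus a diversity bonus (deviation from the
        # ensemble mean, treated as constant when differentiating).
        E = (recons[i] - X) - lam * (recons[i] - mean_recon)
        grad = 2 * (X.T @ E @ W[i] + E.T @ X @ W[i]) / len(X)
        W[i] -= lr * grad

assert total_error(W) < err0                 # modules learn to reconstruct
```

Setting `lam = 0` recovers independently trained autoencoders; increasing it trades individual reconstruction accuracy for diversity across the modules' representations.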