1 code implementation • 16 Nov 2023 • Joseph Early, Gavin K. C. Cheung, Kurt Cutajar, Hanting Xie, Jas Kandola, Niall Twomey
Conventional Time Series Classification (TSC) methods are often black boxes that obscure inherent interpretation of their decision-making processes.
no code implementations • 24 Aug 2023 • Philipp Renz, Kurt Cutajar, Niall Twomey, Gavin K. C. Cheung, Hanting Xie
Low-count time series describe sparse or intermittent events, which are prevalent in large-scale online platforms that capture and monitor diverse data types.
1 code implementation • 18 Mar 2019 • Kurt Cutajar, Mark Pullin, Andreas Damianou, Neil Lawrence, Javier González
Multi-fidelity methods are prominently used when cheaply-obtained, but possibly biased and noisy, observations must be effectively combined with limited or expensive true data in order to construct reliable models.
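The canonical baseline for this setting is the linear multi-fidelity model, in which the high-fidelity function is treated as a scaled low-fidelity prediction plus a GP correction. The sketch below illustrates that baseline idea with scikit-learn rather than the deep GP extension the paper develops; the toy functions, data sizes, and kernel settings are purely illustrative.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Toy fidelities: the cheap simulator is a biased, noisy version of the truth.
def f_high(x):                 # expensive "true" function (few samples available)
    return np.sin(8 * x) * x
def f_low(x):                  # cheap but biased approximation (many samples)
    return 0.7 * f_high(x) + 0.3 * x - 0.1

rng = np.random.default_rng(0)
x_low = rng.uniform(0, 1, (60, 1))           # plentiful cheap observations
x_high = rng.uniform(0, 1, (8, 1))           # scarce expensive observations
y_low = f_low(x_low).ravel() + 0.05 * rng.standard_normal(60)
y_high = f_high(x_high).ravel()

# Step 1: fit a GP to the abundant low-fidelity data.
gp_low = GaussianProcessRegressor(RBF(0.2), alpha=1e-3).fit(x_low, y_low)

# Step 2: model high-fidelity data as rho * low-fidelity prediction + GP correction.
mu_low_at_high = gp_low.predict(x_high)
rho = np.linalg.lstsq(mu_low_at_high[:, None], y_high, rcond=None)[0][0]
gp_delta = GaussianProcessRegressor(RBF(0.2), alpha=1e-4).fit(
    x_high, y_high - rho * mu_low_at_high)

# Multi-fidelity prediction at new points combines both models.
x_test = np.linspace(0, 1, 5)[:, None]
y_pred = rho * gp_low.predict(x_test) + gp_delta.predict(x_test)
print(np.c_[f_high(x_test).ravel(), y_pred])
```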
1 code implementation • 24 Apr 2017 • Jack Fitzsimons, Diego Granziol, Kurt Cutajar, Michael Osborne, Maurizio Filippone, Stephen Roberts
The scalable calculation of matrix determinants has been a bottleneck to the widespread application of many machine learning methods such as determinantal point processes, Gaussian processes, generalised Markov random fields, graph models and many others.
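A common way to sidestep the cubic cost of an exact log-determinant is stochastic trace estimation, since log det(K) = tr(log K) can be probed with random vectors using matrix-vector products only. The sketch below combines Hutchinson probes with a truncated Taylor expansion of the matrix logarithm; it is a rough illustrative estimator under these assumptions, not the specific estimator proposed in the paper.

```python
import numpy as np

def logdet_estimate(K, n_probes=64, n_terms=200, seed=0):
    """Rough estimate of log det(K) for an SPD matrix K using only matvecs:
    Hutchinson trace estimation of tr(log K) with a truncated Taylor series."""
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    alpha = np.linalg.norm(K, 1)       # induced 1-norm upper-bounds the largest eigenvalue
    A = np.eye(n) - K / alpha          # eigenvalues of A lie in [0, 1) for SPD K
    total = 0.0
    for _ in range(n_probes):
        v = rng.choice([-1.0, 1.0], size=n)   # Rademacher probe vector
        w = v.copy()
        acc = 0.0
        for m in range(1, n_terms + 1):
            w = A @ w                         # w = A^m v via repeated matvecs
            acc -= v @ w / m                  # Taylor series: log(I - A) = -sum_m A^m / m
        total += acc
    return n * np.log(alpha) + total / n_probes

# Rough sanity check against the exact value on a small RBF-like kernel matrix.
rng = np.random.default_rng(1)
X = 2.0 * rng.standard_normal((200, 3))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-0.5 * sq) + 0.5 * np.eye(200)
print(logdet_estimate(K), np.linalg.slogdet(K)[1])
```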
no code implementations • 5 Apr 2017 • Jack Fitzsimons, Kurt Cutajar, Michael Osborne, Stephen Roberts, Maurizio Filippone
The log-determinant of a kernel matrix appears in a variety of machine learning problems, ranging from determinantal point processes and generalized Markov random fields, through to the training of Gaussian processes.
no code implementations • 18 Oct 2016 • Karl Krauth, Edwin V. Bonilla, Kurt Cutajar, Maurizio Filippone
We investigate the capabilities and limitations of Gaussian process models by jointly exploring three complementary directions: (i) scalable and statistically efficient inference; (ii) flexible kernels; and (iii) objective functions for hyperparameter learning alternative to the marginal likelihood.
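On point (iii), one simple way to see why an objective other than the marginal likelihood can be useful is to score kernel hyperparameters by held-out predictive log-likelihood instead. The sketch below does exactly that with scikit-learn on toy data; it is only an illustration of the general idea, not the objective or inference scheme used in the paper.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (120, 1))
y = np.sin(2 * X).ravel() + 0.2 * rng.standard_normal(120)
X_tr, y_tr, X_va, y_va = X[:80], y[:80], X[80:], y[80:]

# Compare the two criteria across a few fixed lengthscales (no optimisation).
for ls in [0.1, 0.3, 1.0, 3.0]:
    kernel = RBF(ls) + WhiteKernel(0.04)
    gp = GaussianProcessRegressor(kernel, optimizer=None).fit(X_tr, y_tr)
    lml = gp.log_marginal_likelihood()           # standard training objective
    mu, sd = gp.predict(X_va, return_std=True)   # held-out predictive density
    hold_out = norm.logpdf(y_va, loc=mu, scale=sd).mean()
    print(f"lengthscale={ls:4.1f}  marginal LL={lml:8.1f}  held-out LL={hold_out:6.2f}")
```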
1 code implementation • ICML 2017 • Kurt Cutajar, Edwin V. Bonilla, Pietro Michiardi, Maurizio Filippone
The composition of multiple Gaussian Processes as a Deep Gaussian Process (DGP) enables a deep probabilistic nonparametric approach to flexibly tackle complex machine learning problems with sound quantification of uncertainty.
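The core construction in this line of work is to approximate each GP layer with a random feature expansion followed by a linear readout with Gaussian weights, so that stacking layers yields an approximate DGP. The sketch below draws one prior sample from such a two-layer model; the dimensions, lengthscales, and unit-variance priors are illustrative, and the variational treatment of the weights is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def rff(X, lengthscale, n_features, rng):
    """Random Fourier features approximating an RBF kernel with the given lengthscale."""
    Omega = rng.standard_normal((X.shape[1], n_features)) / lengthscale  # spectral frequencies
    b = rng.uniform(0.0, 2.0 * np.pi, n_features)                        # random phases
    return np.sqrt(2.0 / n_features) * np.cos(X @ Omega + b)

def dgp_prior_sample(X, layer_dims=(3, 1), lengthscales=(1.0, 1.0),
                     n_features=500, rng=rng):
    """Draw one function sample from a DGP prior in which every GP layer is
    replaced by random features followed by a Gaussian-weight linear readout."""
    H = X
    for d_out, ls in zip(layer_dims, lengthscales):
        Phi = rff(H, ls, n_features, rng)             # feature expansion of this layer
        W = rng.standard_normal((n_features, d_out))  # weights sampled from a N(0, 1) prior
        H = Phi @ W                                   # approximately GP-distributed output
    return H

X = np.linspace(-3, 3, 200)[:, None]
f = dgp_prior_sample(X)      # one approximate two-layer DGP prior draw
print(f.shape)               # (200, 1)
```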
1 code implementation • 22 Feb 2016 • Kurt Cutajar, Michael A. Osborne, John P. Cunningham, Maurizio Filippone
Solving linear systems involving large kernel matrices, as required for Gaussian process inference, quickly becomes a computational bottleneck; preconditioning is a common approach to alleviating this issue.
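As a rough illustration of the mechanics (not of the preconditioners studied in the paper), the sketch below solves a noisy RBF kernel system with SciPy's conjugate gradient, using a Nyström-style low-rank preconditioner applied through the Woodbury identity; the kernel, lengthscale, noise level, and rank are arbitrary choices.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (500, 2))
y = np.sin(4 * X[:, 0]) + 0.1 * rng.standard_normal(500)

# Noisy RBF kernel matrix: the linear system at the heart of GP inference.
sigma2 = 0.1
K = np.exp(-0.5 * cdist(X, X, "sqeuclidean") / 0.3 ** 2) + sigma2 * np.eye(500)

# Nystrom-style low-rank preconditioner built from a random subset of m columns.
m = 50
idx = rng.choice(500, m, replace=False)
Kmm, Knm = K[np.ix_(idx, idx)], K[:, idx]
L = np.linalg.cholesky(Kmm + 1e-8 * np.eye(m))
A = np.linalg.solve(L, Knm.T)        # m x n, so A.T @ A approximates K (Nystrom)
inner = np.linalg.inv(sigma2 * np.eye(m) + A @ A.T)

def precond(v):
    # (A.T @ A + sigma2 * I)^{-1} v via the Woodbury identity.
    return v / sigma2 - A.T @ (inner @ (A @ v)) / sigma2

M = LinearOperator((500, 500), matvec=precond)
x_pc, _ = cg(K, y, M=M, maxiter=50)      # preconditioned CG
x_plain, _ = cg(K, y, maxiter=50)        # plain CG, same budget
print(np.linalg.norm(K @ x_pc - y), np.linalg.norm(K @ x_plain - y))
```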