no code implementations • 16 Feb 2024 • David Buterez, Jon Paul Janet, Dino Oglic, Pietro Lio
Graph neural networks (GNNs) and variations of the message passing algorithm are the predominant means for learning on graphs, largely due to their flexibility, speed, and satisfactory performance.
1 code implementation • 25 Jan 2024 • Talip Ucar, Aubin Ramon, Dino Oglic, Rebecca Croasdale-Wood, Tom Diethe, Pietro Sormanni
We investigate the potential of patent data for improving antibody humanness prediction using a multi-stage, multi-loss training process.
1 code implementation • 9 Nov 2022 • David Buterez, Jon Paul Janet, Steven J. Kiddle, Dino Oglic, Pietro Liò
We argue that in some problems, such as binding affinity prediction, where molecules are typically presented in a canonical form, it might be possible to relax the constraints on permutation invariance of the hypothesis space and learn a more effective model of the affinity by employing an adaptive readout function.
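The trade-off described above can be illustrated with a minimal sketch (not the paper's architecture; both readouts below are hypothetical stand-ins): a sum readout is invariant to node order, while a readout that flattens the node embeddings in their given (canonical) order is not, which is acceptable when inputs always arrive canonically ordered.

```python
import numpy as np

def sum_readout(H):
    # H: (n_nodes, d) node embeddings; summing rows is invariant
    # to any permutation of the nodes.
    return H.sum(axis=0)

def adaptive_readout(H, W1, W2):
    # Flattening fixes a node ordering, so this readout is NOT
    # permutation invariant -- a deliberate relaxation that is safe
    # when molecules are presented in a canonical atom order.
    h = np.tanh(H.reshape(-1) @ W1)  # simple one-hidden-layer map
    return h @ W2

rng = np.random.default_rng(0)
H = rng.standard_normal((4, 3))          # 4 nodes, 3-dim embeddings
W1 = rng.standard_normal((12, 8))        # hypothetical readout weights
W2 = rng.standard_normal((8, 5))
H_perm = H[::-1]                         # the same graph, nodes reordered
```

Swapping node order leaves `sum_readout` unchanged but changes `adaptive_readout`, which is exactly the extra expressiveness the relaxation buys.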
no code implementations • 16 Oct 2021 • Dino Oglic, Zoran Cvetkovic, Peter Sollich, Steve Renals, Bin Yu
We study the problem of learning robust acoustic models in adverse environments, characterized by a significant mismatch between training and test conditions.
1 code implementation • 23 Jun 2019 • Dino Oglic, Zoran Cvetkovic, Peter Sollich
We investigate the potential of stochastic neural networks for learning effective waveform-based acoustic models.
no code implementations • 6 Sep 2018 • Dino Oglic, Thomas Gärtner
We provide the first mathematically complete derivation of the Nyström method for low-rank approximation of indefinite kernels and propose an efficient method for finding an approximate eigendecomposition of such kernel matrices.
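For context, the standard Nyström construction (a textbook sketch, not the paper's indefinite-kernel derivation) approximates a kernel matrix from a subset of landmark columns; the pseudo-inverse keeps the formula well-defined even when the landmark block is rank-deficient:

```python
import numpy as np

def nystrom_approximation(K, landmark_idx):
    # C: kernel evaluations between all points and the landmarks
    C = K[:, landmark_idx]
    # W: kernel matrix restricted to the landmarks
    W = K[np.ix_(landmark_idx, landmark_idx)]
    # K ~= C W^+ C^T; pinv handles rank deficiency in W
    return C @ np.linalg.pinv(W) @ C.T

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
K = X @ X.T                 # a rank-5 linear kernel, for illustration
idx = rng.choice(100, size=20, replace=False)
K_hat = nystrom_approximation(K, idx)
```

Because this toy kernel has rank 5 and the 20 random landmarks almost surely span its column space, the approximation here is exact up to floating-point error; for full-rank kernels it is only an approximation.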
no code implementations • ICML 2018 • Dino Oglic, Thomas Gärtner
We formulate a novel regularized risk minimization problem for learning in reproducing kernel Kreĭn spaces and show that the strong representer theorem applies to it.
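For readers unfamiliar with representer theorems, the general form they assert (a standard statement, not quoted from the paper) is that a minimizer of a regularized empirical risk over the function space admits a finite expansion in kernel sections at the training points:

```latex
f^{*} = \sum_{i=1}^{n} \alpha_i \, k(x_i, \cdot), \qquad \alpha_i \in \mathbb{R},
```

so the infinite-dimensional optimization reduces to finding the coefficients \(\alpha_1, \dots, \alpha_n\); the "strong" version asserts that every minimizer has this form. The paper's contribution is establishing this in Kreĭn spaces, where the inner product is indefinite.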
no code implementations • 24 Jun 2018 • Zhu Li, Jean-Francois Ton, Dino Oglic, Dino Sejdinovic
We study the standard random Fourier features method, for which we improve the existing bounds on the number of features required to guarantee the minimax risk convergence rate of kernel ridge regression, as well as a data-dependent modification that samples features proportionally to ridge leverage scores and further reduces the required number of features.
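The baseline method analyzed above is the classic random Fourier features construction; a minimal sketch for the RBF kernel (the leverage-score sampling variant is not shown) draws frequencies from the kernel's spectral density so that inner products of the feature maps approximate kernel evaluations:

```python
import numpy as np

def random_fourier_features(X, n_features, gamma=1.0, seed=0):
    """Approximate the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies drawn from the Gaussian spectral density of the kernel
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

rng = np.random.default_rng(1)
X = rng.standard_normal((50, 3))
Z = random_fourier_features(X, n_features=2000, gamma=0.5)
K_approx = Z @ Z.T                       # low-rank kernel estimate
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-0.5 * sq)              # exact RBF kernel for comparison
```

With 2000 features the entrywise error is small; the bounds discussed in the paper quantify how many such features suffice for kernel ridge regression to retain its minimax rate.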
no code implementations • ICML 2017 • Dino Oglic, Thomas Gärtner
We investigate, theoretically and empirically, the effectiveness of kernel K-means++ samples as landmarks in the Nyström method for low-rank approximation of kernel matrices.
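Kernel K-means++ landmark selection amounts to D²-sampling in the kernel-induced feature space, where squared distances are computed entirely from kernel evaluations; a sketch under that reading (function name and details are illustrative, not the paper's code):

```python
import numpy as np

def kernel_kmeans_pp_landmarks(K, m, seed=0):
    """D^2-sample m landmark indices in the kernel-induced feature space.

    ||phi(x) - phi(c)||^2 = K(x, x) + K(c, c) - 2 K(x, c), so no
    explicit feature vectors are needed.
    """
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    diag = np.diag(K)
    landmarks = [int(rng.integers(n))]
    # Squared distance of every point to its nearest chosen landmark
    d2 = diag + K[landmarks[0], landmarks[0]] - 2 * K[:, landmarks[0]]
    for _ in range(m - 1):
        d2 = np.maximum(d2, 0.0)          # guard against float negatives
        nxt = int(rng.choice(n, p=d2 / d2.sum()))
        landmarks.append(nxt)
        d2 = np.minimum(d2, diag + K[nxt, nxt] - 2 * K[:, nxt])
    return np.array(landmarks)

rng = np.random.default_rng(2)
X = rng.standard_normal((60, 4))
K = X @ X.T
idx = kernel_kmeans_pp_landmarks(K, 10)
```

The returned indices can be passed directly to a Nyström approximation; because already-chosen points have zero distance to themselves, landmarks are not resampled.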
no code implementations • NeurIPS 2016 • Dino Oglic, Thomas Gärtner
We present an effective method for supervised feature construction.