1 code implementation • 5 Jun 2023 • Nicolo Colombo
We address the problem of making Conformal Prediction (CP) intervals locally adaptive.
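A standard way to make split-conformal intervals locally adaptive is to normalize residuals by an estimated difficulty score, so interval width tracks local noise. The sketch below illustrates that generic recipe on toy heteroscedastic data (it is background, not the method of this paper; the predictor `mu` and scale `sigma` are assumed given and here use the true generating functions for simplicity):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy heteroscedastic data: noise grows with |x|.
x = rng.uniform(-2, 2, size=(500, 1))
y = x[:, 0] + (0.1 + 0.5 * np.abs(x[:, 0])) * rng.normal(size=500)

# Assumed point predictor and difficulty estimate; for the sketch we
# use the true conditional mean and noise scale.
mu = lambda x: x[:, 0]
sigma = lambda x: 0.1 + 0.5 * np.abs(x[:, 0])

# Split conformal: calibrate normalized residuals on held-out data.
x_cal, y_cal = x[:250], y[:250]
scores = np.abs(y_cal - mu(x_cal)) / sigma(x_cal)
alpha = 0.1
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Locally adaptive interval: width scales with estimated difficulty.
x_test, y_test = x[250:], y[250:]
lo = mu(x_test) - q * sigma(x_test)
hi = mu(x_test) + q * sigma(x_test)
coverage = np.mean((y_test >= lo) & (y_test <= hi))
```

Empirical coverage should land near the nominal 90%, with narrow intervals where `sigma` is small and wide ones where it is large.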
no code implementations • 7 Jul 2021 • Nicolo Colombo, Yang Gao
We propose a new gradient-based approach for extracting sub-architectures from a given large model.
1 code implementation • 7 May 2021 • Yang Gao, Nicolo Colombo, Wei Wang
Adapting pre-trained neural models to downstream tasks has become the standard practice for obtaining high-quality models.
no code implementations • 11 Sep 2020 • Nicolo Colombo, Yang Gao
To find the optimal weight-agnostic network, we use a novel and computationally efficient method that translates the hard architecture-search problem into a feasible optimization problem. More specifically, we view the optimal task-specific architectures as optimal configurations of binary networks with {0, 1}-valued weights, which can be found through an approximate gradient descent strategy.
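The core idea, relaxing {0, 1}-valued architecture variables so they admit gradients, can be sketched on a toy problem: weights are frozen, and only a sigmoid-relaxed binary mask is trained, then thresholded. This is a generic illustration of the relaxation, not the paper's exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)

# Frozen ("weight-agnostic") weights and a hidden binary mask: only
# half of the candidate connections are actually useful.
d = 10
w = rng.uniform(0.5, 1.5, size=d) * rng.choice([-1.0, 1.0], size=d)
mask_true = (np.arange(d) % 2 == 0).astype(float)

X = rng.normal(size=(200, d))
y = X @ (w * mask_true)

# Relax the {0,1} variables to sigmoid(theta) and run plain gradient
# descent on the squared loss; the weights w are never updated.
theta = np.zeros(d)
lr = 0.5
for _ in range(500):
    m = 1 / (1 + np.exp(-theta))          # soft mask in (0, 1)
    r = X @ (w * m) - y                   # residuals
    grad = 2 * ((X * w).T @ r) / len(y) * m * (1 - m)
    theta -= lr * grad

# Round the relaxed mask back to a {0,1}-valued architecture.
mask_hat = (1 / (1 + np.exp(-theta)) > 0.5).astype(float)
```

On this separable toy problem the thresholded mask recovers the useful connections; in practice the discrete rounding step is where the approximation lives.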
no code implementations • 14 May 2020 • Nicolo Colombo, Vladimir Vovk
Efficiency criteria for conformal prediction, such as \emph{observed fuzziness} (i.e., the sum of p-values associated with false labels), are commonly used to \emph{evaluate} the performance of given conformal predictors.
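Observed fuzziness is straightforward to compute once the conformal p-values are available. A minimal sketch with made-up p-values (toy numbers, not from the paper):

```python
import numpy as np

# Conformal p-values for 4 test objects over 3 candidate labels
# (rows = objects, columns = labels; illustrative values).
p = np.array([
    [0.90, 0.05, 0.10],
    [0.20, 0.70, 0.02],
    [0.04, 0.08, 0.60],
    [0.50, 0.30, 0.01],
])
true_label = np.array([0, 1, 2, 0])

# Observed fuzziness: sum the p-values of the *false* labels for each
# object, then average; smaller means a more efficient predictor.
false = np.ones_like(p, dtype=bool)
false[np.arange(len(p)), true_label] = False
observed_fuzziness = p[false].reshape(len(p), -1).sum(axis=1).mean()
```

Here the per-object sums are 0.15, 0.22, 0.12 and 0.31, so the criterion evaluates to 0.20.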
no code implementations • 13 Feb 2020 • Nicolo Colombo
The main challenge is that metric constraints (especially positive-definiteness and sub-additivity) are not automatically respected if, for example, the coefficients of the linear combination are allowed to be negative.
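The issue is easy to exhibit numerically: a nonnegative combination of metrics is again a metric, but a negative coefficient can break the triangle inequality (and positive-definiteness). A small check with two simple base metrics on the real line (illustrative, not the paper's setup):

```python
# Two base metrics on the real line.
d1 = lambda a, b: abs(a - b)          # Euclidean metric
d2 = lambda a, b: float(a != b)       # discrete metric

def combo(w1, w2):
    """Linear combination w1*d1 + w2*d2 (a metric only if w1, w2 >= 0)."""
    return lambda a, b: w1 * d1(a, b) + w2 * d2(a, b)

def violates_triangle(d, pts):
    # Search for x, y, z with d(x, z) > d(x, y) + d(y, z).
    return any(d(x, z) > d(x, y) + d(y, z) + 1e-12
               for x in pts for y in pts for z in pts)

pts = [0.0, 0.1, 5.0]
ok = not violates_triangle(combo(1.0, 1.0), pts)    # nonnegative weights: fine
bad = violates_triangle(combo(1.0, -0.9), pts)      # negative weight: broken
```

With weights (1.0, -0.9), the triple (0, 0.1, 5) violates sub-additivity, and d(0, 0.1) is even negative, so positive-definiteness fails too.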
no code implementations • 20 Aug 2019 • Nicolo Colombo, Ricardo Silva, Soong M Kang, Arthur Gretton
The inference problem is how information concerning perturbations, with particular covariates such as location and time, can be generalized to predict the effect of novel perturbations.
2 code implementations • NeurIPS 2018 • Yin Cheng Ng, Nicolo Colombo, Ricardo Silva
We propose a data-efficient Gaussian process-based Bayesian approach to the semi-supervised learning problem on graphs.
no code implementations • NeurIPS 2016 • Nicolo Colombo, Nikos Vlassis
Joint matrix triangularization is often used for estimating the joint eigenstructure of a set M of matrices, with applications in signal processing and machine learning.
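In the exactly commuting case the joint triangularizer can be obtained in closed form, which makes a useful sanity check for the general problem: if M1 has distinct eigenvalues and M2 commutes with it, the Q factor of the QR decomposition of M1's eigenvector matrix triangularizes both. A sketch of that classical fact (an illustration of the setting, not the algorithms studied in the paper):

```python
import numpy as np

rng = np.random.default_rng(2)

# Build two exactly commuting matrices: M2 is a polynomial in M1.
P = rng.normal(size=(4, 4)) + 4 * np.eye(4)      # well-conditioned basis
M1 = P @ np.diag([1.0, 2.0, 3.0, 4.0]) @ np.linalg.inv(P)
M2 = M1 @ M1 - 3 * M1                            # commutes with M1

# Eigenvectors of M1 diagonalize both matrices; the orthogonal Q from
# their QR decomposition therefore *triangularizes* both:
# Q^H M1 Q = R D R^{-1} is upper triangular, and likewise for M2.
_, V = np.linalg.eig(M1)
Q, _ = np.linalg.qr(V)

T1 = Q.conj().T @ M1 @ Q
T2 = Q.conj().T @ M2 @ Q
residual = max(np.abs(np.tril(T1, -1)).max(), np.abs(np.tril(T2, -1)).max())
```

For noisy, only approximately commuting inputs no exact joint triangularizer exists, which is what motivates optimization-based approaches and the error bounds analyzed in this line of work.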
no code implementations • 2 Jul 2016 • Nicolo Colombo, Nikos Vlassis
The a priori bounds are theoretical inequalities that involve functions of the ground-truth matrices and noise matrices, whereas the a posteriori bounds are given in terms of observable quantities that can be computed from the input matrices.