1 code implementation • 3 Nov 2020 • Juan Maroñas, Oliver Hamelijnck, Jeremias Knoblauch, Theodoros Damoulas
Gaussian Processes (GPs) can be used as flexible, non-parametric function priors.
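As a generic illustration of using a GP as a non-parametric function prior (not code from the paper; the RBF kernel and jitter value are illustrative choices), one can draw random functions by sampling from the multivariate Gaussian induced at a grid of inputs:

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two sets of 1-D inputs."""
    sqdist = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / lengthscale ** 2)

def sample_gp_prior(x, n_samples=3, seed=0):
    """Draw function values from a zero-mean GP prior evaluated at points x."""
    rng = np.random.default_rng(seed)
    K = rbf_kernel(x, x) + 1e-8 * np.eye(len(x))  # jitter for numerical stability
    L = np.linalg.cholesky(K)
    return L @ rng.standard_normal((len(x), n_samples))

x = np.linspace(0, 5, 50)
samples = sample_gp_prior(x)
print(samples.shape)  # (50, 3)
```

Each column is one smooth random function; the kernel choice encodes the prior assumptions about smoothness and amplitude.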
1 code implementation • 22 Mar 2020 • Juan Maroñas, Daniel Ramos, Roberto Paredes
Data Augmentation (DA) strategies have been proposed to regularize these models, with Mixup being one of the most popular due to its ability to improve accuracy, uncertainty quantification, and calibration of DNNs.
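Mixup itself is simple to state: each training batch is replaced by a convex combination of the batch with a shuffled copy of itself, using a coefficient drawn from a Beta distribution. A minimal NumPy sketch (the `alpha=0.2` default and the toy batch are illustrative, not taken from the paper):

```python
import numpy as np

def mixup(x, y, alpha=0.2, rng=None):
    """Mixup augmentation: convex combination of a batch with a shuffled copy.

    x: inputs of shape (batch, ...); y: one-hot labels of shape (batch, classes).
    """
    rng = rng or np.random.default_rng(0)
    lam = rng.beta(alpha, alpha)      # mixing coefficient ~ Beta(alpha, alpha)
    idx = rng.permutation(len(x))     # pair each example with a random partner
    x_mix = lam * x + (1 - lam) * x[idx]
    y_mix = lam * y + (1 - lam) * y[idx]
    return x_mix, y_mix

x = np.ones((4, 8))   # toy input batch
y = np.eye(4)         # toy one-hot labels
x_mix, y_mix = mixup(x, y)
print(x_mix.shape)    # (4, 8); mixed label rows still sum to 1
```

Because labels are mixed with the same coefficient as inputs, the targets remain valid probability vectors, which is what links Mixup to calibration.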
2 code implementations • 3 Sep 2023 • Zhidi Lin, Juan Maroñas, Ying Li, Feng Yin, Sergios Theodoridis
The Gaussian process state-space model (GPSSM) has attracted extensive attention for modeling complex nonlinear dynamical systems.
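The generative model behind a GPSSM places a GP prior on the transition function of a latent state-space model. A toy forward simulation (a generic sketch, not the paper's method; the random-Fourier-feature approximation and noise variances are illustrative assumptions) looks like:

```python
import numpy as np

def sample_transition_rff(dim=1, n_features=100, lengthscale=1.0, seed=0):
    """Approximate one draw f ~ GP(0, RBF) via random Fourier features,
    yielding a fixed nonlinear transition function for the rollout."""
    rng = np.random.default_rng(seed)
    omega = rng.standard_normal((n_features, dim)) / lengthscale
    phase = rng.uniform(0, 2 * np.pi, n_features)
    w = rng.standard_normal(n_features) * np.sqrt(2.0 / n_features)
    return lambda x: w @ np.cos(omega @ x + phase)

def gpssm_rollout(f, x0, steps=50, q=0.01, r=0.1, seed=1):
    """Simulate a GPSSM: x_{t+1} = f(x_t) + process noise, y_t = x_t + obs noise."""
    rng = np.random.default_rng(seed)
    xs, ys = [x0], []
    for _ in range(steps):
        ys.append(xs[-1] + np.sqrt(r) * rng.standard_normal())
        xs.append(f(np.atleast_1d(xs[-1])) + np.sqrt(q) * rng.standard_normal())
    return np.array(xs), np.array(ys)

f = sample_transition_rff()
xs, ys = gpssm_rollout(f, x0=0.5)
print(xs.shape, ys.shape)  # (51,) (50,)
```

Inference in a GPSSM then amounts to recovering the latent trajectory `xs` and the transition `f` from the noisy observations `ys`.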
1 code implementation • 21 Jan 2023 • Zhidi Lin, Feng Yin, Juan Maroñas
The Gaussian process state-space model (GPSSM) has garnered considerable attention over the past decade.
1 code implementation • 23 Aug 2019 • Juan Maroñas, Roberto Paredes, Daniel Ramos
Deep Neural Networks (DNNs) have achieved state-of-the-art accuracy in many tasks.
no code implementations • 21 Mar 2019 • Juan Maroñas, Roberto Paredes, Daniel Ramos
The goal of this paper is to deal with a data scarcity scenario where deep learning techniques tend to fail.
no code implementations • 18 Sep 2019 • Daniel Ramos, Juan Maroñas, Alicia Lozano-Diez
This paper explores several strategies for Forensic Voice Comparison (FVC), aimed at improving the performance of the LRs when using generative Gaussian score-to-LR models.
no code implementations • 27 Sep 2018 • Juan Maroñas, Roberto Paredes, Daniel Ramos
We apply Bayesian Neural Networks to improve calibration of state-of-the-art deep neural networks.
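The mechanism by which a Bayesian treatment can improve calibration is Monte Carlo averaging of class probabilities over posterior weight draws. A minimal sketch with a toy linear model and fabricated posterior samples (none of this is the paper's architecture):

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def bayesian_predictive(x, weight_samples):
    """Monte Carlo predictive: average class probabilities over posterior draws.

    Averaging probabilities (not logits) spreads predictive mass across classes,
    which is what can improve calibration over a single point-estimate network.
    """
    probs = np.stack([softmax(x @ W) for W in weight_samples])  # (S, N, C)
    return probs.mean(axis=0)

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 4))                                    # toy inputs
weight_samples = [rng.standard_normal((4, 3)) for _ in range(20)]  # fake posterior draws
p = bayesian_predictive(x, weight_samples)
print(p.shape)  # (5, 3); each row is a valid probability vector
```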
no code implementations • 30 May 2022 • Juan Maroñas, Daniel Hernández-Lobato
ETGPs exploit the recently proposed Transformed Gaussian Process (TGP), a stochastic process specified by transforming a Gaussian Process using an invertible transformation.
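The TGP construction can be sketched generically: draw latent functions from a GP prior and push them through an invertible map, giving a non-Gaussian process (the choice of `np.exp`, which yields a strictly positive log-Gaussian process, is an illustrative assumption, not the transformation used in the paper):

```python
import numpy as np

def rbf_kernel(x, lengthscale=1.0):
    sqdist = (x[:, None] - x[None, :]) ** 2
    return np.exp(-0.5 * sqdist / lengthscale ** 2)

def sample_tgp(x, transform=np.exp, n_samples=3, seed=0):
    """Push GP prior samples through an invertible transformation."""
    rng = np.random.default_rng(seed)
    K = rbf_kernel(x) + 1e-8 * np.eye(len(x))
    L = np.linalg.cholesky(K)
    f = L @ rng.standard_normal((len(x), n_samples))  # latent GP draws
    return transform(f)                               # warped, non-Gaussian draws

x = np.linspace(0, 2, 30)
g = sample_tgp(x)
print(g.shape)  # (30, 3); with np.exp every sampled value is positive
```

Because the transformation is invertible, the density of the warped process is available in closed form via the change-of-variables formula, which is what makes TGPs tractable.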
no code implementations • 31 Jul 2022 • Sergio A. Balanya, Juan Maroñas, Daniel Ramos
We show that when there is plenty of data, complex models like neural networks yield better performance, but they are prone to failure when the amount of data is limited, a common situation in certain post-hoc calibration applications such as medical diagnosis.
no code implementations • 27 Oct 2023 • Francisco Javier Sáez-Maldonado, Juan Maroñas, Daniel Hernández-Lobato
In this work, we propose a generalization of TGPs named Deep Transformed Gaussian Processes (DTGPs), which follows the trend of concatenating layers of stochastic processes.
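The idea of concatenating layers of stochastic processes can be sketched by feeding each layer's warped GP draw in as the inputs of the next layer's GP (a loose illustration of the layering principle only, not the DTGP model itself; the `arcsinh` warp, layer count, and jitter are assumptions):

```python
import numpy as np

def rbf_kernel(x, lengthscale=1.0):
    sqdist = (x[:, None] - x[None, :]) ** 2
    return np.exp(-0.5 * sqdist / lengthscale ** 2)

def sample_gp_at(x, rng):
    """One draw of f(x) for f ~ GP(0, RBF), evaluated at the 1-D inputs x."""
    K = rbf_kernel(x) + 1e-6 * np.eye(len(x))  # jitter for numerical stability
    return np.linalg.cholesky(K) @ rng.standard_normal(len(x))

def sample_deep_tgp(x, n_layers=3, transform=np.arcsinh, seed=0):
    """Compose layers: each layer applies an invertible warp (arcsinh here)
    to a fresh GP draw evaluated at the previous layer's output."""
    rng = np.random.default_rng(seed)
    h = x
    for _ in range(n_layers):
        h = transform(sample_gp_at(h, rng))
    return h

x = np.linspace(-1, 1, 40)
out = sample_deep_tgp(x)
print(out.shape)  # (40,)
```

Stacking layers in this way produces richer, more non-Gaussian prior distributions over functions than a single transformed layer.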