Search Results for author: Roberto Paredes

Found 7 papers, 3 papers with code

DaSeGAN: Domain Adaptation for Segmentation Tasks via Generative Adversarial Networks

no code implementations · 29 Sep 2021 · Mario Parreño Lara, Roberto Paredes, Alberto Albiol

Domain adaptation techniques aim to fill this gap by generating mappings between image domains when unlabeled data from the new target domain is available.

Domain Generalization · Image Segmentation · +3

On Calibration of Mixup Training for Deep Neural Networks

1 code implementation · 22 Mar 2020 · Juan Maroñas, Daniel Ramos, Roberto Paredes

Data Augmentation (DA) strategies have been proposed to regularize these models, with Mixup being one of the most popular due to its ability to improve accuracy, uncertainty quantification, and the calibration of DNNs.

Data Augmentation · Uncertainty Quantification
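For context, the Mixup augmentation discussed above trains on convex combinations of pairs of examples and their labels. A minimal sketch (function name, one-hot label mixing, and the Beta parameter `alpha=0.2` are illustrative assumptions, not details from the paper):

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=0.2, rng=None):
    """Mix two examples (and their one-hot labels) with a Beta-sampled weight."""
    rng = np.random.default_rng() if rng is None else rng
    lam = rng.beta(alpha, alpha)          # mixing coefficient in [0, 1]
    x = lam * x1 + (1.0 - lam) * x2       # convex combination of inputs
    y = lam * y1 + (1.0 - lam) * y2       # same combination of labels
    return x, y, lam
```

In practice the pairs are drawn by shuffling each mini-batch against itself, so the augmentation adds essentially no compute.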

Calibration of Deep Probabilistic Models with Decoupled Bayesian Neural Networks

1 code implementation · 23 Aug 2019 · Juan Maroñas, Roberto Paredes, Daniel Ramos

Deep Neural Networks (DNNs) have achieved state-of-the-art accuracy performance in many tasks.

Image Classification

Generative Models For Deep Learning with Very Scarce Data

no code implementations · 21 Mar 2019 · Juan Maroñas, Roberto Paredes, Daniel Ramos

The goal of this paper is to address a data-scarcity scenario in which deep learning techniques tend to fail.

General Classification

Offline Deep models calibration with bayesian neural networks

no code implementations · 27 Sep 2018 · Juan Maroñas, Roberto Paredes, Daniel Ramos

We apply Bayesian Neural Networks to improve calibration of state-of-the-art deep neural networks.
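As context for the calibration papers above: a common non-Bayesian post-hoc baseline that Bayesian calibration methods are typically compared against is temperature scaling, which rescales a trained network's logits by a single scalar fitted on held-out data. The sketch below is this baseline only, not the paper's Bayesian method; the grid-search fitting and function names are illustrative assumptions:

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # stabilise before exponentiating
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def fit_temperature(logits, labels, grid=np.linspace(0.5, 5.0, 91)):
    """Pick the temperature T minimising held-out NLL of softmax(logits / T)."""
    best_T, best_nll = 1.0, np.inf
    for T in grid:
        p = softmax(logits / T)
        nll = -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))
        if nll < best_nll:
            best_T, best_nll = T, nll
    return best_T
```

An overconfident model (large-magnitude logits with occasional wrong predictions) yields a fitted T greater than 1, which softens its predicted probabilities.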

Passive-Aggressive online learning with nonlinear embeddings

1 code implementation · Pattern Recognition 2018 · Javier Jorge, Roberto Paredes

Nowadays, there is an increasing demand for machine learning techniques that can handle problems where instances arrive as a stream or in real time.

Binary Classification
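For reference, the classical Passive-Aggressive update that this line of work builds on keeps the weight vector unchanged when an example is classified with sufficient margin (passive) and otherwise takes the smallest step that satisfies the margin (aggressive). A minimal sketch of the PA-I variant (the function name and the aggressiveness parameter `C` are illustrative; the paper's contribution, nonlinear embeddings, is not shown):

```python
import numpy as np

def pa_update(w, x, y, C=1.0):
    """One Passive-Aggressive (PA-I) update for binary labels y in {-1, +1}."""
    loss = max(0.0, 1.0 - y * np.dot(w, x))   # hinge loss of current w
    if loss > 0.0:
        tau = min(C, loss / np.dot(x, x))     # PA-I caps the step size at C
        w = w + tau * y * x                   # smallest correction meeting the margin
    return w
```

After the update the example is classified with margin at least 1 (when `C` does not bind), so presenting it again produces no further change.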

LAYERS: Yet another Neural Network toolkit

no code implementations · 5 Oct 2016 · Roberto Paredes, José-Miguel Benedí

Layers is an open-source neural network toolkit aimed at providing an easy way to implement modern neural networks.
