Search Results for author: Marcelo Bertalmío

Found 6 papers, 1 paper with code

Using Decoupled Features for Photo-realistic Style Transfer

1 code implementation • 5 Dec 2022 • Trevor D. Canham, Adrián Martín, Marcelo Bertalmío, Javier Portilla

In this work we propose a photorealistic style transfer method for image and video that is based on vision science principles and on a recent mathematical formulation for the deterministic decoupling of sample statistics.

Style Transfer

Matching visual induction effects on screens of different size

no code implementations • 6 May 2020 • Trevor D. Canham, Javier Vazquez-Corral, Elise Mathieu, Marcelo Bertalmío

In the film industry, the same movie is expected to be watched on displays of vastly different sizes, from cinema screens to mobile phones.

Synthesizing Visual Illusions Using Generative Adversarial Networks

no code implementations • 21 Nov 2019 • Alexander Gomez-Villa, Adrian Martín, Javier Vazquez-Corral, Jesús Malo, Marcelo Bertalmío

Visual illusions are a very useful tool for vision scientists, because they allow them to better probe the limits, thresholds and errors of the visual system.

Generative Adversarial Network

Cortical-inspired Wilson-Cowan-type equations for orientation-dependent contrast perception modelling

no code implementations • 15 Oct 2019 • Marcelo Bertalmío, Luca Calatroni, Valentina Franceschi, Benedetta Franceschiello, Dario Prandi

We consider the evolution model proposed in [9, 6] to describe illusory contrast perception phenomena induced by surrounding orientations.

A cortical-inspired model for orientation-dependent contrast perception: a link with Wilson-Cowan equations

no code implementations • 18 Dec 2018 • Marcelo Bertalmío, Luca Calatroni, Valentina Franceschi, Benedetta Franceschiello, Dario Prandi

We consider a differential model describing neuro-physiological contrast perception phenomena induced by surrounding orientations.
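
For reference, Wilson-Cowan-type models of this kind describe the time evolution of a cortical activation a(x, θ, t) defined over spatial position x and local orientation θ. A generic single-population form, written here with illustrative symbols rather than the exact notation of [9, 6], is

\[
\partial_t a(x,\theta,t) = -\alpha\, a(x,\theta,t) + \nu \int \omega(x,\theta \,\|\, x',\theta')\, \sigma\big(a(x',\theta',t)\big)\, dx'\, d\theta' + h(x,\theta),
\]

where σ is a saturating (sigmoid) nonlinearity, ω is an interaction kernel coupling nearby positions and orientations, h is the feed-forward stimulus input, and α, ν are decay and interaction weights.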

Convolutional Neural Networks Deceived by Visual Illusions

no code implementations • 26 Nov 2018 • Alexander Gomez-Villa, Adrián Martín, Javier Vazquez-Corral, Marcelo Bertalmío

In particular, we show that CNNs trained for image denoising, image deblurring, and computational color constancy are able to replicate the human response to visual illusions, and that the extent of this replication depends on the network architecture and on the spatial size of the stimulus pattern.

Color Constancy • Deblurring • +2
