Search Results for author: Felipe Tobar

Found 25 papers, 7 papers with code

Asynchronous Graph Generators

no code implementations · 29 Sep 2023 · Christopher P. Ley, Felipe Tobar

We introduce the asynchronous graph generator (AGG), a novel graph neural network architecture for multi-channel time series which models observations as nodes on a dynamic graph and can thus perform data imputation by transductive node generation.

Data Augmentation Imputation +1

Greedy online change point detection

no code implementations · 14 Aug 2023 · Jou-Hui Ho, Felipe Tobar

Standard online change point detection (CPD) methods tend to have large false discovery rates as their detections are sensitive to outliers.

Change Point Detection Time Series

Gaussian process deconvolution

1 code implementation · 8 May 2023 · Felipe Tobar, Arnaud Robert, Jorge F. Silva

Let us consider the deconvolution problem, that is, to recover a latent source $x(\cdot)$ from the observations $\mathbf{y} = [y_1,\ldots, y_N]$ of a convolution process $y = x\star h + \eta$, where $\eta$ is an additive noise, the observations in $\mathbf{y}$ might have missing parts with respect to $y$, and the filter $h$ could be unknown.
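Because convolution is a linear operator, a GP prior on the source $x(\cdot)$ together with Gaussian noise turns the recovery into a linear-Gaussian inference problem. The sketch below is not the paper's code, just a discretised illustration under simplifying assumptions (known filter, squared-exponential prior) that recovers the source via the standard conditional-Gaussian formula.

```python
import numpy as np

def se_kernel(t1, t2, ell=0.5):
    # Squared-exponential prior covariance for the latent source x
    d = t1[:, None] - t2[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 200)
dt = t[1] - t[0]

# Known, smooth FIR filter h (the unknown-filter case treated in the paper is harder)
h = np.exp(-0.5 * (np.arange(-10, 11) * dt / 0.3) ** 2)
h /= h.sum()

# Convolution as a matrix: column i of H is h convolved with the i-th unit vector
H = np.column_stack([np.convolve(e, h, mode="same") for e in np.eye(len(t))])

# Simulate y = H x + noise with x drawn from the GP prior
Kx = se_kernel(t, t)
x_true = rng.multivariate_normal(np.zeros(len(t)), Kx + 1e-8 * np.eye(len(t)))
sigma_n = 0.05
y = H @ x_true + sigma_n * rng.standard_normal(len(t))

# Linear-Gaussian posterior mean of x given the convolved observations y
S = H @ Kx @ H.T + sigma_n**2 * np.eye(len(t))
x_post = Kx @ H.T @ np.linalg.solve(S, y)
```

Missing observations are handled naturally in this formulation by dropping the corresponding rows of `H` and `y`.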

Computationally-efficient initialisation of GPs: The generalised variogram method

1 code implementation · 11 Oct 2022 · Felipe Tobar, Elsa Cazelles, Taco de Wolff

We present a computationally-efficient strategy to initialise the hyperparameters of a Gaussian process (GP) avoiding the computation of the likelihood function.

Nonstationary multi-output Gaussian processes via harmonizable spectral mixtures

no code implementations · 18 Feb 2022 · Matías Altamirano, Felipe Tobar

Kernel design for Multi-output Gaussian Processes (MOGP) has received increased attention recently.

Gaussian Processes

Studying the Interplay between Information Loss and Operation Loss in Representations for Classification

no code implementations · 30 Dec 2021 · Jorge F. Silva, Felipe Tobar, Mario Vicuña, Felipe Cordova

From this, our main result shows that a specific form of vanishing information loss (a weak notion of asymptotic informational sufficiency) implies a vanishing MPE loss (or asymptotic operational sufficiency) when considering a general family of lossy continuous representations.

Quantization

Bayesian autoregressive spectral estimation

no code implementations · 5 Oct 2021 · Alejandro Cuevas, Sebastián López, Danilo Mandic, Felipe Tobar

Autoregressive (AR) time series models are widely used in parametric spectral estimation (SE), where the power spectral density (PSD) of the time series is approximated by that of the \emph{best-fit} AR model, which is available in closed form.

Time Series Time Series Analysis
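The closed-form PSD referred to above is $S(f) = \sigma^2 / |1 - \sum_{k=1}^{p} a_k e^{-2\pi i f k}|^2$. A minimal, non-Bayesian illustration follows: it fits an AR(2) model by least squares and evaluates that formula. The paper's contribution, placing priors on the AR parameters, is deliberately omitted from this sketch.

```python
import numpy as np

def ar_psd(a, sigma2, freqs):
    # Closed-form PSD of an AR(p) model: sigma2 / |1 - sum_k a_k e^{-2 pi i f k}|^2
    k = np.arange(1, len(a) + 1)
    return sigma2 / np.abs(1 - np.exp(-2j * np.pi * np.outer(freqs, k)) @ a) ** 2

# Synthetic AR(2) process with a spectral peak near f0 = 0.1 cycles/sample
rng = np.random.default_rng(1)
n, f0, r = 500, 0.1, 0.95
x = np.zeros(n)
for t in range(2, n):
    x[t] = (2 * r * np.cos(2 * np.pi * f0) * x[t - 1]
            - r**2 * x[t - 2]
            + rng.standard_normal())

# Least-squares fit of the AR(2) coefficients and innovation variance
X = np.column_stack([x[1:-1], x[:-2]])
a, *_ = np.linalg.lstsq(X, x[2:], rcond=None)
sigma2 = np.mean((x[2:] - X @ a) ** 2)

freqs = np.linspace(0, 0.5, 256)
psd = ar_psd(a, sigma2, freqs)   # peaks near f0
```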

Late reverberation suppression using U-nets

1 code implementation · 5 Oct 2021 · Diego León, Felipe Tobar

In real-world settings, speech signals are almost always affected by reverberation produced by the working environment; these corrupted signals need to be \emph{dereverberated} prior to performing, e.g., speech recognition, speech-to-text conversion, compression, or general audio enhancement.

Speech Dereverberation speech-recognition +1

A novel notion of barycenter for probability distributions based on optimal weak mass transport

no code implementations · NeurIPS 2021 · Elsa Cazelles, Felipe Tobar, Joaquín Fontbona

We introduce weak barycenters of a family of probability distributions, based on the recently developed notion of optimal weak transport of mass by Gozlan et al. (2017) and Backhoff-Veraguas et al. (2020).

Bayesian Reconstruction of Fourier Pairs

1 code implementation · 9 Nov 2020 · Felipe Tobar, Lerko Araya-Hernández, Pablo Huijse, Petar M. Djurić

Our aim is to address the lack of a principled treatment of data acquired indistinctly in the temporal and frequency domains in a way that is robust to missing or noisy observations, and that at the same time models uncertainty effectively.

Astronomy Audio Compression

Gaussian process imputation of multiple financial series

no code implementations · 11 Feb 2020 · Taco de Wolff, Alejandro Cuevas, Felipe Tobar

In Financial Signal Processing, multiple time series such as financial indicators, stock prices and exchange rates are strongly coupled through the latent state of the market, and should therefore be analysed jointly.

Imputation Time Series +1

MOGPTK: The Multi-Output Gaussian Process Toolkit

1 code implementation · 9 Feb 2020 · Taco de Wolff, Alejandro Cuevas, Felipe Tobar

We present MOGPTK, a Python package for multi-channel data modelling using Gaussian processes (GPs).

Gaussian Processes Imputation

The Wasserstein-Fourier Distance for Stationary Time Series

1 code implementation · 11 Dec 2019 · Elsa Cazelles, Arnaud Robert, Felipe Tobar

The WF distance operates by calculating the Wasserstein distance between the (normalised) power spectral densities (NPSD) of time series.

Data Augmentation Dimensionality Reduction +3
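Since NPSDs are probability densities over frequency, two time series can be compared via optimal transport between their spectra. The sketch below illustrates that pipeline; note that SciPy's `wasserstein_distance` computes the 1-Wasserstein distance, whereas the paper works with the 2-Wasserstein, so this is an illustrative variant rather than the paper's exact metric.

```python
import numpy as np
from scipy.stats import wasserstein_distance

def npsd(x, fs=1.0):
    # Normalised power spectral density via the periodogram
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1 / fs)
    return freqs, power / power.sum()

def wf_distance(x1, x2):
    # Transport distance between the two spectra (W1 here; the paper uses W2)
    f1, p1 = npsd(x1)
    f2, p2 = npsd(x2)
    return wasserstein_distance(f1, f2, u_weights=p1, v_weights=p2)

rng = np.random.default_rng(0)
t = np.arange(1024)
low = np.sin(2 * np.pi * 0.05 * t) + 0.1 * rng.standard_normal(1024)
mid = np.sin(2 * np.pi * 0.20 * t) + 0.1 * rng.standard_normal(1024)
high = np.sin(2 * np.pi * 0.40 * t) + 0.1 * rng.standard_normal(1024)
```

Tones that are close in frequency come out close under the distance: `wf_distance(low, mid)` is smaller than `wf_distance(low, high)`.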

Band-Limited Gaussian Processes: The Sinc Kernel

no code implementations · NeurIPS 2019 · Felipe Tobar

We propose a novel class of Gaussian processes (GPs) whose spectra have compact support, meaning that their sample trajectories are almost-surely band limited.

Gaussian Processes
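A rectangular spectrum on $[-\Delta/2, \Delta/2]$ corresponds in the time domain to the sinc covariance $k(\tau) = \sigma^2\,\mathrm{sinc}(\Delta\tau)$. A minimal sketch (NumPy's `np.sinc(x)` is $\sin(\pi x)/(\pi x)$) that builds the kernel and draws a band-limited sample path:

```python
import numpy as np

def sinc_kernel(t1, t2, bandwidth=1.0, sigma=1.0):
    # Covariance with rectangular spectral density on [-bandwidth/2, bandwidth/2]
    tau = t1[:, None] - t2[None, :]
    return sigma**2 * np.sinc(bandwidth * tau)  # np.sinc(x) = sin(pi x) / (pi x)

t = np.linspace(0, 20, 400)
K = sinc_kernel(t, t, bandwidth=2.0)

rng = np.random.default_rng(0)
sample = rng.multivariate_normal(np.zeros(len(t)), K + 1e-8 * np.eye(len(t)))

# Empirically, nearly all of the sample's energy sits below the band edge
# bandwidth/2 = 1.0 (the remainder is finite-window spectral leakage)
freqs = np.fft.rfftfreq(len(t), d=t[1] - t[0])
power = np.abs(np.fft.rfft(sample)) ** 2
in_band = power[freqs <= 1.1].sum() / power.sum()
```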

Compositionally-Warped Gaussian Processes

no code implementations · 23 Jun 2019 · Gonzalo Rios, Felipe Tobar

The Gaussian process (GP) is a nonparametric prior distribution over functions indexed by time, space, or some other high-dimensional index set.

Computational Efficiency Gaussian Processes

Low-pass filtering as Bayesian inference

no code implementations · 9 Feb 2019 · Cristobal Valenzuela, Felipe Tobar

We propose a Bayesian nonparametric method for low-pass filtering that can naturally handle unevenly-sampled and noise-corrupted observations.

Bayesian Inference Gaussian Processes +2

Bayesian Nonparametric Spectral Estimation

1 code implementation · NeurIPS 2018 · Felipe Tobar

Spectral estimation (SE) aims to identify how the energy of a signal (e.g., a time series) is distributed across different frequencies.

Time Series Time Series Analysis
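One way to see why a Bayesian treatment of SE is natural: the Fourier transform is linear, so if a signal has a Gaussian-process posterior, its spectrum does too. The discretised caricature below is not the paper's model; it simply fits a GP to unevenly-sampled data and pushes the posterior through a discrete Fourier matrix, with all kernel choices being illustrative.

```python
import numpy as np

def se_kernel(t1, t2, ell=0.5):
    d = t1[:, None] - t2[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(0)
t_obs = np.sort(rng.uniform(0, 20, 60))              # unevenly-sampled times
y_obs = np.sin(2 * np.pi * 0.25 * t_obs) + 0.1 * rng.standard_normal(60)

# Standard GP regression onto a uniform grid
t_grid = np.linspace(0, 20, 200)
noise = 0.1
Koo = se_kernel(t_obs, t_obs) + noise**2 * np.eye(60)
Kgo = se_kernel(t_grid, t_obs)
m = Kgo @ np.linalg.solve(Koo, y_obs)                 # posterior mean on grid
K = se_kernel(t_grid, t_grid) - Kgo @ np.linalg.solve(Koo, Kgo.T)

# The DFT is linear, so the spectrum inherits a Gaussian posterior:
# mean F m and covariance F K F^H (diagonal computed here)
freqs = np.fft.rfftfreq(len(t_grid), d=t_grid[1] - t_grid[0])
F = np.exp(-2j * np.pi * np.outer(freqs, t_grid)) * (t_grid[1] - t_grid[0])
spec_mean = F @ m
spec_var = np.real(np.einsum("ij,jk,ik->i", F, K, F.conj()))
```

The magnitude of `spec_mean` peaks at the tone's frequency, and `spec_var` quantifies how the sampling pattern and noise translate into spectral uncertainty.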

Bayesian Learning with Wasserstein Barycenters

no code implementations · 28 May 2018 · Julio Backhoff-Veraguas, Joaquin Fontbona, Gonzalo Rios, Felipe Tobar

We introduce and study a novel model-selection strategy for Bayesian learning, based on optimal transport, along with its associated predictive posterior law: the Wasserstein population barycenter of the posterior law over models.

Model Selection

Learning non-Gaussian Time Series using the Box-Cox Gaussian Process

no code implementations · 19 Mar 2018 · Gonzalo Rios, Felipe Tobar

Gaussian processes (GPs) are Bayesian nonparametric generative models that provide interpretability of hyperparameters, admit closed-form expressions for training and inference, and are able to accurately represent uncertainty.

Gaussian Processes Time Series +1
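The basic recipe — warp positive, skewed data into a space where a Gaussian model is plausible, fit the GP there, and map predictions back — can be sketched with SciPy's Box-Cox pair. This is a simplified stand-in (fixed hyperparameters, posterior mean only), not the paper's joint learning of warping and kernel parameters.

```python
import numpy as np
from scipy.stats import boxcox
from scipy.special import inv_boxcox

def se_kernel(t1, t2, ell=1.0):
    d = t1[:, None] - t2[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 100)
y = np.exp(np.sin(t) + 0.1 * rng.standard_normal(100))  # positive, skewed data

z, lam = boxcox(y)            # warped targets and fitted Box-Cox lambda

# GP smoothing in the warped space (noise variance 0.05 chosen for illustration)
Kff = se_kernel(t, t)
m = Kff @ np.linalg.solve(Kff + 0.05 * np.eye(100), z - z.mean()) + z.mean()

y_pred = inv_boxcox(m, lam)   # back to the original, strictly positive scale
```

Mapping the posterior back through the inverse warp guarantees predictions respect the positivity of the data, which a plain GP would not.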

Spectral Mixture Kernels for Multi-Output Gaussian Processes

no code implementations · NeurIPS 2017 · Gabriel Parra, Felipe Tobar

Early approaches to multiple-output Gaussian processes (MOGPs) relied on linear combinations of independent, latent, single-output Gaussian processes (GPs).

Gaussian Processes
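For context, the single-output spectral mixture kernel of Wilson & Adams (2013), which this paper extends to the multi-output setting via cross-spectral terms, models the spectrum as a Gaussian mixture; each spectral component becomes an exponentially-damped cosine in the time domain. A single-output sketch:

```python
import numpy as np

def spectral_mixture_kernel(t1, t2, weights, means, variances):
    # k(tau) = sum_q w_q exp(-2 pi^2 tau^2 v_q) cos(2 pi mu_q tau):
    # a Gaussian mixture over frequencies, seen from the time domain
    tau = t1[:, None] - t2[None, :]
    K = np.zeros_like(tau)
    for w, mu, v in zip(weights, means, variances):
        K += w * np.exp(-2 * np.pi**2 * tau**2 * v) * np.cos(2 * np.pi * mu * tau)
    return K

t = np.linspace(0, 10, 200)
K = spectral_mixture_kernel(t, t,
                            weights=[1.0, 0.5],
                            means=[0.5, 1.5],      # component centre frequencies
                            variances=[0.01, 0.02])
```

The resulting Gram matrix is symmetric and positive semi-definite, with a diagonal equal to the sum of the component weights.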

Recovering Latent Signals from a Mixture of Measurements using a Gaussian Process Prior

no code implementations · 19 Jul 2017 · Felipe Tobar, Gonzalo Rios, Tomás Valdivia, Pablo Guerrero

The proposed model is validated in the recovery of three signals: a smooth synthetic signal, a real-world heart-rate time series and a step function, where GPMM outperformed the standard GP in terms of estimation error, uncertainty representation and recovery of the spectral content of the latent signal.

Bayesian Inference Time Series +1

Improving Sparsity in Kernel Adaptive Filters Using a Unit-Norm Dictionary

no code implementations · 13 Jul 2017 · Felipe Tobar

Kernel adaptive filters, a class of adaptive nonlinear time-series models, are known by their ability to learn expressive autoregressive patterns from sequential data.

Time Series Time Series Analysis
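As a concrete member of this model class, kernel least-mean-squares (KLMS) predicts the next sample from a window of past ones, stores each input as a new dictionary centre, and updates its weights by stochastic gradient. The sketch below omits the dictionary sparsification that is the paper's focus, and its hyperparameters are illustrative.

```python
import numpy as np

def gauss_k(x1, x2, ell=0.5):
    return np.exp(-np.sum((x1 - x2) ** 2) / (2 * ell**2))

def klms(series, order=4, eta=0.2, ell=0.5):
    # Kernel least-mean-squares without sparsification: the dictionary grows by
    # one centre per step, and each new weight is the step size times the error
    centres, weights, preds = [], [], []
    for n in range(order, len(series)):
        u = series[n - order:n]                               # AR input window
        y_hat = sum(w * gauss_k(c, u, ell) for w, c in zip(weights, centres))
        preds.append(y_hat)
        e = series[n] - y_hat                                 # prediction error
        centres.append(u.copy())
        weights.append(eta * e)
    return np.array(preds)

t = np.arange(300)
x = np.sin(2 * np.pi * 0.05 * t)
preds = klms(x)               # one-step-ahead predictions of x[4:]
```

The ever-growing dictionary is exactly why sparsification matters: without it, each prediction costs one kernel evaluation per stored centre.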

Initialising Kernel Adaptive Filters via Probabilistic Inference

no code implementations · 11 Jul 2017 · Iván Castro, Cristóbal Silva, Felipe Tobar

We present a probabilistic framework for both (i) determining the initial settings of kernel adaptive filters (KAFs) and (ii) constructing fully-adaptive KAFs whereby in addition to weights and dictionaries, kernel parameters are learnt sequentially.

Time Series Time Series Analysis

Learning Stationary Time Series using Gaussian Processes with Nonparametric Kernels

no code implementations · NeurIPS 2015 · Felipe Tobar, Thang D. Bui, Richard E. Turner

We introduce the Gaussian Process Convolution Model (GPCM), a two-stage nonparametric generative procedure to model stationary signals as the convolution between a continuous-time white-noise process and a continuous-time linear filter drawn from a Gaussian process.

Denoising Gaussian Processes +3
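The generative story can be caricatured in a few discretised lines: draw white noise, convolve it with a smooth filter, and the result is a stationary signal whose covariance is the filter's autocorrelation. Here a fixed Gaussian window stands in for the GP-distributed filter, which is the part the GPCM actually learns.

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 0.01

# Discretised continuous-time white noise: variance scales as 1/dt
noise = rng.standard_normal(2000) / np.sqrt(dt)

# A smooth window standing in for a draw of the GP-distributed filter h
tau = np.arange(-100, 101) * dt
filt = np.exp(-0.5 * (tau / 0.2) ** 2)

# The generative step: stationary signal = (white noise * h), discretised
signal = np.convolve(noise, filt, mode="same") * dt
```

The signal's covariance is k(τ) = ∫ h(s) h(s + τ) ds, so the filter's smoothness directly controls the signal's correlation length.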
