This makes the restoration of MPM images challenging.
We then show that the $\alpha$-divergence can be approximated by a generalized notion of effective sample size and leverage this new perspective to adapt the tail parameter with Bayesian optimization.
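As a toy illustration of the effective-sample-size idea (the standard notion, not the paper's generalized $\alpha$-dependent one), the ESS of a set of importance weights can be computed as:

```python
import numpy as np

def effective_sample_size(weights):
    """Standard ESS of importance weights: (sum w)^2 / (sum w^2).

    Equals N for uniform weights and tends to 1 when a single
    weight dominates.
    """
    w = np.asarray(weights, dtype=float)
    return w.sum() ** 2 / (w ** 2).sum()

print(effective_sample_size([0.25, 0.25, 0.25, 0.25]))  # -> 4.0 (uniform)
print(effective_sample_size([1.0, 0.0, 0.0, 0.0]))      # -> 1.0 (degenerate)
```

The ESS thus summarizes weight degeneracy with a single scalar, which is what makes it a natural target for adapting a divergence parameter.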
Ensemble learning leverages multiple models (i.e., weak learners) on a common machine learning task to enhance prediction performance.
Several decades ago, Support Vector Machines (SVMs) were introduced for performing binary classification tasks under a supervised framework.
This work proposes a novel approach to fill this gap by introducing a joint graphical modeling framework that bridges the static graphical Lasso model and a causal-based graphical approach for the linear-Gaussian SSM.
Denoising, detrending, deconvolution: these usual restoration tasks are traditionally addressed separately.
We relax these assumptions and extend current benchmarks, so that the query-set classes of a given task are unknown and only assumed to belong to a much larger set of possible classes.
The performance of the proposed method is evaluated through the DrugBank dataset, and comparisons are provided against state-of-the-art techniques.
Bayesian neural networks (BNNs) have received increasing interest in recent years.
We unfold the Dual Block coordinate Forward-Backward (DBFB) algorithm, embedded in an iterative reweighted scheme, allowing the learning of key parameters in a supervised manner.
The joint problem of reconstruction and feature extraction is a challenging task in image processing.
In this paper, we introduce a variational Bayesian algorithm (VBA) for image blind deconvolution.
Owing to its great success in inference and denoising tasks, Dictionary Learning (DL) and its related sparse optimization formulations have garnered considerable research interest.
This work proposes an unsupervised fusion framework based on deep convolutional transform learning.
This work proposes a supervised multi-channel time-series learning framework for financial stock trading.
This work addresses the problem of analyzing multi-channel time series data.
This work introduces a new unsupervised representation learning technique called Deep Convolutional Transform Learning (DCTL).
This work formulates antiviral repositioning as a matrix completion problem where the antiviral drugs are along the rows and the viruses along the columns.
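A minimal sketch of the matrix-completion formulation, using a plain rank-truncated SVD imputation rather than the paper's actual algorithm; the toy data, function name, and rank are illustrative:

```python
import numpy as np

def complete_low_rank(M, mask, rank, n_iter=500):
    """Matrix-completion sketch: alternate between a rank-truncated
    SVD reconstruction and re-imposing the observed entries.

    M    : matrix with arbitrary values at unobserved positions
    mask : boolean array, True where the entry is observed
    """
    X = np.where(mask, M, 0.0)
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        s[rank:] = 0.0                    # truncate to the target rank
        X_low = (U * s) @ Vt
        X = np.where(mask, M, X_low)      # keep observed entries fixed
    return X

# Toy drug-virus "association score" matrix of rank 1, two hidden entries.
true = np.outer([1.0, 2.0, 3.0], [1.0, 0.5])
mask = np.ones_like(true, dtype=bool)
mask[0, 1] = mask[2, 0] = False
completed = complete_low_rank(true, mask, rank=1)
print(np.max(np.abs(completed - true)) < 1e-3)  # hidden entries recovered
```

The low-rank assumption is what lets the hidden drug-virus scores be inferred from the observed ones.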
The main contribution of this work is a publicly shared, manually curated database comprising existing associations between viruses and their corresponding antivirals.
On account of its many successes in inference tasks and denoising applications, Dictionary Learning (DL) and its related sparse optimization problems have garnered considerable research interest.
Latent factor models have been widely used in collaborative filtering-based recommender systems.
We assume that, even if the raw data is not separable into subspaces, one can learn a representation (transform coefficients) such that the learnt representation is separable into subspaces.
A wide array of machine learning problems are formulated as the minimization of the expectation of a convex loss function on some parameter space.
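A canonical instance of this formulation is stochastic gradient descent on the logistic loss, where each single-sample gradient is an unbiased estimate of the gradient of the expected loss; the synthetic data and step-size schedule below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary classification data with labels in {-1, +1}.
n, d = 500, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.5, -2.0, 0.5])
y = np.where(X @ w_true > 0, 1.0, -1.0)

def logistic_loss(w):
    """Empirical average of the convex logistic loss."""
    return np.mean(np.log1p(np.exp(-y * (X @ w))))

# Plain SGD with a decaying step size: one sampled gradient per step.
w = np.zeros(d)
for t in range(20000):
    i = rng.integers(n)
    margin = y[i] * (X[i] @ w)
    grad = -y[i] * X[i] / (1.0 + np.exp(margin))
    w -= 0.5 / np.sqrt(t + 1) * grad

print(logistic_loss(w) < logistic_loss(np.zeros(d)))  # loss decreased
```

The loop never touches the full dataset gradient, which is the point: the expectation is minimized through cheap unbiased samples of its gradient.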
Variational methods are widely applied to ill-posed inverse problems because they can embed prior knowledge about the solution.
In this paper, we develop a novel second-order method for training feed-forward neural nets.
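For context, a second-order training loop can be sketched on ℓ2-regularized logistic regression (a one-layer stand-in, not the paper's method); the damping parameter and data are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 3
X = rng.normal(size=(n, d))
y = (X @ np.array([1.0, -1.0, 2.0]) > 0).astype(float)  # labels in {0, 1}

lam = 1e-2                                   # ridge term keeps the Hessian PD
w = np.zeros(d)
for _ in range(10):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))       # predicted probabilities
    grad = X.T @ (p - y) / n + lam * w
    # Hessian of the regularized average logistic loss.
    H = (X * (p * (1 - p))[:, None]).T @ X / n + lam * np.eye(d)
    w -= np.linalg.solve(H, grad)            # full Newton step

acc = np.mean((X @ w > 0) == (y > 0.5))
print(acc)
```

A handful of Newton steps suffices here, whereas first-order methods would need many more iterations; scaling such curvature information to deep networks is precisely what makes second-order training nontrivial.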
In this paper, we propose a new optimization algorithm for sparse logistic regression based on a stochastic version of the Douglas-Rachford splitting method.
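For intuition, here is a deterministic Douglas-Rachford sketch on a lasso toy problem, where the ℓ1 prox is a soft threshold and the least-squares prox is a linear solve; the paper's stochastic variant and its logistic data-fit term are not reproduced:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximity operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def douglas_rachford_lasso(A, b, lam, gamma=1.0, n_iter=500):
    """Deterministic Douglas-Rachford splitting for
    min_x 0.5 * ||A x - b||^2 + lam * ||x||_1."""
    n = A.shape[1]
    # Prox of the least-squares term is a linear solve; factor it once.
    M = np.linalg.inv(np.eye(n) + gamma * (A.T @ A))
    Atb = A.T @ b
    y = np.zeros(n)
    for _ in range(n_iter):
        x = soft_threshold(y, gamma * lam)        # prox of lam * ||.||_1
        z = M @ (2 * x - y + gamma * Atb)         # prox of the LS term
        y = y + z - x                             # Douglas-Rachford update
    return soft_threshold(y, gamma * lam)

# Toy problem with a sparse ground truth; the support should be recovered.
rng = np.random.default_rng(1)
A = rng.normal(size=(40, 10))
x_true = np.zeros(10)
x_true[[2, 7]] = [1.0, -2.0]
b = A @ x_true
x_hat = douglas_rachford_lasso(A, b, lam=0.1)
print(sorted(np.argsort(-np.abs(x_hat))[:2].tolist()))  # -> [2, 7]
```

The appeal of the splitting is that each term is handled only through its own prox, which is what a stochastic variant can then subsample.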
This paper presents a fast approach for penalized least squares (LS) regression problems using a 2D Gaussian Markov random field (GMRF) prior.
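As a one-dimensional analogue of the 2D GMRF case, the penalized LS estimate under a quadratic smoothness prior reduces to a single linear solve; the difference operator and weight below are illustrative:

```python
import numpy as np

def gmrf_penalized_ls(y, lam):
    """Penalized least squares with a first-order smoothness (GMRF-type)
    prior, 1D analogue: solve (I + lam * D^T D) x = y for the
    finite-difference operator D."""
    n = len(y)
    D = np.diff(np.eye(n), axis=0)   # (n-1) x n first-difference matrix
    return np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)

# Noisy ramp: the quadratic prior yields a smoother estimate in one solve.
rng = np.random.default_rng(0)
y = np.linspace(0.0, 1.0, 50) + 0.1 * rng.normal(size=50)
x = gmrf_penalized_ls(y, lam=10.0)
print(np.var(np.diff(x)) < np.var(np.diff(y)))  # -> True (smoother)
```

In 2D the same structure holds with a sparse image-gradient operator, and exploiting that sparsity is what enables fast solvers.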
We demonstrate the potential of the proposed approach through comparisons with state-of-the-art techniques that are specifically tailored to signal recovery in the presence of mixed Poisson-Gaussian noise.