In this work we model the multivariate temporal dynamics of time series via an autoregressive deep learning model, where the data distribution is represented by a conditioned normalizing flow.
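The idea of an autoregressive model whose emission distribution is a conditioned normalizing flow can be sketched minimally as follows. All shapes, the tanh recurrence, and the affine conditioner are illustrative assumptions, not the paper's architecture: a recurrent state summarizes the past, and a simple affine flow whose scale and shift are functions of that state transforms base Gaussian noise into the next multivariate observation.

```python
import numpy as np

rng = np.random.default_rng(0)

def affine_flow_sample(h, W_s, W_m, dim):
    """Sample from a conditioned affine flow: z ~ N(0, I), x = exp(s(h)) * z + m(h).
    The conditioner mapping state h to flow parameters (W_s, W_m) is a toy stand-in."""
    s = np.tanh(W_s @ h)          # log-scale conditioned on the autoregressive state
    m = W_m @ h                   # shift conditioned on the autoregressive state
    z = rng.standard_normal(dim)  # sample from the base distribution
    return np.exp(s) * z + m

# Minimal autoregressive loop: a simple RNN updates h from the previous
# observation, and the flow conditioned on h emits the next multivariate value.
dim, hid, T = 3, 8, 5
W_h = rng.normal(scale=0.1, size=(hid, hid))
W_x = rng.normal(scale=0.1, size=(hid, dim))
W_s = rng.normal(scale=0.1, size=(dim, hid))
W_m = rng.normal(scale=0.1, size=(dim, hid))

h = np.zeros(hid)
x = np.zeros(dim)
series = []
for _ in range(T):
    h = np.tanh(W_h @ h + W_x @ x)          # autoregressive state update
    x = affine_flow_sample(h, W_s, W_m, dim)
    series.append(x)

series = np.stack(series)  # shape (T, dim): one multivariate sample per step
```

In a full model the affine conditioner would be replaced by a deeper, invertible flow trained by maximum likelihood; the sketch only shows the conditioning structure.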
We introduce a hierarchical Bayesian approach to tackle the challenging problem of size recommendation in e-commerce fashion.
The library widens the scope of dictionary learning beyond standard approaches such as ICA, NMF, or L1 sparse coding.
To alleviate this problem, we propose a deep learning based content-collaborative methodology for personalized size and fit recommendation.
This helps the bandit framework select the best agents early, since these rewards are smoother and less sparse than the environment reward.
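The agent-selection mechanism can be illustrated with a standard UCB1 bandit over candidate agents. The per-agent quality values, the Gaussian noise model, and all constants are hypothetical; the point is that a dense, low-variance auxiliary reward lets the running means separate the agents after few pulls.

```python
import numpy as np

rng = np.random.default_rng(3)

def ucb_select(counts, means, t, c=2.0):
    """UCB1 arm selection over candidate agents: play each untried arm once,
    then pick the arm maximizing mean + exploration bonus."""
    untried = np.flatnonzero(counts == 0)
    if untried.size:
        return int(untried[0])
    bonus = np.sqrt(c * np.log(t) / counts)
    return int(np.argmax(means + bonus))

n_agents, steps = 3, 200
true_quality = np.array([0.2, 0.5, 0.8])  # hypothetical per-agent mean reward
counts = np.zeros(n_agents)
means = np.zeros(n_agents)
for t in range(1, steps + 1):
    a = ucb_select(counts, means, t)
    # Smooth, dense auxiliary reward: every pull is informative, so the
    # bandit concentrates on the strongest agent early.
    r = true_quality[a] + 0.1 * rng.standard_normal()
    counts[a] += 1
    means[a] += (r - means[a]) / counts[a]  # incremental mean update
```

With a sparse environment reward most pulls would return zero and the means would stay uninformative for much longer, which is the contrast the sentence above draws.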
This work explores maximum likelihood optimization of neural networks through hypernetworks.
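A hypernetwork is a network that outputs the weights of another (target) network. The following is a minimal sketch under assumed shapes, not the architecture explored in the work: an embedding vector is mapped to the flattened weight matrix of a target linear layer, which is then used for a forward pass.

```python
import numpy as np

rng = np.random.default_rng(1)

def hypernetwork(z, W1, W2, target_shape):
    """Tiny hypernetwork: map an embedding z through one hidden layer to the
    flattened weights of a target layer, then reshape them for use."""
    h = np.tanh(W1 @ z)
    flat = W2 @ h
    return flat.reshape(target_shape)

emb, hid = 4, 16
target_shape = (3, 5)  # target layer maps 5 inputs to 3 outputs
W1 = rng.normal(scale=0.1, size=(hid, emb))
W2 = rng.normal(scale=0.1, size=(int(np.prod(target_shape)), hid))

z = rng.standard_normal(emb)                       # task or layer embedding
W_target = hypernetwork(z, W1, W2, target_shape)   # generated weights

x = rng.standard_normal(5)
y = W_target @ x  # forward pass of the target layer with generated weights
```

Training by maximum likelihood would backpropagate the target network's loss through the generated weights into the hypernetwork's own parameters.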
As an example model, we use spike-and-slab sparse coding for V1 processing and combine latent subspace selection with Gibbs sampling (select-and-sample).
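The spike-and-slab generative model can be sketched as follows; dimensionalities and noise level are illustrative assumptions. Each latent is exactly zero with some probability (the spike) and Gaussian otherwise (the slab), and the observation is a noisy linear combination of the sparse latents through a dictionary, as in standard sparse coding.

```python
import numpy as np

rng = np.random.default_rng(2)

def spike_and_slab_sample(pi, mu, sigma, H):
    """Draw latents from a spike-and-slab prior: each latent is active
    ("slab", Gaussian) with probability pi and zero ("spike") otherwise."""
    b = rng.random(H) < pi             # binary spike/activation variables
    z = rng.normal(mu, sigma, size=H)  # slab values
    return b * z                       # inactive latents are exactly zero

# Generative sketch: observed data y is a noisy linear combination of the
# sparse latents s through a dictionary W (the sparse-coding model).
H, D = 10, 4
W = rng.normal(size=(D, H))
s = spike_and_slab_sample(pi=0.2, mu=0.0, sigma=1.0, H=H)
y = W @ s + 0.01 * rng.standard_normal(D)
```

Inference in such a model is what select-and-sample addresses: a selection step restricts attention to a promising latent subspace before Gibbs sampling over the binary and continuous latents.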
This results in powerful but very complex models that are hard to train and that require additional labels for optimal parameter tuning; such labels are often unavailable when labeled data is scarce.
We investigate two approaches to optimize the parameters of spike-and-slab sparse coding: a novel truncated EM approach and, for comparison, an approach based on standard factored variational distributions.