In this paper, we study the trade-offs of different inference approaches for
Bayesian matrix factorisation methods, which are commonly used for predicting
missing values and for finding patterns in data. In particular, we
consider Bayesian nonnegative variants of matrix factorisation and
tri-factorisation, and compare non-probabilistic inference, Gibbs sampling,
variational Bayesian inference, and a maximum-a-posteriori approach. The
variational approach is new for the Bayesian nonnegative models. We compare
their convergence, and robustness to noise and sparsity of the data, on both
synthetic and real-world datasets. Furthermore, we extend the models with the
Bayesian automatic relevance determination prior, allowing them to perform
automatic model selection, and demonstrate the efficiency of this approach.
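To make the setting concrete, the following is a minimal sketch of the non-probabilistic baseline the comparison includes: nonnegative matrix factorisation fitted with Lee–Seung multiplicative updates. This is an illustrative example only, not the paper's Bayesian models; the function name `nmf` and all parameter choices here are assumptions for the sketch.

```python
import numpy as np

def nmf(R, K, iters=200, eps=1e-9, seed=0):
    """Illustrative non-probabilistic NMF: factorise a nonnegative
    I x J matrix R as U @ V with U, V >= 0, via multiplicative updates."""
    rng = np.random.default_rng(seed)
    I, J = R.shape
    U = rng.random((I, K)) + eps  # nonnegative initialisation
    V = rng.random((K, J)) + eps
    for _ in range(iters):
        # Lee-Seung updates: each step keeps entries nonnegative and
        # does not increase the squared reconstruction error.
        U *= (R @ V.T) / (U @ V @ V.T + eps)
        V *= (U.T @ R) / (U.T @ U @ V + eps)
    return U, V

# Usage: recover an exactly low-rank nonnegative matrix.
true_U = np.random.default_rng(1).random((20, 3))
true_V = np.random.default_rng(2).random((3, 15))
R = true_U @ true_V
U, V = nmf(R, K=3)
err = np.abs(R - U @ V).mean()  # small for an exact rank-3 matrix
```

The Bayesian variants studied in the paper place priors (e.g. exponential) on U and V and replace these point updates with Gibbs sampling, variational inference, or MAP estimation; the missing-value prediction task corresponds to fitting only the observed entries of R.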