Tuning hyperparameters for unsupervised learning problems is difficult in general due to the lack of ground truth for validation.
In this paper, we provide a framework with provable guarantees for selecting hyperparameters in a number of distinct models.
To extend the algorithm to high-dimensional settings, we construct a proposal with forward and reverse moves based on stochastic gradients and show that this construction yields reasonable acceptance probabilities.
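As a toy illustration of a forward/reverse proposal driven by stochastic gradients (a sketch, not the paper's actual construction), the following uses a Langevin-style move whose drift comes from a minibatch gradient estimate, with the Metropolis–Hastings correction evaluating both the forward and the reverse proposal densities. The Gaussian target, batch size, and step size `eps` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target: posterior of a d-dimensional Gaussian mean under a flat
# prior; the log-density decomposes over the N data points, so its
# gradient admits an unbiased minibatch (stochastic-gradient) estimate.
N, d = 1000, 5
data = rng.normal(loc=1.0, scale=1.0, size=(N, d))

def log_post(theta):
    # log p(theta | data) up to an additive constant
    return -0.5 * np.sum((data - theta) ** 2)

def stoch_grad(theta, batch=100):
    # unbiased minibatch estimate of grad log p(theta | data)
    idx = rng.choice(N, size=batch, replace=False)
    return (N / batch) * np.sum(data[idx] - theta, axis=0)

def log_q(to, frm, g, eps):
    # Gaussian density (up to a constant) of the Langevin move frm -> to
    mean = frm + 0.5 * eps * g
    return -np.sum((to - mean) ** 2) / (2.0 * eps)

eps, n_iter = 1e-4, 2000
theta, accepts = np.zeros(d), 0
for _ in range(n_iter):
    g_fwd = stoch_grad(theta)                      # forward move's drift
    prop = theta + 0.5 * eps * g_fwd + np.sqrt(eps) * rng.normal(size=d)
    g_rev = stoch_grad(prop)                       # reverse move's drift
    log_alpha = (log_post(prop) - log_post(theta)
                 + log_q(theta, prop, g_rev, eps)  # reverse density
                 - log_q(prop, theta, g_fwd, eps)) # forward density
    if np.log(rng.uniform()) < log_alpha:
        theta, accepts = prop, accepts + 1
```

With a small step size the proposal stays in a region where the gradient drift is informative, so the acceptance probability remains reasonable even though the drift is only a noisy estimate of the true gradient.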
Variational approximation has recently seen wide use in large-scale Bayesian inference; its simplest variant imposes a mean-field assumption to approximate complicated latent structures.
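A minimal sketch of what the mean-field assumption buys and loses, using the classic bivariate-Gaussian example (the correlated target and coordinate-ascent updates are illustrative assumptions, not a model from the source): the fully factorized family recovers the marginal locations but cannot represent the correlation.

```python
import numpy as np

# Target: bivariate Gaussian with correlation rho, standing in for a
# "complicated latent structure". The mean-field family
# q(z) = q1(z1) q2(z2) factorizes and so cannot capture rho.
mu = np.array([1.0, -1.0])
rho = 0.8
Sigma = np.array([[1.0, rho], [rho, 1.0]])
Lam = np.linalg.inv(Sigma)  # precision matrix of the target

# Coordinate-ascent variational inference (CAVI) for this target:
# each optimal factor q_i is Gaussian with variance 1 / Lam[i, i]
# and a mean that depends on the other factor's current mean.
m = np.zeros(2)
for _ in range(50):
    m[0] = mu[0] - Lam[0, 1] / Lam[0, 0] * (m[1] - mu[1])
    m[1] = mu[1] - Lam[1, 0] / Lam[1, 1] * (m[0] - mu[0])

# At the fixed point the factor means match the true means, but the
# factor variances 1 / Lam[i, i] understate the marginal variances
# Sigma[i, i]: the usual mean-field compactness of the approximation.
```

The updates converge geometrically (contraction factor `rho**2` here), which is why a few dozen sweeps suffice in this toy example.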
However, one drawback is that most community detection methods take the exchangeability of the nodes in the network for granted, whereas the nodes in this case, i.e., the positions on the chromosomes, are not exchangeable.