To better understand BP in general graphs, we derive an interpretable belief propagation algorithm motivated by the minimization of a localized $\alpha$-divergence.
In this paper, we study the characteristics of the dominant interference power under directional reception in a random network modelled by a Poisson point process.
In the proposed GenHMM, each hidden state of the HMM is associated with a neural-network-based generative model that admits tractable and efficient computation of the exact likelihood.
Belief propagation (BP) performs exact inference in loop-free graphs, but its performance can be poor in graphs with loops, and its solutions in that setting remain poorly understood.
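The exactness of BP on loop-free graphs can be checked directly on a small example. The sketch below runs standard sum-product message passing on a hypothetical 3-node binary chain MRF (not the paper's $\alpha$-divergence variant); the potentials `phi` and `psi` are illustrative, and the resulting beliefs are compared against brute-force marginals.

```python
import numpy as np

# Hypothetical 3-node chain MRF with binary variables; potentials are illustrative.
phi = [np.array([0.7, 0.3]),    # unary potential at node 0
       np.array([0.5, 0.5]),    # node 1
       np.array([0.2, 0.8])]    # node 2
psi = np.array([[1.0, 0.5],     # pairwise potential, shared by both edges
                [0.5, 1.0]])

# fwd[i]: message from node i-1 into node i; bwd[i]: message from node i+1 into i.
fwd = [np.ones(2) for _ in range(3)]
bwd = [np.ones(2) for _ in range(3)]
for i in range(2):                       # left-to-right pass
    fwd[i + 1] = psi.T @ (phi[i] * fwd[i])
for i in range(2, 0, -1):                # right-to-left pass
    bwd[i - 1] = psi @ (phi[i] * bwd[i])

# Beliefs: local potential times incoming messages, normalized.
beliefs = [phi[i] * fwd[i] * bwd[i] for i in range(3)]
beliefs = [b / b.sum() for b in beliefs]

# Brute-force marginals from the full joint, for verification.
joint = np.einsum('a,b,c,ab,bc->abc', phi[0], phi[1], phi[2], psi, psi)
joint /= joint.sum()
exact = [joint.sum(axis=(1, 2)), joint.sum(axis=(0, 2)), joint.sum(axis=(0, 1))]
```

On this chain the beliefs match the exact marginals; in a graph with loops, the same message updates would generally yield only approximate marginals.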
We investigate the use of entropy-regularized optimal transport (EOT) cost in developing generative models to learn implicit distributions.
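The EOT cost between two discrete distributions is commonly computed with Sinkhorn's alternating-scaling iterations. The sketch below is a minimal illustration on toy histograms, not the paper's generative-model training loop; the function name, step counts, and regularization strength `eps` are illustrative choices.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, iters=200):
    """Entropy-regularized OT between histograms a, b with cost matrix C.
    Returns the transport plan P and the transport cost <P, C>."""
    K = np.exp(-C / eps)                 # Gibbs kernel from the cost matrix
    u = np.ones_like(a)
    for _ in range(iters):               # alternating scaling (Sinkhorn) updates
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]      # approximate regularized transport plan
    return P, float((P * C).sum())

# Toy example: two uniform histograms over 4 points on a line.
x = np.linspace(0.0, 1.0, 4)
C = (x[:, None] - x[None, :]) ** 2       # squared-distance ground cost
a = b = np.full(4, 0.25)
P, cost = sinkhorn(a, b, C)
```

Because of the entropy term, the plan spreads some mass off the diagonal even for identical marginals, so the regularized cost is small but nonzero; as `eps` shrinks, the cost approaches the unregularized OT cost.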
We expect the local estimate at each node to improve quickly and converge, limiting the communication of estimates between nodes and reducing the processing time.
The algorithm is iterative and exchanges intermediate estimates of a sparse signal over a network.
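One way such an iterative exchange can work is a diffusion-style scheme: each node takes a local sparsity-promoting step (here, iterative soft thresholding) on its own measurements and then averages its estimate with its neighbors'. The sketch below is a generic illustration under assumed parameters (4 nodes on a ring, a 20-dimensional 3-sparse signal, step size and threshold chosen for the toy data), not the specific algorithm of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: K nodes, each with its own measurements y_k = A_k x + noise
# of a common sparse signal x; estimates are exchanged over a ring network.
n, m, K = 20, 10, 4
x_true = np.zeros(n)
x_true[[2, 7, 15]] = [1.0, -2.0, 1.5]
A = [rng.standard_normal((m, n)) / np.sqrt(m) for _ in range(K)]
y = [A[k] @ x_true + 0.01 * rng.standard_normal(m) for k in range(K)]
neighbors = {k: [(k - 1) % K, (k + 1) % K] for k in range(K)}  # ring topology

def soft(z, t):
    """Soft thresholding: promotes sparsity in the estimate."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

xk = [np.zeros(n) for _ in range(K)]     # per-node intermediate estimates
step, lam = 0.1, 0.02
for _ in range(500):
    # local gradient step on the node's own data, followed by thresholding
    xk = [soft(xk[k] - step * A[k].T @ (A[k] @ xk[k] - y[k]), step * lam)
          for k in range(K)]
    # exchange intermediate estimates and average with neighbors (consensus step)
    xk = [(xk[k] + sum(xk[j] for j in neighbors[k])) / 3.0 for k in range(K)]
```

The consensus averaging lets each node benefit from measurements it never observed, which is what allows the local estimates to converge with limited per-iteration communication.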