Search Results for author: Lars K. Rasmussen

Found 9 papers, 5 papers with code

$\alpha$ Belief Propagation for Approximate Inference

1 code implementation • 27 Jun 2020 • Dong Liu, Minh Thành Vu, Zuxing Li, Lars K. Rasmussen

To gain a better understanding of BP in general graphs, we derive an interpretable belief propagation algorithm that is motivated by minimization of a localized $\alpha$-divergence.
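As a rough illustration of the quantity being minimized (a sketch, not the paper's algorithm), Amari's $\alpha$-divergence between two discrete distributions interpolates between the two directions of KL divergence as $\alpha$ varies:

```python
import numpy as np

def alpha_divergence(p, q, alpha):
    """Amari's alpha-divergence between discrete distributions p and q.

    D_alpha(p || q) = (1 - sum_i p_i^alpha * q_i^(1-alpha)) / (alpha * (1 - alpha))
    Approaches KL(p || q) as alpha -> 1 and KL(q || p) as alpha -> 0.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return (1.0 - np.sum(p**alpha * q**(1.0 - alpha))) / (alpha * (1.0 - alpha))

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.4, 0.4, 0.2])
kl_pq = np.sum(p * np.log(p / q))
print(alpha_divergence(p, q, 0.999))  # close to KL(p || q)
```

Choosing $\alpha$ therefore trades off which direction of mismatch the approximation penalizes, which is the knob the localized $\alpha$-divergence view exposes.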

On Dominant Interference in Random Networks and Communication Reliability

1 code implementation • 3 Mar 2020 • Dong Liu, Baptiste Cavarec, Lars K. Rasmussen, Jing Yue

In this paper, we study the characteristics of dominant interference power with directional reception in a random network modelled by a Poisson Point Process.

Information Theory • Signal Processing

Powering Hidden Markov Model by Neural Network based Generative Models

1 code implementation • 13 Oct 2019 • Dong Liu, Antoine Honoré, Saikat Chatterjee, Lars K. Rasmussen

In the proposed GenHMM, each HMM hidden state is associated with a neural network based generative model that offers a tractable exact likelihood and efficient likelihood computation.
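The per-state densities plug into the standard HMM forward recursion. The sketch below uses plain Gaussians as stand-ins for the neural generative models (any callable returning p(x | state) would do); the recursion itself is the textbook scaled forward algorithm, not code from the paper.

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def forward_log_likelihood(obs, pi, A, emission_fns):
    """Scaled HMM forward algorithm; emission_fns[k](x) returns p(x | state k).

    In GenHMM these per-state densities would be neural generative models
    with tractable exact likelihoods; Gaussians stand in here.
    """
    alpha = pi * np.array([f(obs[0]) for f in emission_fns])
    log_like = 0.0
    for x in obs[1:]:
        c = alpha.sum()              # rescale to avoid numerical underflow
        log_like += np.log(c)
        alpha = (alpha / c) @ A * np.array([f(x) for f in emission_fns])
    return log_like + np.log(alpha.sum())

pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1], [0.2, 0.8]])
emissions = [lambda x: gaussian_pdf(x, 0.0, 1.0),
             lambda x: gaussian_pdf(x, 3.0, 1.0)]
obs = [0.1, -0.2, 2.9, 3.1]
print(forward_log_likelihood(obs, pi, A, emissions))
```

Because each emission density is evaluated exactly, the total data log-likelihood is exact as well, which is what makes maximum-likelihood training of such a model well defined.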

$\alpha$ Belief Propagation as Fully Factorized Approximation

no code implementations • 23 Aug 2019 • Dong Liu, Nima N. Moghadam, Lars K. Rasmussen, Jinliang Huang, Saikat Chatterjee

Belief propagation (BP) performs exact inference in loop-free graphs, but its performance can be poor in graphs with loops, and its solutions in that setting are not well understood.
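The exactness on loop-free graphs is easy to see on a toy example. Below, standard sum-product messages on a three-variable chain reproduce the brute-force marginal exactly; this is generic BP for illustration, not the paper's $\alpha$-BP variant.

```python
import numpy as np

# Pairwise model on a binary chain x1 - x2 - x3:
# p(x) ∝ phi1(x1) phi2(x2) phi3(x3) psi12(x1, x2) psi23(x2, x3)
phi = [np.array([1.0, 2.0]), np.array([1.0, 1.0]), np.array([3.0, 1.0])]
psi12 = np.array([[2.0, 1.0], [1.0, 2.0]])
psi23 = np.array([[1.0, 3.0], [3.0, 1.0]])

# Sum-product messages into x2 (exact because the chain is a tree).
m12 = (phi[0][:, None] * psi12).sum(axis=0)   # message x1 -> x2
m32 = (psi23 * phi[2][None, :]).sum(axis=1)   # message x3 -> x2
belief2 = phi[1] * m12 * m32
belief2 /= belief2.sum()

# Brute-force marginal of x2 for comparison.
joint = (phi[0][:, None, None] * phi[1][None, :, None] * phi[2][None, None, :]
         * psi12[:, :, None] * psi23[None, :, :])
marg2 = joint.sum(axis=(0, 2))
marg2 /= marg2.sum()
print(belief2, marg2)  # identical on a tree
```

On a graph with cycles the same message updates are iterated until (hopefully) convergence, and the resulting beliefs are only approximations, which is the regime the fully factorized analysis targets.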

Entropy-regularized Optimal Transport Generative Models

no code implementations • 16 Nov 2018 • Dong Liu, Minh Thành Vu, Saikat Chatterjee, Lars K. Rasmussen

We investigate the use of entropy-regularized optimal transport (EOT) cost in developing generative models to learn implicit distributions.

Image Generation
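For discrete distributions, the EOT cost is standardly computed with Sinkhorn iterations. The sketch below shows that computation only; the generative-model training loop built on top of it is not reproduced here.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iter=500):
    """Entropy-regularized OT between discrete distributions a and b with
    cost matrix C, solved by Sinkhorn's matrix-scaling iterations."""
    K = np.exp(-C / eps)          # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]   # regularized optimal transport plan
    return (P * C).sum(), P

a = np.array([0.5, 0.5])
b = np.array([0.5, 0.5])
C = np.array([[0.0, 1.0], [1.0, 0.0]])
cost, P = sinkhorn(a, b, C)
print(cost)  # near 0: mass stays in place when moving it costs extra
```

The entropy term makes the plan smooth and the iterations fast and differentiable, which is what makes EOT attractive as a training loss for implicit generative models.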

Locally Convex Sparse Learning over Networks

no code implementations • 31 Mar 2018 • Ahmed Zaki, Saikat Chatterjee, Partha P. Mitra, Lars K. Rasmussen

We expect the local estimate at each node to improve quickly and converge, limiting the need to communicate estimates between nodes and reducing processing time.

Sparse Learning
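A toy picture of estimate exchange over a network (plain consensus averaging on a ring, standing in for the paper's algorithm): each node repeatedly mixes its estimate with its neighbours', and the local estimates converge to a common value.

```python
import numpy as np

# Ring network of 5 nodes; each node holds a scalar local estimate and
# averages with its two neighbours each round (consensus exchange).
x = np.array([1.0, 3.0, 5.0, 7.0, 9.0])   # initial local estimates
target = x.mean()                          # value they converge to

W = np.zeros((5, 5))
for i in range(5):
    W[i, i] = 1 / 3
    W[i, (i - 1) % 5] = 1 / 3
    W[i, (i + 1) % 5] = 1 / 3              # doubly stochastic mixing matrix

for _ in range(100):
    x = W @ x                              # one round of neighbour exchange
print(x)  # all entries close to the network-wide average (5.0)
```

Because the mixing matrix is doubly stochastic and the ring is connected, the disagreement between nodes shrinks geometrically; fast local convergence is what caps the communication cost in practice.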

Estimate Exchange over Network is Good for Distributed Hard Thresholding Pursuit

no code implementations • 22 Sep 2017 • Ahmed Zaki, Partha P. Mitra, Lars K. Rasmussen, Saikat Chatterjee

The algorithm is iterative and exchanges intermediate estimates of a sparse signal over a network.
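The core building block of hard thresholding pursuit is the operator that keeps the k largest-magnitude entries of an estimate. A minimal single-node sketch (one gradient step followed by thresholding; the distributed exchange step is not shown):

```python
import numpy as np

def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x, zero out the rest."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

# One HTP-style iteration for y = A @ x with a 3-sparse x (toy example).
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 100)) / np.sqrt(40)
x_true = np.zeros(100)
x_true[[5, 17, 60]] = [2.0, -1.5, 1.0]
y = A @ x_true

x = np.zeros(100)
x = hard_threshold(x + A.T @ (y - A @ x), k=3)   # gradient step + threshold
support = np.flatnonzero(x)
print(support)  # candidate support of the sparse signal after one step
```

In the distributed version, nodes would exchange such intermediate estimates over the network between iterations, which is the mechanism the paper analyzes.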
