Search Results for author: Xiangming Meng

Found 16 papers, 5 papers with code

Average case analysis of Lasso under ultra-sparse conditions

no code implementations · 25 Feb 2023 · Koki Okajima, Xiangming Meng, Takashi Takahashi, Yoshiyuki Kabashima

The obtained bound for perfect support recovery is a generalization of that given in previous literature, which only considers the case of Gaussian noise and diverging $d$.

QCM-SGM+: Improved Quantized Compressed Sensing With Score-Based Generative Models

2 code implementations · 2 Feb 2023 · Xiangming Meng, Yoshiyuki Kabashima

In practical compressed sensing (CS), the obtained measurements typically necessitate quantization to a limited number of bits prior to transmission or storage.

Bayesian Inference · Quantization

Diffusion Model Based Posterior Sampling for Noisy Linear Inverse Problems

2 code implementations · 20 Nov 2022 · Xiangming Meng, Yoshiyuki Kabashima

We consider the ubiquitous linear inverse problems with additive Gaussian noise and propose an unsupervised sampling approach called diffusion model based posterior sampling (DMPS) to reconstruct the unknown signal from noisy linear measurements.

Colorization · Deblurring · +2

Quantized Compressed Sensing with Score-based Generative Models

3 code implementations · 2 Nov 2022 · Xiangming Meng, Yoshiyuki Kabashima

We consider the general problem of recovering a high-dimensional signal from noisy quantized measurements.
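The measurement model behind this abstract can be illustrated with a small sketch. This is not the paper's recovery algorithm (which uses score-based generative models); it is only a toy forward model, with all dimensions, the noise level, and the `quantize` helper chosen here for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 200, 80, 10          # signal dim, measurements, nonzeros (illustrative)
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian sensing matrix
y = A @ x + 0.01 * rng.standard_normal(m)      # noisy linear measurements

def quantize(y, bits=3, rng_val=3.0):
    """Uniform mid-rise quantizer clipped to [-rng_val, rng_val]."""
    levels = 2 ** bits
    step = 2 * rng_val / levels
    idx = np.clip(np.floor((y + rng_val) / step), 0, levels - 1)
    return -rng_val + (idx + 0.5) * step

y_q = quantize(y)              # the recovery algorithm only sees these few-bit values
```

A recovery method for this setting must reconstruct `x` from `y_q` and `A` alone, which is what makes the quantized problem harder than standard CS.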


A Unitary Transform Based Generalized Approximate Message Passing

1 code implementation · 17 Oct 2022 · Jiang Zhu, Xiangming Meng, Xupeng Lei, Qinghua Guo

We consider the problem of recovering an unknown signal ${\mathbf x}\in {\mathbb R}^n$ from general nonlinear measurements obtained through a generalized linear model (GLM), i.e., ${\mathbf y}= f\left({\mathbf A}{\mathbf x}+{\mathbf w}\right)$, where $f(\cdot)$ is a componentwise nonlinear function.
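The GLM above covers many observation models through the choice of the componentwise $f(\cdot)$. A minimal sketch of a few common instances (the dimensions and noise level here are arbitrary, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 100, 300
x = rng.standard_normal(n)
A = rng.standard_normal((m, n)) / np.sqrt(n)
w = 0.05 * rng.standard_normal(m)

# Different componentwise nonlinearities f(.) give different GLM instances:
glm_outputs = {
    "linear": A @ x + w,                 # f = identity: standard linear model
    "sign":   np.sign(A @ x + w),        # f = sign: 1-bit quantized measurements
    "relu":   np.maximum(A @ x + w, 0),  # f = max(.,0): censored measurements
}
```

Message-passing algorithms such as GAMP handle all of these cases uniformly, since only the scalar output channel defined by $f$ changes between them.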

Exact Solutions of a Deep Linear Network

no code implementations · 10 Feb 2022 · Liu Ziyin, Botao Li, Xiangming Meng

This work finds the analytical expression of the global minima of a deep linear network with weight decay and stochastic neurons, a fundamental model for understanding the landscape of neural networks.

Stochastic Neural Networks with Infinite Width are Deterministic

no code implementations · 30 Jan 2022 · Liu Ziyin, Hanlin Zhang, Xiangming Meng, Yuting Lu, Eric Xing, Masahito Ueda

This work theoretically studies stochastic neural networks, a widely used class of neural networks.

On Model Selection Consistency of Lasso for High-Dimensional Ising Models

no code implementations · 16 Oct 2021 · Xiangming Meng, Tomoyuki Obuchi, Yoshiyuki Kabashima

Moreover, we provide a rigorous proof of the model selection consistency of Lasso with post-thresholding for general tree-like graphs in the paramagnetic phase without further assumptions on the dependency and incoherence conditions.

Model Selection · regression · +1

Ising Model Selection Using $\ell_{1}$-Regularized Linear Regression: A Statistical Mechanics Analysis

no code implementations · NeurIPS 2021 · Xiangming Meng, Tomoyuki Obuchi, Yoshiyuki Kabashima

We theoretically analyze the typical learning performance of $\ell_{1}$-regularized linear regression ($\ell_1$-LinR) for Ising model selection using the replica method from statistical mechanics.
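The idea of ℓ1-LinR-based structure learning can be sketched on a toy example: regress one spin on the rest with an ℓ1-penalized linear estimator, and read off the neighbors from the nonzero coefficients. This is only an illustrative chain-graph toy with made-up parameters (`N`, `M`, `beta`, `lam`), solved by plain ISTA, not the paper's replica analysis:

```python
import numpy as np

rng = np.random.default_rng(2)
N, M, beta = 8, 5000, 0.6
J = np.zeros((N, N))
for i in range(N - 1):               # chain graph: spin i couples to i+1
    J[i, i + 1] = J[i + 1, i] = 1.0

# Gibbs sampling of spin configurations s in {-1,+1}^N
s = rng.choice([-1.0, 1.0], size=N)
samples = np.empty((M, N))
for t in range(M):
    for i in range(N):
        h = beta * J[i] @ s
        s[i] = 1.0 if rng.random() < 1.0 / (1.0 + np.exp(-2 * h)) else -1.0
    samples[t] = s

# l1-regularized linear regression of spin 0 on the others, via ISTA
X, y = samples[:, 1:], samples[:, 0]
lam, w = 0.05, np.zeros(N - 1)
L = np.linalg.norm(X, 2) ** 2 / M    # Lipschitz constant of the smooth part
for _ in range(500):
    g = X.T @ (X @ w - y) / M
    z = w - g / L
    w = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)

# Post-thresholding: surviving indices are the estimated neighbors of spin 0
est_neighbors = np.flatnonzero(np.abs(w) > 1e-3) + 1
```

On this chain, spin 0's only true neighbor is spin 1, and the largest coefficient lands there; the quadratic (linear-regression) loss stands in for the logistic one, which is exactly the simplification the paper analyzes.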

Model Selection · regression

Unitary Approximate Message Passing for Sparse Bayesian Learning

no code implementations · 25 Jan 2021 · Man Luo, Qinghua Guo, Ming Jin, Yonina C. Eldar, Defeng Huang, Xiangming Meng

Sparse Bayesian learning (SBL) can be implemented with low complexity based on the approximate message passing (AMP) algorithm.

Variational Inference

Structure Learning in Inverse Ising Problems Using $\ell_2$-Regularized Linear Estimator

no code implementations · 19 Aug 2020 · Xiangming Meng, Tomoyuki Obuchi, Yoshiyuki Kabashima

Further, to access the underdetermined region $M < N$, we examine the effect of the $\ell_2$ regularization, and find that biases appear in all the coupling estimates, preventing the perfect identification of the network structure.


Training Restricted Boltzmann Machines with Binary Synapses using the Bayesian Learning Rule

no code implementations · 9 Jul 2020 · Xiangming Meng

However, training RBMs with binary synapses is challenging due to the discrete nature of synapses.

Variational Inference

Training Binary Neural Networks using the Bayesian Learning Rule

4 code implementations · ICML 2020 · Xiangming Meng, Roman Bachmann, Mohammad Emtiyaz Khan

Our work provides a principled approach for training binary neural networks which justifies and extends existing approaches.

Continual Learning

Vector Approximate Message Passing Algorithm for Structured Perturbed Sensing Matrix

no code implementations · 26 Aug 2018 · Jiang Zhu, Qi Zhang, Xiangming Meng, Zhiwei Xu

In this paper, we consider a general form of noisy compressive sensing (CS) where the sensing matrix is not precisely known.

Signal Processing

A Unified Bayesian Inference Framework for Generalized Linear Models

no code implementations · 29 Dec 2017 · Xiangming Meng, Sheng Wu, Jiang Zhu

In this letter, we present a unified Bayesian inference framework for generalized linear models (GLM) which iteratively reduces the GLM problem to a sequence of standard linear model (SLM) problems.

Information Theory

Approximate Message Passing with Nearest Neighbor Sparsity Pattern Learning

no code implementations · 4 Jan 2016 · Xiangming Meng, Sheng Wu, Linling Kuang, Defeng Huang, Jianhua Lu

We consider the problem of recovering clustered sparse signals with no prior knowledge of the sparsity pattern.
