Search Results for author: Yoshiyuki Kabashima

Found 24 papers, 6 papers with code

Diffusion Model Based Posterior Sampling for Noisy Linear Inverse Problems

2 code implementations • 20 Nov 2022 • Xiangming Meng, Yoshiyuki Kabashima

We consider the ubiquitous linear inverse problems with additive Gaussian noise and propose an unsupervised general-purpose sampling approach called diffusion model based posterior sampling (DMPS) to reconstruct the unknown signal from noisy linear measurements.

Colorization • Deblurring • +2
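To illustrate the posterior-sampling formulation above: DMPS uses a learned diffusion-model score as the prior, but in a toy setting where the prior is an analytic Gaussian $N(0, I)$, the posterior score of $y = Ax + \sigma n$ is known in closed form, and plain Langevin dynamics can sample from $p(x \mid y)$. The sketch below uses this Gaussian stand-in (all dimensions, step sizes, and the `posterior_score` helper are illustrative, not the paper's algorithm) and checks the sampler against the closed-form posterior mean.

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy linear inverse problem y = A x + sigma * n, with a Gaussian
# prior x ~ N(0, I) standing in for a learned diffusion-model prior.
d, m, sigma = 5, 3, 0.5
A = rng.normal(size=(m, d)) / np.sqrt(d)
x_true = rng.normal(size=d)
y = A @ x_true + sigma * rng.normal(size=m)

def posterior_score(x):
    # grad log p(x | y) = A^T (y - A x) / sigma^2 - x  for the N(0, I) prior
    return A.T @ (y - A @ x) / sigma**2 - x

# Unadjusted Langevin dynamics on the posterior
eps, n_steps, burn = 0.02, 60000, 5000
x = np.zeros(d)
samples = []
for t in range(n_steps):
    x = x + 0.5 * eps * posterior_score(x) + np.sqrt(eps) * rng.normal(size=d)
    if t >= burn:
        samples.append(x.copy())
langevin_mean = np.mean(samples, axis=0)

# Closed-form Gaussian posterior mean for comparison
cov = np.linalg.inv(np.eye(d) + A.T @ A / sigma**2)
exact_mean = cov @ A.T @ y / sigma**2
```

The point of the comparison is that once a score function is available (analytic here, learned in DMPS), posterior sampling for linear inverse problems reduces to running score-driven dynamics.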

Quantized Compressed Sensing with Score-Based Generative Models

2 code implementations • 2 Nov 2022 • Xiangming Meng, Yoshiyuki Kabashima

We consider the general problem of recovering a high-dimensional signal from noisy quantized measurements.

Quantization
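The measurement model behind this recovery problem can be sketched as follows (a minimal illustration of 1-bit quantized compressed sensing; the dimensions, sparsity level, and noise scale are assumptions, and this is only the forward model, not the score-based recovery method):

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse high-dimensional signal observed through y = Q(A x + n),
# here with the harshest quantizer: 1-bit (sign) measurements.
n_dim, m_dim, sparsity = 256, 128, 0.1

x = rng.normal(size=n_dim) * (rng.random(n_dim) < sparsity)  # sparse signal
A = rng.normal(size=(m_dim, n_dim)) / np.sqrt(n_dim)          # sensing matrix
noise = 0.01 * rng.normal(size=m_dim)

y = np.sign(A @ x + noise)  # quantized measurements: only signs survive
```

Recovery must then invert this lossy, nonlinear map, which is where a generative prior over `x` earns its keep.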

Compressed sensing radar detectors under the row-orthogonal design model: a statistical mechanics perspective

no code implementations • 30 Sep 2022 • Siqi Na, Tianyao Huang, Yimin Liu, Takashi Takahashi, Yoshiyuki Kabashima, Xiqin Wang

Such a detector can analytically provide the detection threshold for a given false-alarm rate, which is not possible with the conventional CS detector, and its detection performance is proven to be better than that of the traditional LASSO detector.

On Model Selection Consistency of Lasso for High-Dimensional Ising Models

no code implementations • 16 Oct 2021 • Xiangming Meng, Tomoyuki Obuchi, Yoshiyuki Kabashima

Moreover, we provide a rigorous proof of the model selection consistency of Lasso with post-thresholding for general tree-like graphs in the paramagnetic phase without further assumptions on the dependency and incoherence conditions.

Model Selection • regression
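The workhorse here is an $\ell_1$-regularized (Lasso) regression whose surviving coefficients indicate graph neighbors. A minimal sketch of such an estimator, solved by ISTA (iterative soft-thresholding) on a toy linear problem rather than actual Ising samples, with all sizes and the regularization strength chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy sparse regression: 3 true "couplings" among 20 candidates.
M, N, lam = 100, 20, 0.5
X = rng.normal(size=(M, N)) / np.sqrt(M)   # columns roughly unit-norm
w_true = np.zeros(N)
w_true[:3] = [1.0, -1.5, 2.0]
y = X @ w_true + 0.01 * rng.normal(size=M)

def soft_threshold(v, t):
    # proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# ISTA: gradient step on the squared loss, then soft-thresholding.
step = 1.0 / np.linalg.norm(X, 2) ** 2
w = np.zeros(N)
for _ in range(5000):
    grad = X.T @ (X @ w - y)
    w = soft_threshold(w - step * grad, step * lam)
```

After convergence, the nonzero pattern of `w` is the estimated neighborhood; the paper's analysis concerns exactly when this pattern matches the true one.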

Matrix completion based on Gaussian parameterized belief propagation

no code implementations • 1 May 2021 • Koki Okajima, Yoshiyuki Kabashima

We develop a message-passing algorithm for noisy matrix completion problems based on matrix factorization.

Matrix Completion
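The matrix-factorization formulation underlying this entry can be sketched with a generic baseline. The paper derives a Gaussian belief-propagation algorithm; the code below instead uses plain alternating least squares (ALS) on the observed entries, purely to illustrate the problem setup (matrix sizes, rank, observation rate, and the ridge term are all assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)

# Planted low-rank matrix M = U0 V0^T, observed noisily on ~50% of entries.
n, m, r = 40, 30, 3
U0 = rng.normal(size=(n, r))
V0 = rng.normal(size=(m, r))
M = U0 @ V0.T
mask = rng.random((n, m)) < 0.5
Y = np.where(mask, M + 0.01 * rng.normal(size=(n, m)), 0.0)

# ALS: alternately solve ridge-regularized least squares for U and V
# using only the observed entries.
lam = 1e-3
U = rng.normal(size=(n, r))
V = rng.normal(size=(m, r))
for _ in range(50):
    for i in range(n):
        idx = mask[i]
        Vi = V[idx]
        U[i] = np.linalg.solve(Vi.T @ Vi + lam * np.eye(r), Vi.T @ Y[i, idx])
    for j in range(m):
        idx = mask[:, j]
        Uj = U[idx]
        V[j] = np.linalg.solve(Uj.T @ Uj + lam * np.eye(r), Uj.T @ Y[idx, j])

rel_err = np.linalg.norm(U @ V.T - M) / np.linalg.norm(M)
```

Message-passing approaches target the same factorized objective but additionally propagate uncertainty about each factor, rather than point estimates alone.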

Ising Model Selection Using $\ell_{1}$-Regularized Linear Regression: A Statistical Mechanics Analysis

no code implementations • NeurIPS 2021 • Xiangming Meng, Tomoyuki Obuchi, Yoshiyuki Kabashima

We theoretically analyze the typical learning performance of $\ell_{1}$-regularized linear regression ($\ell_1$-LinR) for Ising model selection using the replica method from statistical mechanics.

Model Selection • regression

Structure Learning in Inverse Ising Problems Using $\ell_2$-Regularized Linear Estimator

no code implementations • 19 Aug 2020 • Xiangming Meng, Tomoyuki Obuchi, Yoshiyuki Kabashima

Further, to access the underdetermined region $M < N$, we examine the effect of the $\ell_2$ regularization, and find that biases appear in all the coupling estimates, preventing the perfect identification of the network structure.

regression
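The bias effect described above is the familiar shrinkage of ridge regression, which can be seen directly in a toy linear stand-in for the coupling-estimation problem (sizes and the regularization strength are arbitrary; this is not the paper's replica calculation):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy linear regression: l2 regularization shrinks every estimate
# toward zero, biasing the recovered "couplings".
N, M = 20, 200
w_true = rng.normal(size=N)
X = rng.normal(size=(M, N))
y = X @ w_true + 0.1 * rng.normal(size=M)

# Ridge estimator: (X^T X + lam I)^{-1} X^T y
lam = 50.0
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
```

With `X.T @ X` roughly `M * I`, the estimates are scaled by about `M / (M + lam)`, so every coupling is systematically pulled toward zero, which is why thresholding alone cannot perfectly recover the network in the regularized setting.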

Reconstructing Sparse Signals via Greedy Monte-Carlo Search

no code implementations • 7 Aug 2020 • Kao Hayashi, Tomoyuki Obuchi, Yoshiyuki Kabashima

We propose a Monte-Carlo-based method for reconstructing sparse signals in the formulation of sparse linear regression in a high-dimensional setting.
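A generic greedy Monte-Carlo support search for sparse linear regression can be sketched as follows (an illustration of the idea, not the authors' exact algorithm; all problem sizes, the restart count, and the swap-proposal scheme are assumptions): randomly propose swapping one index of the current support for an outside index, refit by least squares on the proposed support, and accept the move if the residual decreases.

```python
import numpy as np

rng = np.random.default_rng(4)

# Noiseless sparse recovery: y = A x_true with K nonzero entries.
N, M, K = 30, 20, 3
x_true = np.zeros(N)
support_true = rng.choice(N, size=K, replace=False)
x_true[support_true] = rng.normal(size=K) + 2.0  # keep coefficients away from 0
A = rng.normal(size=(M, N)) / np.sqrt(M)
y = A @ x_true

def residual(support):
    # least-squares fit restricted to the candidate support
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    return np.linalg.norm(y - A[:, support] @ coef)

def search(iters=1000):
    support = list(rng.choice(N, size=K, replace=False))
    best = residual(support)
    for _ in range(iters):
        i = int(rng.integers(K))          # support slot to swap out
        cand = int(rng.integers(N))       # candidate index to swap in
        if cand in support:
            continue
        proposal = support.copy()
        proposal[i] = cand
        r = residual(proposal)
        if r < best:                      # greedy: accept only improvements
            support, best = proposal, r
    return support, best

# A few random restarts guard against local minima of the greedy descent.
support, best = min((search() for _ in range(5)), key=lambda sb: sb[1])
```

In the noiseless toy case the search drives the residual to zero exactly when the true support is found, which is the success criterion such MC methods are judged by.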

Learning performance in inverse Ising problems with sparse teacher couplings

no code implementations • 25 Dec 2019 • Alia Abbara, Yoshiyuki Kabashima, Tomoyuki Obuchi, Yingying Xu

These results are considered to be exact in the thermodynamic limit on locally tree-like networks, such as regular random or Erdős–Rényi graphs.

Approximate matrix completion based on cavity method

no code implementations • 29 Jun 2019 • Chihiro Noguchi, Yoshiyuki Kabashima

In order to solve large matrix completion problems with practical computational cost, an approximate approach based on matrix factorization has been widely used.

Matrix Completion • Scheduling

Replicated Vector Approximate Message Passing For Resampling Problem

no code implementations • 23 May 2019 • Takashi Takahashi, Yoshiyuki Kabashima

Resampling techniques are widely used in statistical inference and ensemble learning, in which estimators' statistical properties are essential.

Ensemble Learning • Variable Selection

High-dimensional structure learning of binary pairwise Markov networks: A comparative numerical study

no code implementations • 14 Jan 2019 • Johan Pensar, Yingying Xu, Santeri Puranen, Maiju Pesonen, Yoshiyuki Kabashima, Jukka Corander

Learning the undirected graph structure of a Markov network from data is a problem that has received a lot of attention during the last few decades.

Statistical mechanical analysis of sparse linear regression as a variable selection problem

no code implementations • 29 May 2018 • Tomoyuki Obuchi, Yoshinori Nakanishi-Ohno, Masato Okada, Yoshiyuki Kabashima

The analysis proceeds by evaluating the entropy: the exponential rate of the number of combinations of variables that yield a specific fit error on the given data, which are assumed to be generated by a linear process using the design matrix.

regression • Variable Selection
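In sketch form, the entropy mentioned above is the exponential rate of a counting problem (the symbol $\mathcal{N}(\epsilon)$ for the number of variable combinations achieving fit error $\epsilon$ is our notation, not necessarily the paper's):

$$S(\epsilon) = \lim_{N \to \infty} \frac{1}{N} \log \mathcal{N}(\epsilon)$$

A positive entropy at small $\epsilon$ means exponentially many variable subsets fit the data almost equally well, which is precisely what makes variable selection hard.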

Semi-Analytic Resampling in Lasso

1 code implementation • 28 Feb 2018 • Tomoyuki Obuchi, Yoshiyuki Kabashima

An approximate method is developed for conducting resampling in Lasso, the $\ell_1$-penalized linear regression, in a semi-analytic manner: the average over the resampled datasets is computed directly, without repeated numerical sampling. This enables an inference free of the statistical fluctuations due to the finiteness of sampling, as well as a significant reduction in computational time.

Variable Selection

Accelerating Cross-Validation in Multinomial Logistic Regression with $\ell_1$-Regularization

2 code implementations • 15 Nov 2017 • Tomoyuki Obuchi, Yoshiyuki Kabashima

We develop an approximate formula for evaluating a cross-validation estimator of predictive likelihood for multinomial logistic regression regularized by an $\ell_1$-norm.

BIG-bench Machine Learning • regression

Approximate cross-validation formula for Bayesian linear regression

no code implementations • 25 Oct 2016 • Yoshiyuki Kabashima, Tomoyuki Obuchi, Makoto Uemura

Cross-validation (CV) is a technique for evaluating the predictive ability of statistical models/learning systems on the basis of a given data set.

regression
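The expensive baseline that such approximate-CV formulas aim to replace is the literal procedure: refit the model once per held-out fold. A minimal leave-one-out sketch for ridge-regularized linear regression (a generic stand-in with arbitrary sizes and regularization, not the paper's approximate formula):

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy regression data.
M, N, lam = 50, 10, 1.0
X = rng.normal(size=(M, N))
w_true = rng.normal(size=N)
y = X @ w_true + 0.1 * rng.normal(size=M)

# Literal leave-one-out CV: one full ridge refit per held-out sample.
errors = []
for i in range(M):
    keep = np.arange(M) != i
    Xi, yi = X[keep], y[keep]
    w = np.linalg.solve(Xi.T @ Xi + lam * np.eye(N), Xi.T @ yi)
    errors.append((y[i] - X[i] @ w) ** 2)

loo_cv = float(np.mean(errors))
```

The cost scales as `M` refits; approximate-CV formulas recover essentially the same number from a single fit, which is the speedup the paper pursues for $\ell_1$-regularized Bayesian linear regression.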

Comparative analysis on the selection of number of clusters in community detection

2 code implementations • 24 Jun 2016 • Tatsuro Kawamoto, Yoshiyuki Kabashima

We conduct a comparative analysis on various estimates of the number of clusters in community detection.

Social and Information Networks • Physics and Society

Cross-validation estimate of the number of clusters in a network

1 code implementation • 25 May 2016 • Tatsuro Kawamoto, Yoshiyuki Kabashima

Network science investigates methodologies that summarise relational data to obtain better interpretability.

Social and Information Networks • Physics and Society

Origin of the computational hardness for learning with binary synapses

no code implementations • 8 Aug 2014 • Haiping Huang, Yoshiyuki Kabashima

Supervised learning in a binary perceptron is able to classify an extensive number of random patterns by a proper assignment of binary synaptic weights.

Phase transitions and sample complexity in Bayes-optimal matrix factorization

no code implementations • 6 Feb 2014 • Yoshiyuki Kabashima, Florent Krzakala, Marc Mézard, Ayaka Sakata, Lenka Zdeborová

We use the tools of statistical mechanics, the cavity and replica methods, to analyze the achievability and computational tractability of inference in the Bayes-optimal setting, which amounts to assuming that the two matrices have random independent elements generated from a known distribution and that this information is available to the inference algorithm.

Dictionary Learning • Low-Rank Matrix Completion • +1

Entropy landscape of solutions in the binary perceptron problem

no code implementations • 10 Apr 2013 • Haiping Huang, K. Y. Michael Wong, Yoshiyuki Kabashima

The geometrical organization is elucidated by the entropy landscape computed from a reference configuration, and for solution pairs separated by a given Hamming distance in the solution space.

Sample Complexity of Bayesian Optimal Dictionary Learning

no code implementations • 26 Jan 2013 • Ayaka Sakata, Yoshiyuki Kabashima

We consider a learning problem of identifying a dictionary matrix $D$ ($M \times N$) from a sample set of $M$-dimensional vectors $Y = N^{-1/2} D X$, where $X$ is a sparse matrix ($N \times P$) in which the density of non-zero entries is $0 < \rho < 1$.

Dictionary Learning
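The planted model described in the abstract can be generated directly (the sizes $M$, $N$, $P$ and the density $\rho$ below are illustrative; the Gaussian choice for the nonzero entries is an assumption):

```python
import numpy as np

rng = np.random.default_rng(6)

# Planted dictionary-learning model: Y = N^{-1/2} D X, with a dense
# dictionary D (M x N) and a sparse code X (N x P) whose entries are
# nonzero with density rho.
M, N, P, rho = 20, 40, 100, 0.2
D = rng.normal(size=(M, N))
X = rng.normal(size=(N, P)) * (rng.random((N, P)) < rho)
Y = D @ X / np.sqrt(N)
```

The learning problem is to recover `D` (up to permutation and sign of its columns) from `Y` alone, and the paper's question is how large `P` must be for this to be possible in the Bayes-optimal setting.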
