no code implementations • 27 Dec 2023 • Makiko Konoshima, Hirotaka Tamura, Yoshiyuki Kabashima
The 0/1 matrix factorization defines matrix products using logical AND and OR as product-sum operators, revealing the factors influencing various decision processes.
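The AND/OR product-sum can be written in a few lines; this is a toy illustration of the 0/1 matrix product only, not the paper's factorization algorithm:

```python
# 0/1 matrix product: multiplication is replaced by logical AND and
# summation by logical OR, so (A.B)[i][j] = OR_k (A[i][k] AND B[k][j]).
def bool_matmul(A, B):
    n, m, p = len(A), len(B), len(B[0])
    return [[int(any(A[i][k] and B[k][j] for k in range(m)))
             for j in range(p)] for i in range(n)]

A = [[1, 0], [0, 1]]
B = [[0, 1], [1, 1]]
print(bool_matmul(A, B))  # [[0, 1], [1, 1]]
```

Note that, unlike the ordinary product, the OR "sum" saturates at 1, which is what makes the factorization problem combinatorial.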
no code implementations • 30 Jun 2023 • Siqi Na, Yoshiyuki Kabashima, Takashi Takahashi, Tianyao Huang, Yimin Liu, Xiqin Wang
Based on this estimator, we construct a detector, termed the debiased weighted LASSO detector (DWLD), for CS radar systems and prove its advantages.
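For context, a plain LASSO estimate can be computed by iterative soft-thresholding (ISTA); this is a generic baseline sketch only, not the debiased weighted LASSO detector (DWLD) constructed in the paper:

```python
# Generic LASSO via ISTA: minimize 0.5*||y - A x||^2 + lam*||x||_1
# by alternating a gradient step with soft-thresholding (shrinkage).
def soft_threshold(z, t):
    return (z - t) if z > t else (z + t) if z < -t else 0.0

def ista(A, y, lam, step, iters=500):
    n, p = len(A), len(A[0])
    x = [0.0] * p
    for _ in range(iters):
        # residual r = A x - y, gradient g = A^T r
        r = [sum(A[i][j] * x[j] for j in range(p)) - y[i] for i in range(n)]
        g = [sum(A[i][j] * r[i] for i in range(n)) for j in range(p)]
        x = [soft_threshold(x[j] - step * g[j], step * lam) for j in range(p)]
    return x
```

With an identity design matrix the result reduces to soft-thresholding of the observations, which gives a quick sanity check of the solver.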
no code implementations • 25 Feb 2023 • Koki Okajima, Xiangming Meng, Takashi Takahashi, Yoshiyuki Kabashima
The obtained bound for perfect support recovery is a generalization of that given in previous literature, which only considers the case of Gaussian noise and diverging $d$.
2 code implementations • 2 Feb 2023 • Xiangming Meng, Yoshiyuki Kabashima
In practical compressed sensing (CS), the obtained measurements typically necessitate quantization to a limited number of bits prior to transmission or storage.
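A toy uniform quantizer makes the setting concrete; the bin layout and range below are illustrative assumptions, not the quantization scheme used in the paper:

```python
# Map each real-valued measurement to the midpoint of one of 2**bits
# uniform bins on [lo, hi], saturating values outside the range.
def quantize(y, bits, lo=-1.0, hi=1.0):
    levels = 2 ** bits
    width = (hi - lo) / levels
    out = []
    for v in y:
        idx = min(max(int((v - lo) / width), 0), levels - 1)
        out.append(lo + (idx + 0.5) * width)
    return out

print(quantize([0.0, 0.9, -0.9], 2))  # [0.25, 0.75, -0.75]
```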
2 code implementations • 20 Nov 2022 • Xiangming Meng, Yoshiyuki Kabashima
We consider the ubiquitous linear inverse problems with additive Gaussian noise and propose an unsupervised sampling approach called diffusion model based posterior sampling (DMPS) to reconstruct the unknown signal from noisy linear measurements.
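The posterior-sampling idea can be sketched with a one-dimensional toy in which a standard Gaussian prior score stands in for the learned diffusion model; this is a hedged illustration of Langevin-style posterior sampling only, not the DMPS algorithm itself:

```python
import math
import random

# Langevin dynamics targeting the posterior of x given y = x + noise:
# the drift is the sum of the prior score (-x for a standard Gaussian
# stand-in) and the likelihood score (y - x)/sigma^2.
def langevin_posterior_samples(y, sigma, step=0.01, n_steps=20000, seed=0):
    rng = random.Random(seed)
    x, samples = 0.0, []
    for _ in range(n_steps):
        score = -x + (y - x) / sigma ** 2
        x += step * score + math.sqrt(2 * step) * rng.gauss(0, 1)
        samples.append(x)
    return samples

samples = langevin_posterior_samples(y=2.0, sigma=1.0)
mean = sum(samples) / len(samples)  # analytic posterior mean is 1.0
```

In DMPS the prior score would come from a pretrained diffusion model rather than the closed-form Gaussian used here.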
3 code implementations • 2 Nov 2022 • Xiangming Meng, Yoshiyuki Kabashima
We consider the general problem of recovering a high-dimensional signal from noisy quantized measurements.
no code implementations • 30 Sep 2022 • Siqi Na, Tianyao Huang, Yimin Liu, Takashi Takahashi, Yoshiyuki Kabashima, Xiqin Wang
Such a detector can analytically set its detection threshold for a given false-alarm rate, which is not possible with the conventional CS detector, and its detection performance is proved to be better than that of the traditional LASSO detector.
no code implementations • 16 Oct 2021 • Xiangming Meng, Tomoyuki Obuchi, Yoshiyuki Kabashima
Moreover, we provide a rigorous proof of the model selection consistency of Lasso with post-thresholding for general tree-like graphs in the paramagnetic phase without further assumptions on the dependency and incoherence conditions.
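Post-thresholding itself is a simple operation; a minimal sketch (illustrative only, not the paper's consistency proof machinery):

```python
# Post-thresholding after Lasso: discard coefficients whose magnitude
# falls below a threshold tau and report the surviving indices as the
# selected neighbourhood.
def post_threshold(coefs, tau):
    return [j for j, c in enumerate(coefs) if abs(c) > tau]

print(post_threshold([0.8, -0.05, 0.0, -0.6], 0.1))  # [0, 2]... no: [0, 3]
```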
no code implementations • 1 May 2021 • Koki Okajima, Yoshiyuki Kabashima
We develop a message-passing algorithm for noisy matrix completion problems based on matrix factorization.
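For contrast, a naive rank-1 alternating-least-squares (ALS) completion is shown below; ALS is a common matrix-factorization baseline, assumed here for illustration, and is not the paper's message-passing algorithm:

```python
# Rank-1 ALS matrix completion: fit observed entries of an n x m matrix
# by u v^T, alternately solving the scalar least-squares update for each
# u[i] and v[j], then fill in the missing entries with u[i]*v[j].
def als_rank1(obs, n, m, iters=100):
    """obs: dict {(row, col): value} of observed entries."""
    u, v = [1.0] * n, [1.0] * m
    for _ in range(iters):
        for i in range(n):
            num = sum(val * v[c] for (r, c), val in obs.items() if r == i)
            den = sum(v[c] ** 2 for (r, c) in obs if r == i)
            if den:
                u[i] = num / den
        for j in range(m):
            num = sum(val * u[r] for (r, c), val in obs.items() if c == j)
            den = sum(u[r] ** 2 for (r, c) in obs if c == j)
            if den:
                v[j] = num / den
    return [[u[i] * v[j] for j in range(m)] for i in range(n)]
```

For a consistent rank-1 matrix with one entry hidden, ALS recovers the missing value from the observed ones.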
no code implementations • NeurIPS 2021 • Xiangming Meng, Tomoyuki Obuchi, Yoshiyuki Kabashima
We theoretically analyze the typical learning performance of $\ell_{1}$-regularized linear regression ($\ell_1$-LinR) for Ising model selection using the replica method from statistical mechanics.
no code implementations • 19 Aug 2020 • Xiangming Meng, Tomoyuki Obuchi, Yoshiyuki Kabashima
Further, to access the underdetermined region $M < N$, we examine the effect of the $\ell_2$ regularization, and find that biases appear in all the coupling estimates, preventing the perfect identification of the network structure.
no code implementations • 7 Aug 2020 • Kao Hayashi, Tomoyuki Obuchi, Yoshiyuki Kabashima
We propose a Monte-Carlo-based method for reconstructing sparse signals in the formulation of sparse linear regression in a high-dimensional setting.
no code implementations • 19 Mar 2020 • Takashi Takahashi, Yoshiyuki Kabashima
We consider the variable selection problem of generalized linear models (GLMs).
no code implementations • 25 Dec 2019 • Alia Abbara, Yoshiyuki Kabashima, Tomoyuki Obuchi, Yingying Xu
These results are considered to be exact in the thermodynamic limit on locally tree-like networks, such as regular random or Erdős–Rényi graphs.
no code implementations • 29 Jun 2019 • Chihiro Noguchi, Yoshiyuki Kabashima
To solve large matrix completion problems at a practical computational cost, an approximate approach based on matrix factorization has been widely used.
no code implementations • 23 May 2019 • Takashi Takahashi, Yoshiyuki Kabashima
Resampling techniques are widely used in statistical inference and ensemble learning, where the statistical properties of estimators are essential.
no code implementations • 14 Jan 2019 • Johan Pensar, Yingying Xu, Santeri Puranen, Maiju Pesonen, Yoshiyuki Kabashima, Jukka Corander
Learning the undirected graph structure of a Markov network from data is a problem that has received a lot of attention during the last few decades.
no code implementations • 29 May 2018 • Tomoyuki Obuchi, Yoshinori Nakanishi-Ohno, Masato Okada, Yoshiyuki Kabashima
The analysis is conducted by evaluating the entropy, i.e., the exponential growth rate of the number of combinations of variables that yield a specific fit error to the given data, which is assumed to be generated from a linear process using the design matrix.
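The counting behind this entropy can be illustrated by brute force for single-column supports; this is a toy stand-in for the asymptotic analysis, not the paper's method:

```python
# Count how many single columns a of the design matrix can fit y
# (near-)exactly: for each column, solve the scalar least-squares
# problem min_x ||y - a x||^2 and check whether the residual vanishes.
def count_good_columns(A_cols, y, eps=1e-9):
    count = 0
    for a in A_cols:
        aa = sum(ai * ai for ai in a)
        if aa == 0:
            continue
        ay = sum(ai * yi for ai, yi in zip(a, y))
        x = ay / aa
        err = sum((yi - x * ai) ** 2 for ai, yi in zip(a, y))
        if err < eps:
            count += 1
    return count

# Two of the three columns are proportional to y and fit it exactly.
print(count_good_columns([[1, 0], [0, 1], [2, 0]], [3, 0]))  # 2
```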
1 code implementation • 28 Feb 2018 • Tomoyuki Obuchi, Yoshiyuki Kabashima
An approximate method is developed for conducting resampling in Lasso, the $\ell_1$-penalized linear regression, in a semi-analytic manner: the average over resampled datasets is computed directly, without repeated numerical sampling. This enables an inference free of the statistical fluctuations due to the finiteness of sampling, as well as a significant reduction in computational time.
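The naive numerical resampling that such a semi-analytic formula avoids looks like this in a one-dimensional toy, where the Lasso estimate reduces to a soft-thresholded sample mean; the setup is illustrative only, not the paper's method:

```python
import random

def soft_threshold(z, t):
    return (z - t) if z > t else (z + t) if z < -t else 0.0

# Bootstrap resampling of a 1-D Lasso estimator: repeatedly resample the
# data with replacement, recompute the estimate, and average the results.
def bootstrap_lasso_1d(y, lam, n_boot=2000, seed=0):
    rng = random.Random(seed)
    ests = []
    for _ in range(n_boot):
        sample = [rng.choice(y) for _ in y]
        ests.append(soft_threshold(sum(sample) / len(sample), lam))
    return sum(ests) / n_boot  # bootstrap mean of the estimator
```

The repeated resampling loop is exactly the cost the semi-analytic average removes.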
2 code implementations • 15 Nov 2017 • Tomoyuki Obuchi, Yoshiyuki Kabashima
We develop an approximate formula for evaluating a cross-validation estimator of predictive likelihood for multinomial logistic regression regularized by an $\ell_1$-norm.
no code implementations • 25 Oct 2016 • Yoshiyuki Kabashima, Tomoyuki Obuchi, Makoto Uemura
Cross-validation (CV) is a technique for evaluating the predictive ability of statistical models/learning systems based on a given data set.
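A from-scratch k-fold CV loop for a trivial mean predictor illustrates the idea described above (not the paper's approximate formula):

```python
# k-fold cross-validation for a mean predictor: split the data into k
# interleaved folds, predict each held-out fold by the mean of the rest,
# and report the average held-out squared error.
def k_fold_cv_mse(data, k=4):
    fold = [data[i::k] for i in range(k)]
    total, count = 0.0, 0
    for i in range(k):
        train = [x for j in range(k) if j != i for x in fold[j]]
        mu = sum(train) / len(train)
        for x in fold[i]:
            total += (x - mu) ** 2
            count += 1
    return total / count
```

The same loop structure applies to any estimator; only the fit-and-predict step changes.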
2 code implementations • 24 Jun 2016 • Tatsuro Kawamoto, Yoshiyuki Kabashima
We conduct a comparative analysis on various estimates of the number of clusters in community detection.
Social and Information Networks • Physics and Society
1 code implementation • 25 May 2016 • Tatsuro Kawamoto, Yoshiyuki Kabashima
Network science investigates methodologies that summarise relational data to obtain better interpretability.
Social and Information Networks • Physics and Society
no code implementations • 8 Aug 2014 • Haiping Huang, Yoshiyuki Kabashima
Supervised learning in a binary perceptron is able to classify an extensive number of random patterns by a proper assignment of binary synaptic weights.
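For very small systems, the existence of such a binary weight assignment can be checked by brute force; this illustrates the setting only, not the paper's statistical-mechanics analysis:

```python
from itertools import product

# Exhaustively search +/-1 weight vectors w such that sign(w . x)
# matches the label of every pattern; return the first solution found.
def find_binary_weights(patterns, labels):
    n = len(patterns[0])
    for w in product((-1, 1), repeat=n):
        if all((sum(wi * xi for wi, xi in zip(w, x)) > 0) == (s > 0)
               for x, s in zip(patterns, labels)):
            return list(w)
    return None  # no binary weight vector classifies all patterns
```

The search space grows as $2^N$, which is why the typical-case capacity is analyzed with statistical-mechanics tools rather than enumeration.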
no code implementations • 6 Feb 2014 • Yoshiyuki Kabashima, Florent Krzakala, Marc Mézard, Ayaka Sakata, Lenka Zdeborová
We use the tools of statistical mechanics - the cavity and replica methods - to analyze the achievability and computational tractability of these inference problems in the setting of Bayes-optimal inference. This amounts to assuming that the two matrices have random independent elements generated from some known distribution, and that this information is available to the inference algorithm.
no code implementations • 10 Apr 2013 • Haiping Huang, K. Y. Michael Wong, Yoshiyuki Kabashima
The geometrical organization is elucidated through the entropy landscape, both as measured from a reference configuration and for solution pairs separated by a given Hamming distance in the solution space.
no code implementations • 26 Jan 2013 • Ayaka Sakata, Yoshiyuki Kabashima
We consider the learning problem of identifying a dictionary matrix $D$ ($M \times N$) from a sample set of $M$-dimensional vectors $Y = N^{-1/2} D X$, where $X$ is a sparse matrix ($N \times P$) whose density of non-zero entries is $0 < \rho < 1$.