no code implementations • 16 Oct 2021 • Xiangming Meng, Tomoyuki Obuchi, Yoshiyuki Kabashima
Moreover, we provide a rigorous proof of the model selection consistency of Lasso with post-thresholding for general tree-like graphs in the paramagnetic phase without further assumptions on the dependency and incoherence conditions.
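The post-thresholding pipeline can be illustrated with a minimal sketch, assuming a tiny Ising chain whose sizes, couplings, regularization strength, and cutoff are all invented for illustration (they are not the paper's settings): estimate one spin's neighborhood by $\ell_1$-regularized pseudo-likelihood (a logistic-regression-type estimator) and then threshold the fitted couplings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 4-spin Ising chain with coupling J on edges (0-1), (1-2), (2-3),
# so the true neighbor set of spin 0 is {1}.
N, J = 4, 0.6
edges = [(0, 1), (1, 2), (2, 3)]

# Exact sampling by enumerating all 2^N configurations of +-1 spins.
states = np.array([[1 if (s >> i) & 1 else -1 for i in range(N)]
                   for s in range(2 ** N)])
energy = np.array([-J * sum(x[i] * x[j] for i, j in edges) for x in states])
p = np.exp(-energy)
p /= p.sum()
X = states[rng.choice(len(states), size=4000, p=p)]

# l1-regularized pseudo-likelihood for spin 0, solved by proximal gradient (ISTA):
# minimize -E[log P(s_0 | rest)] + lam * ||w||_1, with P(s_0 | rest) propto exp(s_0 * w . s_rest).
y, F = X[:, 0], X[:, 1:]
lam, eta = 0.02, 0.1
w = np.zeros(N - 1)
for _ in range(2000):
    grad = -(F * (y - np.tanh(F @ w))[:, None]).mean(axis=0)
    w -= eta * grad
    w = np.sign(w) * np.maximum(np.abs(w) - eta * lam, 0.0)  # soft threshold

# Post-thresholding: couplings above the cutoff are declared edges of spin 0.
neighbors = {j + 1 for j in range(N - 1) if abs(w[j]) > 0.1}
print(neighbors)
```

On a chain, the conditional law of spin 0 depends only on spin 1, so the fitted coupling to spin 1 lands near $J$ while the others are shrunk to zero, and thresholding recovers the true neighborhood.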
no code implementations • NeurIPS 2021 • Xiangming Meng, Tomoyuki Obuchi, Yoshiyuki Kabashima
We theoretically analyze the typical learning performance of $\ell_{1}$-regularized linear regression ($\ell_1$-LinR) for Ising model selection using the replica method from statistical mechanics.
no code implementations • 19 Aug 2020 • Xiangming Meng, Tomoyuki Obuchi, Yoshiyuki Kabashima
Further, to access the underdetermined region $M < N$, we examine the effect of the $\ell_2$ regularization, and find that biases appear in all the coupling estimates, preventing the perfect identification of the network structure.
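The shrinkage bias of $\ell_2$ regularization is easy to see in a toy computation; the dimensions and regularization strength below are illustrative assumptions, not the paper's. In the underdetermined regime $M < N$, the ridge estimate shrinks every coefficient toward zero, so even noiseless nonzero couplings are systematically underestimated.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative underdetermined problem: fewer samples (M) than parameters (N).
M, N, lam = 50, 100, 0.5
w_true = np.zeros(N)
w_true[:5] = 1.0
X = rng.standard_normal((M, N)) / np.sqrt(N)
y = X @ w_true                                 # noiseless observations

# Closed-form ridge estimate: w = (X^T X + lam I)^{-1} X^T y.
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
bias = 1.0 - w_ridge[:5].mean()                # shrinkage of the active couplings
print(bias)
```

The active couplings come out well below their true value of 1, so a fixed threshold cannot cleanly separate true edges from noise, which is the obstacle described above.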
no code implementations • 7 Aug 2020 • Kao Hayashi, Tomoyuki Obuchi, Yoshiyuki Kabashima
We propose a Monte-Carlo-based method for reconstructing sparse signals in the formulation of sparse linear regression in a high-dimensional setting.
no code implementations • 25 Dec 2019 • Alia Abbara, Yoshiyuki Kabashima, Tomoyuki Obuchi, Yingying Xu
These results are considered to be exact in the thermodynamic limit on locally tree-like networks, such as regular random or Erdős–Rényi graphs.
no code implementations • 14 Jun 2019 • Muneki Yasuda, Tomoyuki Obuchi
In this study, we consider an empirical Bayes method for Boltzmann machines and propose an algorithm for it.
1 code implementation • 27 Feb 2019 • Tomoyuki Obuchi, Ayaka Sakata
Second, we develop an approximate formula that efficiently computes the cross-validation error without actually conducting the cross-validation, which is also applicable to the non-i.i.d.
no code implementations • 20 Feb 2019 • Ayaka Sakata, Tomoyuki Obuchi
Part of the discrepancy is resolved by controlling the nonconvexity parameters to guide the AMP trajectory into the basin of attraction.
no code implementations • NeurIPS 2018 • Tatsuro Kawamoto, Masashi Tsubaki, Tomoyuki Obuchi
A theoretical performance analysis of the graph neural network (GNN) is presented.
no code implementations • 29 May 2018 • Tomoyuki Obuchi, Yoshinori Nakanishi-Ohno, Masato Okada, Yoshiyuki Kabashima
The analysis is conducted by evaluating the entropy: the exponential rate of the number of combinations of variables that give a specific fit error to the given data, which are assumed to be generated by a linear process using the design matrix.
1 code implementation • 28 Feb 2018 • Tomoyuki Obuchi, Yoshiyuki Kabashima
An approximate method for conducting resampling in Lasso, the $\ell_1$-penalized linear regression, in a semi-analytic manner is developed. The average over resampled datasets is computed directly, without repeated numerical sampling, enabling inference free of the statistical fluctuations due to finite sampling as well as a significant reduction in computational time.
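For reference, the brute-force bootstrap that the semi-analytic method replaces looks like the sketch below (problem sizes and the regularization strength are illustrative assumptions): refit Lasso on many resampled datasets and average the estimates. The paper computes such averages directly, without the repeated fits.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical sparse regression instance.
M, N, lam = 80, 40, 0.1
w_true = np.zeros(N)
w_true[:3] = 1.0
X = rng.standard_normal((M, N))
y = X @ w_true + 0.1 * rng.standard_normal(M)

def lasso_ista(X, y, lam, eta=0.15, iters=1500):
    """l1-penalized least squares, (1/2M)||y - Xw||^2 + lam*||w||_1, via ISTA."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        w -= eta * X.T @ (X @ w - y) / len(y)
        w = np.sign(w) * np.maximum(np.abs(w) - eta * lam, 0.0)  # soft threshold
    return w

# Naive bootstrap: resample rows with replacement, refit, average.
B = 50
boot = np.empty((B, N))
for b in range(B):
    idx = rng.integers(0, M, size=M)
    boot[b] = lasso_ista(X[idx], y[idx], lam)
w_mean = boot.mean(axis=0)   # bootstrap average of the Lasso estimate
```

The cost of this baseline scales linearly with the number of resamples $B$; the semi-analytic approach removes that factor along with the sampling noise of finite $B$.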
2 code implementations • 15 Nov 2017 • Tomoyuki Obuchi, Yoshiyuki Kabashima
We develop an approximate formula for evaluating a cross-validation estimator of predictive likelihood for multinomial logistic regression regularized by an $\ell_1$-norm.
no code implementations • 25 Oct 2016 • Yoshiyuki Kabashima, Tomoyuki Obuchi, Makoto Uemura
Cross-validation (CV) is a technique for evaluating the predictive ability of statistical models/learning systems based on a given data set.
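A minimal K-fold CV sketch makes the idea concrete (a generic illustration; the sizes, the number of folds, and the least-squares fit are assumptions, not this paper's setting): hold out each fold in turn, fit on the remaining folds, and average the held-out error as an estimate of predictive performance.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical regression data.
M, N, K = 60, 10, 5
w_true = rng.standard_normal(N)
X = rng.standard_normal((M, N))
y = X @ w_true + 0.5 * rng.standard_normal(M)

# K-fold cross-validation with a plain least-squares fit.
folds = np.array_split(rng.permutation(M), K)
errs = []
for k in range(K):
    test = folds[k]
    train = np.concatenate([folds[j] for j in range(K) if j != k])
    w = np.linalg.lstsq(X[train], y[train], rcond=None)[0]   # fit on K-1 folds
    errs.append(np.mean((y[test] - X[test] @ w) ** 2))       # held-out error
cv_error = float(np.mean(errs))
print(cv_error)
```

Each fold requires a full refit, which is what makes exact CV expensive for large models and motivates approximate CV formulas like the one studied here.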
no code implementations • 16 Dec 2014 • Tomoyuki Obuchi, Hirokazu Koma, Muneki Yasuda
Prior distributions of binarized natural images are learned by using a Boltzmann machine.
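A toy sketch of Boltzmann machine learning, under loudly labeled assumptions (tiny $N$, invented couplings and step size; real image patches require approximate inference rather than enumeration): fit a fully visible Boltzmann machine to $\pm 1$ data by gradient ascent on the log-likelihood, with the model moments computed by exact enumeration over all $2^N$ states.

```python
import numpy as np

rng = np.random.default_rng(4)

N = 4
# All 2^N configurations of +-1 spins.
states = np.array([[1 if (s >> i) & 1 else -1 for i in range(N)]
                   for s in range(2 ** N)])

def model_probs(J, h):
    """Boltzmann distribution p(s) propto exp(0.5 s^T J s + h.s), by enumeration."""
    E = -0.5 * np.einsum('si,ij,sj->s', states, J, states) - states @ h
    p = np.exp(-E)
    return p / p.sum()

# Synthetic "data" drawn from a BM with known couplings.
J_true = np.zeros((N, N))
J_true[0, 1] = J_true[1, 0] = 0.5
J_true[2, 3] = J_true[3, 2] = -0.5
data = states[rng.choice(2 ** N, size=5000, p=model_probs(J_true, np.zeros(N)))]

m_data = data.mean(axis=0)            # empirical magnetizations
C_data = data.T @ data / len(data)    # empirical correlations

# Moment matching: dL/dJ_ij = <s_i s_j>_data - <s_i s_j>_model, likewise for h.
J, h = np.zeros((N, N)), np.zeros(N)
for _ in range(500):
    p = model_probs(J, h)
    C_model = (states * p[:, None]).T @ states
    J += 0.1 * (C_data - C_model)
    np.fill_diagonal(J, 0.0)          # no self-couplings
    h += 0.1 * (m_data - p @ states)
```

Because the log-likelihood is concave in $(J, h)$, this gradient ascent converges, and the learned couplings approach the generating ones up to sampling error.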