Search Results for author: Toshiyuki Tanaka

Found 13 papers, 0 papers with code

Universality of reservoir systems with recurrent neural networks

no code implementations4 Mar 2024 Hiroki Yasumoto, Toshiyuki Tanaka

The approximation capability of reservoir systems whose reservoir is a recurrent neural network (RNN) is discussed.
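
For illustration only, a reservoir system of the kind referred to above can be sketched as a fixed recurrent network driven by the input, followed by a trained linear readout. The echo-state-style construction below, with its dimensions, spectral-radius scaling, and ridge readout, is a generic assumption and not the specific system class analyzed in the paper.

```python
import numpy as np

# Minimal echo-state-style reservoir system (illustrative assumption,
# not the construction analyzed in the paper).
rng = np.random.default_rng(0)
n_in, n_res = 1, 100

W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))   # fixed input weights
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))     # scale spectral radius below 1

def run_reservoir(u):
    """Drive the RNN reservoir with input sequence u of shape (T, n_in); return states (T, n_res)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)              # recurrent reservoir update
        states.append(x.copy())
    return np.array(states)

# Only the linear readout is trained (ridge regression on reservoir states).
u = rng.uniform(-1, 1, size=(500, n_in))
y = np.sin(np.cumsum(u, axis=0))                     # toy target sequence
X = run_reservoir(u)
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
y_hat = X @ W_out
```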

Spatio-temporal reconstruction of substance dynamics using compressed sensing in multi-spectral magnetic resonance spectroscopic imaging

no code implementations1 Mar 2024 Utako Yamamoto, Hirohiko Imai, Kei Sano, Masayuki Ohzeki, Tetsuya Matsuda, Toshiyuki Tanaka

The objective of our study is to observe the dynamics of multiple substances in vivo with high temporal resolution from multi-spectral magnetic resonance spectroscopic imaging (MRSI) data.

Convergence Analysis of Blurring Mean Shift

no code implementations23 Feb 2024 Ryoya Yamasaki, Toshiyuki Tanaka

The blurring mean shift (BMS) algorithm, a variant of the mean shift algorithm, is a kernel-based iterative method for data clustering, where data points are clustered according to their convergent points via iterative blurring.
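
The blurring update can be illustrated with a generic Gaussian-kernel sketch; the kernel, bandwidth, and stopping rule below are assumptions for illustration, not the setting whose convergence is analyzed in the paper.

```python
import numpy as np

def blurring_mean_shift(X, bandwidth=1.0, n_iter=50, tol=1e-6):
    """Blurring mean shift: at each iteration, every point is replaced by the
    kernel-weighted mean of the *current* (already blurred) data set."""
    Z = X.copy()
    for _ in range(n_iter):
        d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
        W = np.exp(-d2 / (2.0 * bandwidth ** 2))               # Gaussian kernel weights
        Z_new = (W @ Z) / W.sum(axis=1, keepdims=True)         # blur: move points to weighted means
        if np.abs(Z_new - Z).max() < tol:
            break
        Z = Z_new
    return Z  # points that converged to the same location form one cluster

# Toy usage: two well-separated blobs collapse to two convergent points.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (30, 2)), rng.normal(5, 0.3, (30, 2))])
Z = blurring_mean_shift(X, bandwidth=1.0)
labels = np.unique(np.round(Z, 3), axis=0, return_inverse=True)[1]
```

Unlike plain mean shift, the data set itself is overwritten at every iteration, which is what makes the procedure "blurring".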

Negative-prompt Inversion: Fast Image Inversion for Editing with Text-guided Diffusion Models

no code implementations26 May 2023 Daiki Miyake, Akihiro Iohara, Yu Saito, Toshiyuki Tanaka

In image editing employing diffusion models, it is crucial to preserve the reconstruction quality of the original image while changing its style.

Text-based Image Editing

Convergence Analysis of Mean Shift

no code implementations15 May 2023 Ryoya Yamasaki, Toshiyuki Tanaka

The mean shift (MS) algorithm seeks a mode of the kernel density estimate (KDE).
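
For contrast with the blurring variant above, plain mean shift keeps the data set fixed and moves only a query point uphill on the KDE. The Gaussian-kernel sketch below is a generic illustration, not the exact conditions studied in the paper.

```python
import numpy as np

def mean_shift_mode(X, x0, bandwidth=1.0, n_iter=100, tol=1e-8):
    """Plain mean shift: iterate x <- kernel-weighted mean of the (fixed) data X,
    which ascends the Gaussian kernel density estimate toward a mode."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        w = np.exp(-((X - x) ** 2).sum(axis=1) / (2.0 * bandwidth ** 2))
        x_new = (w[:, None] * X).sum(axis=0) / w.sum()
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x

# Toy usage: starting near the data, the iteration converges to a mode of the KDE.
rng = np.random.default_rng(1)
X = rng.normal(loc=2.0, scale=0.5, size=(200, 2))
mode = mean_shift_mode(X, x0=[1.0, 1.0], bandwidth=0.5)
```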

Label Smoothing is Robustification against Model Misspecification

no code implementations15 May 2023 Ryoya Yamasaki, Toshiyuki Tanaka

For example, in binary classification, instead of the one-hot target $(1, 0)^\top$ used in conventional logistic regression (LR), LR with LS (LSLR) uses the smoothed target $(1-\frac{\alpha}{2},\frac{\alpha}{2})^\top$ with a smoothing level $\alpha\in(0, 1)$, which squeezes the values of the logit.

Binary Classification
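
The squeezing effect mentioned above can be seen numerically with a small sketch; the toy data, plain gradient-descent fit, and smoothing level $\alpha = 0.1$ below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

# Minimal sketch of LR vs. LR with label smoothing (LSLR) on a 1-D toy problem.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = (x + 0.3 * rng.normal(size=200) > 0).astype(float)   # hard labels in {0, 1}

def fit_logistic(x, y_target, lr=0.1, n_iter=2000):
    """Gradient descent on the cross-entropy between y_target and sigmoid(w*x + b)."""
    w, b = 0.0, 0.0
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(w * x + b)))
        g = p - y_target                      # gradient of cross-entropy w.r.t. the logit
        w -= lr * np.mean(g * x)
        b -= lr * np.mean(g)
    return w, b

alpha = 0.1
y_smooth = (1 - alpha) * y + alpha / 2        # smoothed target: 1 -> 1 - alpha/2, 0 -> alpha/2

w_lr, b_lr = fit_logistic(x, y)
w_ls, b_ls = fit_logistic(x, y_smooth)
# |w_ls| < |w_lr|: the smoothed targets squeeze the fitted logits toward zero.
```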

Optimal Kernel for Kernel-Based Modal Statistical Methods

no code implementations20 Apr 2023 Ryoya Yamasaki, Toshiyuki Tanaka

Kernel-based modal statistical methods include mode estimation, regression, and clustering.

Clustering, regression

Aggregated Multi-output Gaussian Processes with Knowledge Transfer Across Domains

no code implementations24 Jun 2022 Yusuke Tanaka, Toshiyuki Tanaka, Tomoharu Iwata, Takeshi Kurashima, Maya Okawa, Yasunori Akagi, Hiroyuki Toda

Since the supports may have various granularities depending on attributes (e.g., poverty rate and crime rate), modeling such data is not straightforward.

Attribute, Gaussian Processes, +2

Kernel Selection for Modal Linear Regression: Optimal Kernel and IRLS Algorithm

no code implementations30 Jan 2020 Ryoya Yamasaki, Toshiyuki Tanaka

Modal linear regression (MLR) is a method for obtaining a conditional mode predictor as a linear model.

regression
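
A generic IRLS-style sketch of modal linear regression with a Gaussian kernel follows; the kernel choice, bandwidth, and toy data are assumptions for illustration, not the optimal kernel or the exact IRLS variant derived in the paper.

```python
import numpy as np

def modal_linear_regression(X, y, bandwidth=0.5, n_iter=100, tol=1e-8):
    """IRLS-style sketch of modal linear regression: repeatedly solve a weighted
    least-squares problem whose weights are Gaussian kernel values of the current
    residuals, so the fit is drawn toward the conditional mode."""
    Xb = np.hstack([np.ones((len(X), 1)), X])          # add intercept column
    beta = np.linalg.lstsq(Xb, y, rcond=None)[0]       # ordinary least-squares start
    for _ in range(n_iter):
        r = y - Xb @ beta
        w = np.exp(-r ** 2 / (2.0 * bandwidth ** 2))   # kernel weights on residuals
        WX = Xb * w[:, None]
        beta_new = np.linalg.solve(Xb.T @ WX, WX.T @ y)
        if np.linalg.norm(beta_new - beta) < tol:
            break
        beta = beta_new
    return beta

# Toy usage: skewed noise pulls the mean away from the mode; MLR tracks the mode.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(300, 1))
y = 2.0 * X[:, 0] + rng.exponential(scale=0.5, size=300)   # conditional mode at 2x
beta_hat = modal_linear_regression(X, y, bandwidth=0.3)
```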

Spatially Aggregated Gaussian Processes with Multivariate Areal Outputs

no code implementations NeurIPS 2019 Yusuke Tanaka, Toshiyuki Tanaka, Tomoharu Iwata, Takeshi Kurashima, Maya Okawa, Yasunori Akagi, Hiroyuki Toda

By deriving the posterior GP, we can predict the data value at any location point by considering the spatial correlations and the dependences between areal data sets, simultaneously.

Gaussian Processes, Transfer Learning
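
As background for the posterior-GP prediction mentioned above, the sketch below shows standard point-level GP posterior prediction with an RBF kernel; it is a minimal generic illustration and does not implement the paper's multivariate areal-aggregation model.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between location sets A (n, d) and B (m, d)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def gp_posterior(X_train, y_train, X_test, noise=1e-2):
    """Posterior mean and variance of a zero-mean GP at new location points."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    alpha = np.linalg.solve(K, y_train)
    mean = K_s.T @ alpha
    cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)
    return mean, np.diag(cov)

# Toy usage: predict at arbitrary 2-D locations from scattered observations.
rng = np.random.default_rng(0)
X_train = rng.uniform(0, 1, size=(40, 2))
y_train = np.sin(3 * X_train[:, 0]) + 0.1 * rng.normal(size=40)
X_test = rng.uniform(0, 1, size=(10, 2))
mean, var = gp_posterior(X_train, y_train, X_test)
```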

Refining Coarse-grained Spatial Data using Auxiliary Spatial Data Sets with Various Granularities

no code implementations21 Sep 2018 Yusuke Tanaka, Tomoharu Iwata, Toshiyuki Tanaka, Takeshi Kurashima, Maya Okawa, Hiroyuki Toda

With the proposed model, a distribution for each auxiliary data set on the continuous space is modeled using a Gaussian process, where the representation of uncertainty considers the levels of granularity.

Gaussian Processes
