Search Results for author: Atsutoshi Kumagai

Found 18 papers, 3 papers with code

Analysis of Linear Mode Connectivity via Permutation-Based Weight Matching

no code implementations 6 Feb 2024 Akira Ito, Masanori Yamada, Atsutoshi Kumagai

This finding shows that permutations found by WM mainly align the directions of singular vectors associated with large singular values across models.

Linear Mode Connectivity
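The weight matching (WM) mentioned above can be illustrated with a small sketch: for a single layer, a permutation aligning the units of two models can be found with a linear assignment solver. This is a generic single-layer illustration under toy data, not the paper's full multi-layer procedure; the function name and data are illustrative.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def weight_matching_layer(w_a, w_b):
    """Find a row permutation of w_b that best aligns it with w_a by
    maximizing the total inner-product similarity (a simplified,
    single-layer version of permutation-based weight matching)."""
    cost = w_a @ w_b.T                          # unit-to-unit similarity
    row, col = linear_sum_assignment(cost, maximize=True)
    perm = np.zeros_like(cost)
    perm[row, col] = 1.0
    return perm @ w_b, col                      # permuted weights, indices

# toy check: w_b is a shuffled copy of w_a, so matching should undo the shuffle
rng = np.random.default_rng(0)
w_a = rng.normal(size=(4, 3))
w_b = w_a[rng.permutation(4)]
aligned, col = weight_matching_layer(w_a, w_b)
print(np.allclose(aligned, w_a))                # True
```

The assignment step is what makes WM fast: it aligns units without any retraining, which is why the singular-vector analysis in the paper can be applied directly to the matched weights.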

Meta-learning to Calibrate Gaussian Processes with Deep Kernels for Regression Uncertainty Estimation

no code implementations 13 Dec 2023 Tomoharu Iwata, Atsutoshi Kumagai

We propose a meta-learning method for calibrating deep kernel GPs for improving regression uncertainty estimation performance with a limited number of training data.

Gaussian Processes Meta-Learning +1

Meta-learning of semi-supervised learning from tasks with heterogeneous attribute spaces

no code implementations 9 Nov 2023 Tomoharu Iwata, Atsutoshi Kumagai

The proposed method embeds labeled and unlabeled data simultaneously in a task-specific space using a neural network, and the unlabeled data's labels are estimated by adapting classification or regression models in the embedding space.

Attribute Meta-Learning +1

Fast Regularized Discrete Optimal Transport with Group-Sparse Regularizers

no code implementations 14 Mar 2023 Yasutoshi Ida, Sekitoshi Kanai, Kazuki Adachi, Atsutoshi Kumagai, Yasuhiro Fujiwara

Regularized discrete optimal transport (OT) is a powerful tool to measure the distance between two discrete distributions that have been constructed from data samples on two different domains.

Unsupervised Domain Adaptation
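As background for regularized discrete OT, a minimal sketch using the common entropic regularizer and Sinkhorn iterations. Note this paper studies group-sparse regularizers, which differ from the entropic case shown here; the histograms and cost matrix are toy choices.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iter=200):
    """Entropic-regularized discrete OT between histograms a and b with
    cost matrix C, solved by Sinkhorn-Knopp scaling iterations."""
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]   # transport plan with marginals a, b
    return P, np.sum(P * C)           # plan and transport cost

a = np.array([0.5, 0.5])
b = np.array([0.5, 0.5])
C = np.array([[0.0, 1.0], [1.0, 0.0]])
P, cost = sinkhorn(a, b, C)
print(round(cost, 3))                 # 0.0 -- mass stays in place
```

Each Sinkhorn iteration rescales the rows and columns of `K` to match the target marginals, which is what makes regularized OT much faster than solving the unregularized linear program.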

Transfer Learning with Pre-trained Conditional Generative Models

no code implementations 27 Apr 2022 Shin'ya Yamaguchi, Sekitoshi Kanai, Atsutoshi Kumagai, Daiki Chijiwa, Hisashi Kashima

To transfer source knowledge without these assumptions, we propose a transfer learning method that uses deep generative models and is composed of the following two stages: pseudo pre-training (PP) and pseudo semi-supervised learning (P-SSL).

Knowledge Distillation Transfer Learning

Recurrent Neural Networks for Learning Long-term Temporal Dependencies with Reanalysis of Time Scale Representation

no code implementations 5 Nov 2021 Kentaro Ohno, Atsutoshi Kumagai

In this mechanism, the forget gate, which was introduced to control information flow into the hidden state of the RNN, has recently been re-interpreted as a representation of the time scale of the state, i.e., a measure of how long the RNN retains information about its inputs.
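The time-scale reading of the forget gate can be made concrete with a small sketch, assuming a constant gate value f in (0, 1) so that the state decays geometrically as f**t:

```python
import numpy as np

def retention_timescale(f):
    """Time scale implied by a constant forget-gate value f in (0, 1):
    the state decays like f**t, so information survives for roughly
    tau = -1 / log(f) steps (where the signal falls to 1/e)."""
    return -1.0 / np.log(f)

# a forget gate near 1 retains information for many more steps
for f in (0.5, 0.9, 0.99):
    print(f, round(retention_timescale(f), 1))   # 1.4, 9.5, 99.5
```

This is the sense in which the gate value "represents" a time scale: pushing f toward 1 lengthens the effective memory of the recurrent state.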

Meta-Learning for Relative Density-Ratio Estimation

no code implementations NeurIPS 2021 Atsutoshi Kumagai, Tomoharu Iwata, Yasuhiro Fujiwara

The closed-form solution enables fast and effective adaptation to a few instances, and its differentiability lets us train the model so that the expected test error for relative DRE is explicitly minimized after adapting to those instances.

Density Ratio Estimation Meta-Learning +1
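The closed-form solution mentioned above can be sketched with a generic RuLSIF-style relative density-ratio estimator, where a linear-in-parameters model is fit by regularized least squares in one linear solve. The Gaussian basis functions, bandwidth `sigma`, and regularizer `lam` below are illustrative choices, not the paper's meta-learned model.

```python
import numpy as np

def rulsif_fit(x_nu, x_de, centers, alpha=0.5, sigma=1.0, lam=0.1):
    """Closed-form relative density-ratio estimation (RuLSIF style):
    model r(x) = theta^T phi(x) with Gaussian bases and solve the
    regularized least-squares problem in closed form."""
    def phi(x):
        d = x[:, None] - centers[None, :]
        return np.exp(-d**2 / (2 * sigma**2))      # (n_samples, n_basis)
    P_nu, P_de = phi(x_nu), phi(x_de)
    H = (alpha * P_nu.T @ P_nu / len(x_nu)
         + (1 - alpha) * P_de.T @ P_de / len(x_de))
    h = P_nu.mean(axis=0)
    theta = np.linalg.solve(H + lam * np.eye(len(centers)), h)
    return lambda x: phi(x) @ theta                # estimated relative ratio

rng = np.random.default_rng(0)
x_nu = rng.normal(0.0, 1.0, 500)    # numerator samples, p(x)
x_de = rng.normal(0.5, 1.0, 500)    # denominator samples, p'(x)
r = rulsif_fit(x_nu, x_de, centers=np.linspace(-3, 3, 20))
# the relative ratio should be larger where p(x) dominates (x < 0 here)
print(r(np.array([-1.0]))[0] > r(np.array([2.0]))[0])
```

Because `theta` is obtained by one linear solve, the fit is both fast and differentiable with respect to upstream parameters, which is the property the meta-learning formulation exploits.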

Few-shot Learning for Unsupervised Feature Selection

no code implementations 2 Jul 2021 Atsutoshi Kumagai, Tomoharu Iwata, Yasuhiro Fujiwara

We propose a few-shot learning method for unsupervised feature selection, which is a task to select a subset of relevant features in unlabeled data.

feature selection Few-Shot Learning

Meta-learning One-class Classifiers with Eigenvalue Solvers for Supervised Anomaly Detection

no code implementations 1 Mar 2021 Tomoharu Iwata, Atsutoshi Kumagai

With a meta-learning framework, quick adaptation to each task and effective backpropagation through that adaptation are important, since the model is trained via the adaptation at every epoch.

Few-Shot Learning One-Class Classification +1

Adversarial Training Makes Weight Loss Landscape Sharper in Logistic Regression

no code implementations 5 Feb 2021 Masanori Yamada, Sekitoshi Kanai, Tomoharu Iwata, Tomokatsu Takahashi, Yuki Yamanaka, Hiroshi Takahashi, Atsutoshi Kumagai

We theoretically and experimentally confirm that the weight loss landscape becomes sharper as the magnitude of the noise of adversarial training increases in the linear logistic regression model.

regression
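The sharpness claim can be probed numerically in the linear logistic setting. This is a hedged sketch under simplifying assumptions: a worst-case L2 perturbation of size eps shrinks every margin by eps * ||w||, and sharpness is taken as the largest finite-difference Hessian eigenvalue at a fixed weight vector; the data and weights are toy choices, not the paper's experiments.

```python
import numpy as np

def adv_logistic_loss(w, X, y, eps):
    """Adversarial logistic loss for a linear model: the worst-case
    L2 perturbation of size eps reduces each margin by eps * ||w||."""
    margins = y * (X @ w) - eps * np.linalg.norm(w)
    return np.mean(np.log1p(np.exp(-margins)))

def sharpness(w, X, y, eps, h=1e-4):
    """Largest eigenvalue of the finite-difference Hessian at w."""
    d = len(w)
    H = np.zeros((d, d))
    for i in range(d):
        for j in range(d):
            e_i, e_j = np.eye(d)[i] * h, np.eye(d)[j] * h
            H[i, j] = (adv_logistic_loss(w + e_i + e_j, X, y, eps)
                       - adv_logistic_loss(w + e_i, X, y, eps)
                       - adv_logistic_loss(w + e_j, X, y, eps)
                       + adv_logistic_loss(w, X, y, eps)) / h**2
    return np.linalg.eigvalsh(H).max()

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0] + 0.1 * rng.normal(size=200))
w = np.array([1.0, 0.0])
print(sharpness(w, X, y, 0.0) < sharpness(w, X, y, 0.5))
```

Intuitively, larger eps pushes margins toward zero, where the logistic loss has its greatest curvature, which is one way to see why the landscape sharpens.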

Meta-learning from Tasks with Heterogeneous Attribute Spaces

1 code implementation NeurIPS 2020 Tomoharu Iwata, Atsutoshi Kumagai

We propose a heterogeneous meta-learning method that trains a model on tasks with various attribute spaces, such that it can solve unseen tasks whose attribute spaces are different from the training tasks given a few labeled instances.

Attribute Meta-Learning +1

Few-shot Learning for Time-series Forecasting

no code implementations 30 Sep 2020 Tomoharu Iwata, Atsutoshi Kumagai

Forecasting models are usually trained using time-series data in a specific target task.

Few-Shot Learning Time Series +1

Semi-supervised Anomaly Detection on Attributed Graphs

1 code implementation 27 Feb 2020 Atsutoshi Kumagai, Tomoharu Iwata, Yasuhiro Fujiwara

To learn node embeddings specialized for anomaly detection, in which there is a class imbalance due to the rarity of anomalies, the parameters of a GCN are trained to minimize the volume of a hypersphere that encloses the node embeddings of normal instances while embedding anomalous ones outside the hypersphere.

Attribute Semi-supervised Anomaly Detection +1
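A simplified stand-in for the hypersphere objective described above, assuming fixed embeddings and center: normal embeddings are pulled toward the center (shrinking the enclosing hypersphere) while labeled anomalies are pushed away via an inverse-distance term. The inverse-distance form follows common semi-supervised one-class formulations and is an assumption here, not necessarily the paper's exact GCN-based loss.

```python
import numpy as np

def hypersphere_loss(Z_normal, Z_anom, c, eta=1.0):
    """One-class objective in embedding space: normal embeddings close
    to center c lower the loss; anomalies lower it by moving away,
    since they enter through an inverse squared distance."""
    d_norm = np.sum((Z_normal - c)**2, axis=1)
    d_anom = np.sum((Z_anom - c)**2, axis=1)
    return d_norm.mean() + eta * (1.0 / d_anom).mean()

c = np.zeros(2)
Z_normal = np.array([[0.1, 0.0], [0.0, 0.2]])
near_anom = np.array([[0.3, 0.0]])    # anomaly embedded near the center
far_anom = np.array([[3.0, 0.0]])     # anomaly embedded far away
print(hypersphere_loss(Z_normal, near_anom, c) >
      hypersphere_loss(Z_normal, far_anom, c))   # True
```

This asymmetry is how the objective copes with class imbalance: the rare labeled anomalies only need to end up outside the hypersphere, not form a class of their own.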

Transfer Anomaly Detection by Inferring Latent Domain Representations

no code implementations NeurIPS 2019 Atsutoshi Kumagai, Tomoharu Iwata, Yasuhiro Fujiwara

The proposed method can infer the anomaly detectors for target domains without re-training by introducing the concept of latent domain vectors, which are latent representations of the domains and are used for inferring the anomaly detectors.

Anomaly Detection

Zero-shot Domain Adaptation without Domain Semantic Descriptors

no code implementations 9 Jul 2018 Atsutoshi Kumagai, Tomoharu Iwata

The proposed method can infer appropriate domain-specific models without any semantic descriptors by introducing the concept of latent domain vectors, which are latent representations for the domains and are used for inferring the models.

Domain Adaptation
