Search Results for author: Tomoharu Iwata

Found 56 papers, 9 papers with code

Aggregated Multi-output Gaussian Processes with Knowledge Transfer Across Domains

no code implementations24 Jun 2022 Yusuke Tanaka, Toshiyuki Tanaka, Tomoharu Iwata, Takeshi Kurashima, Maya Okawa, Yasunori Akagi, Hiroyuki Toda

Since the supports may have various granularities depending on attributes (e.g., poverty rate and crime rate), modeling such data is not straightforward.

Gaussian Processes Time Series +1

Training Deep Models to be Explained with Fewer Examples

no code implementations7 Dec 2021 Tomoharu Iwata, Yuya Yoshikawa

To improve interpretability, it is important to reduce the number of examples in the explanation model.

Evacuation Shelter Scheduling Problem

no code implementations26 Nov 2021 Hitoshi Shimizu, Hirohiko Suwa, Tomoharu Iwata, Akinori Fujino, Hiroshi Sawada, Keiichi Yasumoto

In this study, we formulate the "Evacuation Shelter Scheduling Problem," which allocates evacuees to shelters in such a way as to minimize the movement costs of the evacuees and the operation costs of the shelters.

End-to-End Learning of Deep Kernel Acquisition Functions for Bayesian Optimization

no code implementations1 Nov 2021 Tomoharu Iwata

In experiments using three text document datasets, we demonstrate that the proposed method achieves better BO performance than the existing methods.

Gaussian Processes Meta-Learning

Few-shot Learning for Unsupervised Feature Selection

no code implementations2 Jul 2021 Atsutoshi Kumagai, Tomoharu Iwata, Yasuhiro Fujiwara

We propose a few-shot learning method for unsupervised feature selection, which is a task to select a subset of relevant features in unlabeled data.

feature selection Few-Shot Learning

Meta-Learning for Relative Density-Ratio Estimation

no code implementations NeurIPS 2021 Atsutoshi Kumagai, Tomoharu Iwata, Yasuhiro Fujiwara

The closed-form solution enables fast and effective adaptation to a few instances, and its differentiability enables us to train our model such that the expected test error for relative DRE can be explicitly minimized after adapting to a few instances.

Density Ratio Estimation Meta-Learning +1
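The snippet does not define the estimand; for context, in the relative density-ratio estimation literature the α-relative density ratio between densities p and q is commonly defined as

```latex
r_{\alpha}(x) = \frac{p(x)}{\alpha\, p(x) + (1 - \alpha)\, q(x)}, \qquad 0 \le \alpha < 1,
```

which is bounded above by 1/α for α > 0, making it more stable to estimate than the plain ratio p(x)/q(x) (the α = 0 case).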

Meta-learning for Matrix Factorization without Shared Rows or Columns

no code implementations29 Jun 2021 Tomoharu Iwata

The neural network is meta-learned such that the expected imputation error is minimized when the factorized matrices are adapted to each matrix by a maximum a posteriori (MAP) estimation.

Imputation Meta-Learning
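As a minimal sketch of the MAP adaptation step described above (not the paper's actual meta-learned model; the factorization objective and names here are assumptions): adapting factor matrices to a given matrix under Gaussian priors reduces to alternating ridge solves.

```python
import numpy as np

def map_adapt(M, rank=2, lam=0.1, iters=50, seed=0):
    """Adapt factor matrices U, V to an observed matrix M by alternating
    MAP (ridge) updates for the objective
        ||M - U V||_F^2 + lam * (||U||_F^2 + ||V||_F^2),
    where the lam terms play the role of Gaussian priors on the factors."""
    rng = np.random.default_rng(seed)
    n, m = M.shape
    V = 0.1 * rng.standard_normal((rank, m))
    I = lam * np.eye(rank)
    for _ in range(iters):
        # Each update is the closed-form ridge-regression solution
        # with the other factor held fixed.
        U = M @ V.T @ np.linalg.inv(V @ V.T + I)
        V = np.linalg.inv(U.T @ U + I) @ U.T @ M
    return U, V
```

In the paper the adaptation is differentiated through during meta-training so that expected imputation error is minimized; the sketch shows only the inner MAP fit.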

Dynamic Hawkes Processes for Discovering Time-evolving Communities' States behind Diffusion Processes

no code implementations24 May 2021 Maya Okawa, Tomoharu Iwata, Yusuke Tanaka, Hiroyuki Toda, Takeshi Kurashima, Hisashi Kashima

Hawkes processes offer a central tool for modeling diffusion processes, in which the influence of past events is described by the triggering kernel.

Few-shot Learning for Topic Modeling

no code implementations19 Apr 2021 Tomoharu Iwata

The proposed method trains the neural networks such that the expected test likelihood is improved when topic model parameters are estimated by maximizing the posterior probability with the EM algorithm, using the neural-network-based priors.

Few-Shot Learning Topic Models

Context-aware Neural Machine Translation with Mini-batch Embedding

1 code implementation EACL 2021 Makoto Morishita, Jun Suzuki, Tomoharu Iwata, Masaaki Nagata

It is crucial to provide an inter-sentence context in Neural Machine Translation (NMT) models for higher-quality translation.

Machine Translation Translation

Meta-learning representations for clustering with infinite Gaussian mixture models

no code implementations1 Mar 2021 Tomoharu Iwata

We propose a meta-learning method that trains neural networks to obtain representations such that clustering performance improves when the representations are clustered by variational Bayesian (VB) inference with an infinite Gaussian mixture model.

Meta-Learning Metric Learning

Meta-learning One-class Classifiers with Eigenvalue Solvers for Supervised Anomaly Detection

no code implementations1 Mar 2021 Tomoharu Iwata, Atsutoshi Kumagai

With a meta-learning framework, quick adaptation to each task and effective backpropagation through that adaptation are important, since the model is trained via this adaptation in every epoch.

Anomaly Detection Few-Shot Learning

Meta-Learning for Koopman Spectral Analysis with Short Time-series

no code implementations9 Feb 2021 Tomoharu Iwata, Yoshinobu Kawahara

With the proposed method, a representation of a given short time-series is obtained by a bidirectional LSTM for extracting its properties.

Future prediction Meta-Learning +1

Adversarial Training Makes Weight Loss Landscape Sharper in Logistic Regression

no code implementations5 Feb 2021 Masanori Yamada, Sekitoshi Kanai, Tomoharu Iwata, Tomokatsu Takahashi, Yuki Yamanaka, Hiroshi Takahashi, Atsutoshi Kumagai

We theoretically and experimentally confirm that the weight loss landscape becomes sharper as the magnitude of the noise of adversarial training increases in the linear logistic regression model.
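For the linear logistic regression setting studied here, the adversarial training objective admits a closed form: the worst-case ℓ∞ perturbation of size ε for a linear model is δ = -ε·y·sign(w), which adds ε‖w‖₁ to the margin term. A minimal sketch (my own illustration of that standard fact, not code from the paper):

```python
import numpy as np

def clean_loss(w, x, y):
    """Logistic loss for a linear model; label y in {-1, +1}."""
    return np.log1p(np.exp(-y * (w @ x)))

def adversarial_loss(w, x, y, eps):
    """Worst-case logistic loss under an l-infinity perturbation of size eps.
    For a linear model the inner maximization is solved by
    delta = -eps * y * sign(w), adding eps * ||w||_1 to the margin."""
    return np.log1p(np.exp(-y * (w @ x) + eps * np.abs(w).sum()))
```

The extra ε‖w‖₁ term is what couples the noise magnitude to the geometry of the weight-loss landscape analyzed in the paper.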

Neural Dynamic Mode Decomposition for End-to-End Modeling of Nonlinear Dynamics

no code implementations11 Dec 2020 Tomoharu Iwata, Yoshinobu Kawahara

With our proposed method, the forecast error is backpropagated through the neural networks and the spectral decomposition, enabling end-to-end learning of Koopman spectral analysis.

Time Series

Meta-learning from Tasks with Heterogeneous Attribute Spaces

1 code implementation NeurIPS 2020 Tomoharu Iwata, Atsutoshi Kumagai

We propose a heterogeneous meta-learning method that trains a model on tasks with various attribute spaces, such that it can solve unseen tasks whose attribute spaces are different from the training tasks given a few labeled instances.

Meta-Learning Time-Series Few-Shot Learning with Heterogeneous Channels

Meta-Active Learning for Node Response Prediction in Graphs

no code implementations12 Oct 2020 Tomoharu Iwata

Meta-learning is an important approach to improve machine learning performance with a limited number of observations for target tasks.

Active Learning Meta-Learning

Few-shot Learning for Time-series Forecasting

no code implementations30 Sep 2020 Tomoharu Iwata, Atsutoshi Kumagai

Forecasting models are usually trained using time-series data in a specific target task.

Few-Shot Learning Time Series +1

Gaussian Process Regression with Local Explanation

no code implementations3 Jul 2020 Yuya Yoshikawa, Tomoharu Iwata

In the proposed model, both the prediction and explanation for each sample are performed using an easy-to-interpret locally linear model.
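A sketch of the locally linear construction described above (the symbols are assumptions for illustration, not taken from the paper): the prediction at an input x is an inner product with an input-dependent weight vector whose components are generated by Gaussian processes,

```latex
f(x) = x^{\top} w(x), \qquad w_d(\cdot) \sim \mathcal{GP}\bigl(0, k_d\bigr),
```

so the inferred weights w(x*) act as the local, easy-to-interpret explanation for the prediction at x*.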


Probabilistic Optimal Transport based on Collective Graphical Models

no code implementations16 Jun 2020 Yasunori Akagi, Yusuke Tanaka, Tomoharu Iwata, Takeshi Kurashima, Hiroyuki Toda

In this study, we propose a new framework in which OT is considered as a maximum a posteriori (MAP) solution of a probabilistic generative model.

Neural Generators of Sparse Local Linear Models for Achieving both Accuracy and Interpretability

no code implementations13 Mar 2020 Yuya Yoshikawa, Tomoharu Iwata

Additionally, the prediction is interpretable because it is obtained by the inner product between the simplified representations and the sparse weights, where only a small number of weights are selected by our gate module in the NGSLL.

Text Classification

Semi-supervised Anomaly Detection on Attributed Graphs

1 code implementation27 Feb 2020 Atsutoshi Kumagai, Tomoharu Iwata, Yasuhiro Fujiwara

To learn node embeddings specialized for anomaly detection, in which there is a class imbalance due to the rarity of anomalies, the parameters of a GCN are trained to minimize the volume of a hypersphere that encloses the node embeddings of normal instances while embedding anomalous ones outside the hypersphere.

Anomaly Detection Semi-supervised Anomaly Detection
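The hypersphere objective described above can be sketched as follows. This is a simplified stand-in (inverse-distance penalty for labeled anomalies, in the spirit of Deep SAD-style formulations), not the paper's exact loss, and it omits the GCN that produces the embeddings:

```python
import numpy as np

def hypersphere_loss(Z, y, c, eps=1e-6):
    """Hypersphere-style objective over embeddings Z with labels y.
    Normal points (y == 0) are pulled toward the center c via their squared
    distance; labeled anomalies (y == 1) are penalized by the inverse squared
    distance, so minimizing the loss pushes them outside the sphere."""
    d2 = np.sum((Z - c) ** 2, axis=1)      # squared distance to center
    normal = d2[y == 0]
    anomalous = 1.0 / (d2[y == 1] + eps)   # inverse-distance push-away term
    return np.mean(np.concatenate([normal, anomalous]))
```

At test time, the distance to the center itself serves as the anomaly score.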

Transfer Anomaly Detection by Inferring Latent Domain Representations

no code implementations NeurIPS 2019 Atsutoshi Kumagai, Tomoharu Iwata, Yasuhiro Fujiwara

The proposed method can infer the anomaly detectors for target domains without re-training by introducing the concept of latent domain vectors, which are latent representations of the domains and are used for inferring the anomaly detectors.

Anomaly Detection

Efficient Transfer Bayesian Optimization with Auxiliary Information

no code implementations17 Sep 2019 Tomoharu Iwata, Takuma Otsuka

By using the neural network covariance function, we can extract nonlinear correlation among feature vectors that are shared across related tasks.

Anomaly Detection with Inexact Labels

no code implementations11 Sep 2019 Tomoharu Iwata, Machiko Toyoda, Shotaro Tora, Naonori Ueda

We model the anomaly score function by a neural network-based unsupervised anomaly detection method, e.g., autoencoders.

Multiple Instance Learning Unsupervised Anomaly Detection

Spatially Aggregated Gaussian Processes with Multivariate Areal Outputs

no code implementations NeurIPS 2019 Yusuke Tanaka, Toshiyuki Tanaka, Tomoharu Iwata, Takeshi Kurashima, Maya Okawa, Yasunori Akagi, Hiroyuki Toda

By deriving the posterior GP, we can predict the data value at any location while simultaneously considering the spatial correlations and the dependencies between areal data sets.

Gaussian Processes Transfer Learning
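A rough sketch of the aggregated-observation setting (notation assumed for illustration, not taken from the paper): an areal observation y_a for region R_a is modeled as the average of a latent GP f over that region,

```latex
y_a = \frac{1}{|R_a|} \int_{R_a} f(x)\, dx + \varepsilon_a,
\qquad \varepsilon_a \sim \mathcal{N}(0, \sigma^2),
```

so posterior inference over f recovers point-level predictions from area-level data at multiple granularities.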

Unsupervised Multilingual Word Embedding with Limited Resources using Neural Language Models

1 code implementation ACL 2019 Takashi Wada, Tomoharu Iwata, Yuji Matsumoto

Recently, a variety of unsupervised methods have been proposed that map pre-trained word embeddings of different languages into the same space without any parallel data.

Word Alignment Word Embeddings

Deep Mixture Point Processes: Spatio-temporal Event Prediction with Rich Contextual Information

no code implementations21 Jun 2019 Maya Okawa, Tomoharu Iwata, Takeshi Kurashima, Yusuke Tanaka, Hiroyuki Toda, Naonori Ueda

Though many point processes have been proposed to model events in a continuous spatio-temporal space, none of them allow for the consideration of the rich contextual factors that affect event occurrence, such as weather, social activities, geographical characteristics, and traffic.

Point Processes

Supervised Anomaly Detection based on Deep Autoregressive Density Estimators

no code implementations12 Apr 2019 Tomoharu Iwata, Yuki Yamanaka

We propose a supervised anomaly detection method based on neural density estimators, where the negative log likelihood is used for the anomaly score.

Density Estimation Unsupervised Anomaly Detection
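The idea of using the negative log-likelihood of a fitted density as the anomaly score can be illustrated with a toy density estimator. The paper uses deep autoregressive estimators; the diagonal Gaussian below is only a minimal stand-in:

```python
import numpy as np

def fit_gaussian(X):
    """Fit a diagonal Gaussian density to (normal) training data."""
    mu = X.mean(axis=0)
    var = X.var(axis=0) + 1e-6  # small floor for numerical stability
    return mu, var

def anomaly_score(x, mu, var):
    """Negative log-likelihood under the fitted density: higher = more anomalous."""
    return 0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
```

Points in low-density regions receive high scores, which is the property the supervised objective in the paper then shapes using labeled anomalies.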

Autoencoding Binary Classifiers for Supervised Anomaly Detection

no code implementations26 Mar 2019 Yuki Yamanaka, Tomoharu Iwata, Hiroshi Takahashi, Masanori Yamada, Sekitoshi Kanai

Since our approach accurately reconstructs normal data points but fails to reconstruct both known and unknown anomalies, it can accurately discriminate known and unknown anomalies from normal data points.

Anomaly Detection

Finding Appropriate Traffic Regulations via Graph Convolutional Networks

no code implementations23 Oct 2018 Tomoharu Iwata, Takuma Otsuka, Hitoshi Shimizu, Hiroshi Sawada, Futoshi Naya, Naonori Ueda

In this paper, we propose a method to learn a function that outputs regulation effects given the current traffic situation as inputs.

Unsupervised Object Matching for Relational Data

no code implementations9 Oct 2018 Tomoharu Iwata, Naonori Ueda

The estimated latent vectors contain hidden structural information of each object in the given relational dataset.

Density Estimation

Refining Coarse-grained Spatial Data using Auxiliary Spatial Data Sets with Various Granularities

no code implementations21 Sep 2018 Yusuke Tanaka, Tomoharu Iwata, Toshiyuki Tanaka, Takeshi Kurashima, Maya Okawa, Hiroyuki Toda

With the proposed model, a distribution for each auxiliary data set on the continuous space is modeled using a Gaussian process, where the representation of uncertainty considers the levels of granularity.

Gaussian Processes

Variational Autoencoder with Implicit Optimal Priors

1 code implementation14 Sep 2018 Hiroshi Takahashi, Tomoharu Iwata, Yuki Yamanaka, Masanori Yamada, Satoshi Yagi

However, KL divergence with the aggregated posterior cannot be calculated in a closed form, which prevents us from using this optimal prior.

Density Estimation
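For context, the optimal prior referred to here is the aggregated posterior, a known result in the VAE literature:

```latex
p^{*}(z) = q(z) = \frac{1}{N} \sum_{n=1}^{N} q_{\phi}(z \mid x_n),
```

a mixture of N encoder posteriors, which is why the KL divergence between q_φ(z|x) and this prior has no closed form and must be handled implicitly.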

Unsupervised Cross-lingual Word Embedding by Multilingual Neural Language Models

no code implementations7 Sep 2018 Takashi Wada, Tomoharu Iwata

The proposed model contains bidirectional LSTMs that perform as forward and backward language models, and these networks are shared among all the languages.

Cross-Lingual Word Embeddings Word Alignment +1

Zero-shot Domain Adaptation without Domain Semantic Descriptors

no code implementations9 Jul 2018 Atsutoshi Kumagai, Tomoharu Iwata

The proposed method can infer appropriate domain-specific models without any semantic descriptors by introducing the concept of latent domain vectors, which are latent representations for the domains and are used for inferring the models.

Domain Adaptation

Localized Lasso for High-Dimensional Regression

no code implementations22 Mar 2016 Makoto Yamada, Koh Takeuchi, Tomoharu Iwata, John Shawe-Taylor, Samuel Kaski

We introduce the localized Lasso, which is suited for learning models that are both interpretable and have a high predictive power in problems with high dimensionality $d$ and small sample size $n$.
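A hedged sketch of the objective (the exact weights and norms should be checked against the paper): the localized Lasso learns a per-sample weight vector w_n, combining a network regularizer that ties the weights of linked samples with an exclusive regularizer that induces feature sparsity within each sample,

```latex
\min_{\{w_n\}} \;\sum_{n} \bigl(y_n - x_n^{\top} w_n\bigr)^2
+ \lambda_1 \sum_{n,m} r_{nm}\, \lVert w_n - w_m \rVert_2
+ \lambda_2 \sum_{n} \lVert w_n \rVert_1^2,
```

where r_{nm} encodes the sample network. The group term clusters samples into locally shared models, keeping each model interpretable.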

Cross-Domain Matching for Bag-of-Words Data via Kernel Embeddings of Latent Distributions

no code implementations NeurIPS 2015 Yuya Yoshikawa, Tomoharu Iwata, Hiroshi Sawada, Takeshi Yamada

We propose a kernel-based method for finding matching between instances across different domains, such as multilingual documents and images with annotations.

Latent Support Measure Machines for Bag-of-Words Data Classification

no code implementations NeurIPS 2014 Yuya Yoshikawa, Tomoharu Iwata, Hiroshi Sawada

With the latent SMM, a latent vector is associated with each vocabulary term, and each document is represented as a distribution of the latent vectors for words appearing in the document.

Classification General Classification +1

Multi-view Anomaly Detection via Probabilistic Latent Variable Models

no code implementations13 Nov 2014 Tomoharu Iwata, Makoto Yamada

We propose a nonparametric Bayesian probabilistic latent variable model for multi-view anomaly detection, which is the task of finding instances that have inconsistent views.

Anomaly Detection Bayesian Inference

Warped Mixtures for Nonparametric Cluster Shapes

1 code implementation9 Aug 2014 Tomoharu Iwata, David Duvenaud, Zoubin Ghahramani

A mixture of Gaussians fit to a single curved or heavy-tailed cluster will report that the data contains many clusters.

Density Estimation

Dynamic Infinite Relational Model for Time-varying Relational Data Analysis

no code implementations NeurIPS 2010 Katsuhiko Ishiguro, Tomoharu Iwata, Naonori Ueda, Joshua B. Tenenbaum

We propose a new probabilistic model for analyzing dynamic evolutions of relational data, such as additions, deletions, and split & merge of relation clusters like communities in social networks.

Modeling Social Annotation Data with Content Relevance using a Topic Model

no code implementations NeurIPS 2009 Tomoharu Iwata, Takeshi Yamada, Naonori Ueda

We propose a probabilistic topic model for analyzing and extracting content-related annotations from noisy annotated discrete data such as web pages stored in social bookmarking services.

General Classification Information Retrieval +1
