Search Results for author: Yasuhiro Fujiwara

Found 15 papers, 3 papers with code

GuP: Fast Subgraph Matching by Guard-based Pruning

1 code implementation 11 Jun 2023 Junya Arai, Yasuhiro Fujiwara, Makoto Onizuka

Subgraph matching, which finds subgraphs isomorphic to a query, is the key to information retrieval from data represented as a graph.

Information Retrieval, Retrieval
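
As a point of reference for the problem statement above, the sketch below runs a generic VF2-based subgraph matcher from NetworkX on a toy graph; it illustrates plain subgraph matching only and has nothing to do with GuP's guard-based pruning.

```python
# Minimal illustration of subgraph matching with NetworkX's VF2 matcher.
# Generic baseline only; GuP's guard-based pruning is not shown here.
import networkx as nx
from networkx.algorithms import isomorphism

# Data graph: a triangle with a pendant vertex.
G = nx.Graph([(0, 1), (1, 2), (2, 0), (2, 3)])
# Query graph: a triangle.
Q = nx.Graph([("a", "b"), ("b", "c"), ("c", "a")])

gm = isomorphism.GraphMatcher(G, Q)
print(gm.subgraph_is_isomorphic())      # True: G contains a subgraph isomorphic to Q
for mapping in gm.subgraph_isomorphisms_iter():
    print(mapping)                      # e.g. {0: 'a', 1: 'b', 2: 'c'} and its symmetries
```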

Fast Regularized Discrete Optimal Transport with Group-Sparse Regularizers

no code implementations 14 Mar 2023 Yasutoshi Ida, Sekitoshi Kanai, Kazuki Adachi, Atsutoshi Kumagai, Yasuhiro Fujiwara

Regularized discrete optimal transport (OT) is a powerful tool to measure the distance between two discrete distributions that have been constructed from data samples on two different domains.

Unsupervised Domain Adaptation
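
As background for the snippet above, here is a minimal sketch of regularized discrete OT in its most common form, entropic regularization solved by Sinkhorn iterations; the paper targets group-sparse regularizers, which this simple baseline does not handle.

```python
# Entropy-regularized discrete optimal transport via Sinkhorn iterations.
# Background sketch only; the group-sparse regularizers addressed by the
# paper require a different solver.
import numpy as np

def sinkhorn(a, b, C, eps=0.05, n_iter=500):
    """a, b: source/target histograms (sum to 1); C: cost matrix; eps: regularization."""
    K = np.exp(-C / eps)                 # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)                # alternating scaling updates
        u = a / (K @ v)
    P = np.diag(u) @ K @ np.diag(v)      # transport plan
    return P, np.sum(P * C)              # plan and (unregularized) transport cost

rng = np.random.default_rng(0)
x, y = rng.normal(size=(5, 2)), rng.normal(size=(7, 2)) + 1.0
C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)   # squared Euclidean costs
a, b = np.full(5, 1 / 5), np.full(7, 1 / 7)
P, cost = sinkhorn(a, b, C)
print(P.shape, round(cost, 3))
```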

Permuton-induced Chinese Restaurant Process

1 code implementation NeurIPS 2021 Masahiro Nakano, Yasuhiro Fujiwara, Akisato Kimura, Takeshi Yamada, Naonori Ueda

Our main contribution is to introduce the notion of permutons into the well-known Chinese restaurant process (CRP) for sequence partitioning: a permuton is a probability measure on $[0, 1]\times [0, 1]$ and can be regarded as a geometric interpretation of the scaling limit of permutations.
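
For orientation, below is a minimal sampler for the standard Chinese restaurant process that the paper builds on; the permuton-induced construction itself is not reproduced here.

```python
# Standard Chinese restaurant process (CRP) sampler -- the base construction
# that the permuton-induced CRP generalizes.
import numpy as np

def crp(n, alpha=1.0, seed=0):
    """Partition n customers into tables with concentration alpha."""
    rng = np.random.default_rng(seed)
    tables = []                          # tables[k] = number of customers at table k
    assignment = []
    for i in range(n):
        probs = np.array(tables + [alpha], dtype=float)
        probs /= i + alpha               # P(table k) = n_k/(i+alpha), new table = alpha/(i+alpha)
        k = rng.choice(len(probs), p=probs)
        if k == len(tables):
            tables.append(1)             # open a new table
        else:
            tables[k] += 1
        assignment.append(k)
    return assignment

print(crp(10))                           # e.g. [0, 0, 1, 0, 2, 1, ...]
```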

Few-shot Learning for Unsupervised Feature Selection

no code implementations 2 Jul 2021 Atsutoshi Kumagai, Tomoharu Iwata, Yasuhiro Fujiwara

We propose a few-shot learning method for unsupervised feature selection, the task of selecting a subset of relevant features from unlabeled data.

Feature Selection, Few-Shot Learning

Meta-Learning for Relative Density-Ratio Estimation

no code implementations NeurIPS 2021 Atsutoshi Kumagai, Tomoharu Iwata, Yasuhiro Fujiwara

The closed-form solution enables fast and effective adaptation to a few instances, and its differentiability enables us to train our model such that the expected test error for relative DRE can be explicitly minimized after adapting to a few instances.

Density Ratio Estimation, Meta-Learning +1
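
To illustrate what a closed-form relative density-ratio estimator looks like, here is a RuLSIF-style sketch with a Gaussian basis; the paper's meta-learned adaptation across datasets is not shown, and the hyperparameters below are placeholders.

```python
# RuLSIF-style closed-form relative density-ratio estimation (a standard
# baseline); the meta-learning of the paper is not shown here.
import numpy as np

def gaussian_kernel(X, C, sigma):
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def rulsif_fit(X_nu, X_de, alpha=0.5, sigma=1.0, lam=0.1):
    """Closed-form solution theta = (H + lam*I)^{-1} h for the relative ratio
    p_nu / (alpha*p_nu + (1-alpha)*p_de), with Gaussian basis centered on X_nu."""
    C = X_nu                                         # basis centers
    Phi_nu = gaussian_kernel(X_nu, C, sigma)
    Phi_de = gaussian_kernel(X_de, C, sigma)
    H = (alpha * Phi_nu.T @ Phi_nu / len(X_nu)
         + (1 - alpha) * Phi_de.T @ Phi_de / len(X_de))
    h = Phi_nu.mean(axis=0)
    theta = np.linalg.solve(H + lam * np.eye(len(C)), h)
    return lambda X: gaussian_kernel(X, C, sigma) @ theta   # estimated relative ratio

rng = np.random.default_rng(0)
X_nu = rng.normal(0.0, 1.0, size=(100, 1))
X_de = rng.normal(0.5, 1.0, size=(100, 1))
ratio = rulsif_fit(X_nu, X_de)
print(ratio(np.array([[0.0], [2.0]])))
```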

Fast Subgraph Matching by Exploiting Search Failures

no code implementations 28 Dec 2020 Junya Arai, Makoto Onizuka, Yasuhiro Fujiwara, Sotetsu Iwamura

That is, our algorithm generates failure patterns when a partial embedding is found to be impossible to extend into an isomorphic embedding.

Databases

Semi-supervised Anomaly Detection on Attributed Graphs

1 code implementation 27 Feb 2020 Atsutoshi Kumagai, Tomoharu Iwata, Yasuhiro Fujiwara

To learn node embeddings specialized for anomaly detection, in which there is a class imbalance due to the rarity of anomalies, the parameters of a GCN are trained to minimize the volume of a hypersphere that encloses the node embeddings of normal instances while embedding anomalous ones outside the hypersphere.

Attribute, Semi-supervised Anomaly Detection +1
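
One common way to instantiate such a hypersphere objective, in the spirit of Deep SVDD/Deep SAD, is sketched below; it is an illustrative stand-in rather than necessarily the paper's exact loss, and the GCN that produces the node embeddings is omitted.

```python
# Hypersphere-style objective on node embeddings: pull normal embeddings
# toward a center, push labeled anomalies away from it (Deep SVDD / Deep SAD
# pattern; not necessarily the paper's exact loss).
import torch

def hypersphere_loss(z, y, center, eta=1.0, eps=1e-6):
    """z: (n, d) node embeddings from a GCN; y: 1 for labeled anomalies, 0 otherwise."""
    dist2 = ((z - center) ** 2).sum(dim=1)
    normal_term = dist2[y == 0].mean()               # shrink the enclosing hypersphere
    anomaly_term = (1.0 / (dist2[y == 1] + eps)).mean() if (y == 1).any() else 0.0
    return normal_term + eta * anomaly_term          # anomalies are pushed outside

z = torch.randn(8, 16, requires_grad=True)           # toy embeddings
y = torch.tensor([0, 0, 0, 0, 0, 0, 1, 1])
center = torch.zeros(16)
loss = hypersphere_loss(z, y, center)
loss.backward()
print(float(loss))
```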

Fast Sparse Group Lasso

no code implementations NeurIPS 2019 Yasutoshi Ida, Yasuhiro Fujiwara, Hisashi Kashima

Block Coordinate Descent is a standard approach for obtaining the parameters of Sparse Group Lasso; it iteratively updates the parameters of each parameter group.
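
A compact sketch of proximal block coordinate descent for Sparse Group Lasso follows; the block update composes elementwise and group-wise soft-thresholding. This is the standard baseline, not the accelerated algorithm proposed in the paper.

```python
# Proximal block coordinate descent for Sparse Group Lasso (standard baseline;
# not the paper's accelerated algorithm). The block proximal operator is
# elementwise soft-thresholding followed by group-wise soft-thresholding.
import numpy as np

def soft(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def group_soft(x, t):
    nrm = np.linalg.norm(x)
    return np.zeros_like(x) if nrm <= t else (1.0 - t / nrm) * x

def sgl_bcd(X, y, groups, lam1=0.1, lam2=0.1, n_epochs=100):
    """groups: list of index arrays, one per parameter group."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_epochs):
        for g in groups:
            grad_g = -X[:, g].T @ (y - X @ beta) / n      # block gradient of the loss
            L = np.linalg.norm(X[:, g], 2) ** 2 / n + 1e-12  # block Lipschitz constant
            z = beta[g] - grad_g / L                      # gradient step on the block
            beta[g] = group_soft(soft(z, lam1 / L), lam2 * np.sqrt(len(g)) / L)
    return beta

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 6))
y = X[:, :2] @ np.array([1.0, -2.0]) + 0.1 * rng.normal(size=50)
print(sgl_bcd(X, y, [np.arange(0, 2), np.arange(2, 4), np.arange(4, 6)]).round(2))
```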

Transfer Anomaly Detection by Inferring Latent Domain Representations

no code implementations NeurIPS 2019 Atsutoshi Kumagai, Tomoharu Iwata, Yasuhiro Fujiwara

The proposed method can infer the anomaly detectors for target domains without re-training by introducing the concept of latent domain vectors, which are latent representations of the domains and are used for inferring the anomaly detectors.

Anomaly Detection

Absum: Simple Regularization Method for Reducing Structural Sensitivity of Convolutional Neural Networks

no code implementations 19 Sep 2019 Sekitoshi Kanai, Yasutoshi Ida, Yasuhiro Fujiwara, Masanori Yamada, Shuichi Adachi

Furthermore, we reveal that CNNs trained with Absum are more robust against transferred attacks, owing to the reduced common sensitivity, and against high-frequency noise than CNNs trained with standard regularization methods.

Adversarial Attack, Adversarial Robustness

Network Implosion: Effective Model Compression for ResNets via Static Layer Pruning and Retraining

no code implementations 10 Jun 2019 Yasutoshi Ida, Yasuhiro Fujiwara

Our key idea is to introduce a priority term that identifies the importance of a layer; we can select unimportant layers according to the priority and erase them after the training.

Model Compression
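
A toy sketch of the general recipe follows: attach a scalar gate to each residual block as its priority, erase the lowest-priority blocks after training, and retrain the smaller network. The learnable gate here is a hypothetical stand-in, not necessarily the paper's priority term.

```python
# Toy sketch of priority-based static layer pruning for a residual network.
# Each block carries a learnable scalar gate used as its "priority"; after
# training, blocks with the smallest |gate| are erased and the model is
# retrained. The gate is a stand-in for the paper's priority term.
import torch
import torch.nn as nn

class GatedBlock(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        self.gate = nn.Parameter(torch.ones(1))       # priority of this block

    def forward(self, x):
        return x + self.gate * self.f(x)              # gated residual connection

class ResNetLike(nn.Module):
    def __init__(self, dim=32, n_blocks=8):
        super().__init__()
        self.blocks = nn.ModuleList([GatedBlock(dim) for _ in range(n_blocks)])

    def forward(self, x):
        for b in self.blocks:
            x = b(x)
        return x

def prune_low_priority(model, keep=4):
    """Keep the `keep` blocks with the largest |gate|, preserving their order."""
    gates = [abs(float(b.gate)) for b in model.blocks]
    keep_idx = sorted(sorted(range(len(gates)), key=lambda i: gates[i], reverse=True)[:keep])
    model.blocks = nn.ModuleList([model.blocks[i] for i in keep_idx])  # retrain afterwards
    return model

model = prune_low_priority(ResNetLike(), keep=4)
print(sum(p.numel() for p in model.parameters()))     # parameter count after pruning
```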

Sigsoftmax: Reanalysis of the Softmax Bottleneck

no code implementations NeurIPS 2018 Sekitoshi Kanai, Yasuhiro Fujiwara, Yuki Yamanaka, Shuichi Adachi

On the basis of this analysis, we propose sigsoftmax, which is composed of the multiplication of an exponential function and a sigmoid function.

Language Modelling
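
For reference, the composition described in the snippet, sigsoftmax(x)_i = exp(x_i) * sigmoid(x_i) / sum_j exp(x_j) * sigmoid(x_j), written as a small PyTorch function.

```python
# Sigsoftmax: the output function multiplies an exponential by a sigmoid
# before normalizing,
#   sigsoftmax(x)_i = exp(x_i) * sigmoid(x_i) / sum_j exp(x_j) * sigmoid(x_j)
import torch

def sigsoftmax(x, dim=-1):
    # Subtracting the max inside exp() keeps the computation numerically stable
    # without changing the normalized result.
    u = torch.exp(x - x.max(dim=dim, keepdim=True).values) * torch.sigmoid(x)
    return u / u.sum(dim=dim, keepdim=True)

logits = torch.tensor([[1.0, 2.0, 0.5]])
print(sigsoftmax(logits))                # rows sum to 1 along the last dimension
```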
