Search Results for author: Yuichiro Wada

Found 6 papers, 2 papers with code

Denoising Cosine Similarity: A Theory-Driven Approach for Efficient Representation Learning

no code implementations · 19 Apr 2023 · Takumi Nakagawa, Yutaro Sanada, Hiroki Waida, Yuhui Zhang, Yuichiro Wada, Kōsaku Takanashi, Tomonori Yamada, Takafumi Kanamori

To this end, inspired by recent work on denoising and the success of cosine-similarity-based objectives in representation learning, we propose the denoising Cosine-Similarity (dCS) loss.
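The paper's dCS loss itself is not reproduced here, but the cosine-similarity objective it builds on can be sketched as follows. The function names and the clean/noisy pairing are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def cosine_similarity_loss(clean_embed, noisy_embed):
    """Illustrative loss: 1 - cos(f(x), f(x_noisy)), averaged over a batch.

    Encourages the encoder to map a sample and its noisy view to the
    same direction in embedding space. clean_embed and noisy_embed are
    (batch, dim) arrays of paired embeddings.
    """
    sims = [cosine_similarity(c, n) for c, n in zip(clean_embed, noisy_embed)]
    return 1.0 - float(np.mean(sims))
```

Identical pairs give a loss of 0; orthogonal pairs give a loss of 1, so minimizing it pulls noisy views toward their clean counterparts.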

Denoising · Representation Learning

Fast and Multi-aspect Mining of Complex Time-stamped Event Streams

2 code implementations · 7 Mar 2023 · Kota Nakamura, Yasuko Matsubara, Koki Kawabata, Yuhei Umeda, Yuichiro Wada, Yasushi Sakurai

Thanks to its concise but effective summarization, CubeScope can also detect the sudden appearance of anomalies and identify the types of anomalies that occur in practice.

Anomaly Detection · Data Compression

Deep Clustering with a Constraint for Topological Invariance based on Symmetric InfoNCE

no code implementations · 6 Mar 2023 · Yuhui Zhang, Yuichiro Wada, Hiroki Waida, Kaito Goto, Yusaku Hino, Takafumi Kanamori

To address this problem, we propose a constraint based on symmetric InfoNCE, which helps the deep clustering objective train a model that is effective not only on datasets with simple topology but also on those with complex topology.
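Symmetric InfoNCE is a standard contrastive objective: InfoNCE computed in both directions over a batch of paired embeddings and averaged. A minimal NumPy sketch, assuming L2-normalized pairs and a temperature hyperparameter (details of the paper's constraint are not reproduced here):

```python
import numpy as np

def symmetric_infonce(z1, z2, temperature=0.5):
    """Symmetric InfoNCE over a batch of paired embeddings.

    z1, z2: (batch, dim) arrays; row i of z1 is the positive pair of
    row i of z2, and all other rows serve as negatives.
    """
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature  # pairwise cosine similarities

    def cross_entropy(l):
        # Cross-entropy with the diagonal (the positives) as targets.
        l = l - l.max(axis=1, keepdims=True)  # numerical stability
        log_prob = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -float(np.mean(np.diag(log_prob)))

    # Average the two directions (z1 -> z2 and z2 -> z1).
    return 0.5 * (cross_entropy(logits) + cross_entropy(logits.T))
```

When positive pairs are well separated from negatives, the loss approaches 0; it is the symmetric variant used in many contrastive frameworks.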

Clustering · Deep Clustering

Learning Domain Invariant Representations by Joint Wasserstein Distance Minimization

1 code implementation · 9 Jun 2021 · Léo Andeol, Yusei Kawakami, Yuichiro Wada, Takafumi Kanamori, Klaus-Robert Müller, Grégoire Montavon

However, common ML losses do not give strong guarantees on how consistently the ML model performs across domains; in particular, the model may perform well on one domain at the expense of its performance on another.
