1 code implementation • 18 Feb 2018 • Kento Nozawa, Masanari Kimura, Atsunori Kanemura
Embedding graph nodes into a vector space enables the use of machine learning to, for example, predict node classes, but the study of node-embedding algorithms is immature compared with the natural language processing field because of the diverse nature of graphs.
no code implementations • 3 Jul 2018 • Masanari Kimura, Takashi Yanagihara
The proposed method detects pixel-level micro anomalies with high accuracy in 1024x1024 high-resolution images that are actually used in an industrial setting.
no code implementations • 30 Apr 2019 • Masanari Kimura, Masayuki Tanaka
To tackle this problem, we exploit a multi-channel attention mechanism in feature space.
no code implementations • 7 May 2019 • Masanari Kimura, Masayuki Tanaka
To tackle this problem, we exploit a multi-channel attention mechanism in feature space.
no code implementations • 27 May 2019 • Masanari Kimura
We propose a novel framework for anomaly detection in images.
no code implementations • 12 Sep 2019 • Masanari Kimura, Masayuki Tanaka
Improving the interpretability of DNNs is one of the most active research topics.
no code implementations • 16 Oct 2019 • Masanari Kimura
Earthquakes and tropical cyclones cause suffering for millions of people around the world every year.
no code implementations • 11 Jun 2020 • Masanari Kimura
Machine learning techniques are used in a wide range of domains.
1 code implementation • 8 Jul 2020 • Masanari Kimura, Ryohei Izawa
Machine learning models suffer from overfitting, which is often caused by a lack of labeled data.
1 code implementation • 31 Mar 2021 • Masanari Kimura, Hideitsu Hino
The asymmetric skew divergence smooths one of the two distributions by mixing it with the other, to a degree determined by the parameter $\lambda$.
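A minimal sketch of this smoothing for discrete distributions with plain KL divergence; the function names and example vectors are illustrative, not taken from the paper (whose $\alpha$-geodesical generalization is not reproduced here):

```python
import numpy as np

def kl(p, q):
    """KL(p || q) for discrete distributions with p > 0 everywhere."""
    return float(np.sum(p * np.log(p / q)))

def skew_divergence(p, q, lam):
    """Smooth the second argument by mixing it with the first:
    D_lam(p, q) = KL(p || lam * q + (1 - lam) * p).
    The mixture is supported wherever p is, so the value stays
    finite even when q has zeros on p's support."""
    return kl(p, lam * q + (1.0 - lam) * p)

p = np.array([0.4, 0.4, 0.2])
q = np.array([0.5, 0.5, 0.0])  # plain KL(p || q) would be infinite
d = skew_divergence(p, q, 0.99)
```

With $\lambda \to 1$ the mixture approaches $q$, while any $\lambda < 1$ keeps the divergence finite.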
2 code implementations • 30 Aug 2021 • Masanari Kimura, Takuma Nakamura, Yuki Saito
This paper addresses the problem of set-to-set matching, which involves matching two different sets of items based on some criteria, especially in the case of high-dimensional items like images.
no code implementations • 2 Mar 2022 • Waku Hatakeyama, Shirou Kawakita, Ryohei Izawa, Masanari Kimura
Detecting changes on the Earth's surface, such as urban development, deforestation, or natural disasters, is a research field attracting a great deal of attention.
no code implementations • 22 Jun 2022 • Masanari Kimura, Hideitsu Hino
Dropout is one of the most popular regularization techniques in neural network training.
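Independently of the paper's analysis, the dropout mechanism itself can be sketched as standard inverted dropout (all names here are illustrative):

```python
import numpy as np

def dropout(x, rate, rng, training=True):
    """Inverted dropout: during training, zero each unit with
    probability `rate` and rescale the survivors by 1 / (1 - rate)
    so the expected activation is unchanged; at test time, pass x
    through untouched."""
    if not training:
        return x
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

rng = np.random.default_rng(0)
x = np.ones(100_000)
out = dropout(x, 0.5, rng)  # mean stays close to 1.0 in expectation
```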
no code implementations • 28 Oct 2022 • Ryotaro Shimizu, Masanari Kimura, Masayuki Goto
Several techniques for mapping various types of components, such as words, attributes, and images, into an embedding space have been studied.
no code implementations • 25 Feb 2023 • Masanari Kimura
The problem of matching two sets of multiple elements, namely set-to-set matching, has received a great deal of attention in recent years.
1 code implementation • 19 Apr 2023 • Masanari Kimura, Hideitsu Hino
In particular, the phenomenon in which the marginal distribution of the input data changes is called covariate shift, one of the most important research topics in machine learning.
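The standard importance-weighting remedy for covariate shift can be sketched as follows; the Gaussian densities, the shift, and the constant predictor are hypothetical choices for illustration, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def normal_pdf(x, mu):
    """Density of N(mu, 1); both densities here are illustrative."""
    return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2.0 * np.pi)

# Covariate shift: training inputs ~ N(-0.5, 1), test inputs ~ N(0.5, 1),
# while p(y | x) is shared (here, y = 1 exactly when x > 0).
x_train = rng.normal(-0.5, 1.0, 5000)
y_train = (x_train > 0).astype(float)

# Importance weights w(x) = p_test(x) / p_train(x) reweight the
# empirical risk so that it targets the test distribution.
w = normal_pdf(x_train, 0.5) / normal_pdf(x_train, -0.5)

def weighted_risk(c):
    """Importance-weighted squared-error risk of a constant predictor c."""
    return np.average((y_train - c) ** 2, weights=w)

c_star = np.average(y_train, weights=w)  # minimizer of the weighted risk
```

Because the test distribution is shifted toward positive inputs, the weighted solution `c_star` is pulled above the naive unweighted mean of `y_train`.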
no code implementations • 10 Feb 2024 • Masanari Kimura
Test-Time Augmentation (TTA) is a powerful heuristic that applies data augmentation at test time and averages the resulting predictions.
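A minimal sketch of the TTA averaging step, with a stand-in model and hypothetical augmentations (none of these names come from the paper):

```python
import numpy as np

def predict(x):
    """Stand-in classifier (hypothetical): softmax over two logits."""
    logits = np.array([x.sum(), -x.sum()])
    e = np.exp(logits - logits.max())
    return e / e.sum()

def tta_predict(x, augmentations):
    """Test-Time Augmentation: run the model on several augmented
    copies of the input and average the predicted distributions."""
    return np.mean([predict(aug(x)) for aug in augmentations], axis=0)

augs = [lambda x: x,          # identity
        lambda x: x[::-1],    # flip
        lambda x: x + 0.01]   # small perturbation
probs = tta_predict(np.array([0.2, -0.1, 0.4]), augs)
```

Since each augmented prediction is a probability distribution, their average is one as well.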
no code implementations • 15 Mar 2024 • Masanari Kimura, Hideitsu Hino
Importance weighting is a fundamental procedure in statistics and machine learning that weights the objective function or probability distribution according to the importance of each instance.
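As a minimal illustration of the procedure, importance weights let samples from one distribution estimate an expectation under another; the two Gaussians below are illustrative choices, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def normal_pdf(x, mu):
    """Density of N(mu, 1); p and q here are illustrative."""
    return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2.0 * np.pi)

# Goal: estimate E_q[X^2] for q = N(1, 1), which equals 2 exactly,
# using samples drawn only from p = N(0, 1).
x = rng.normal(0.0, 1.0, 200_000)
w = normal_pdf(x, 1.0) / normal_pdf(x, 0.0)  # importance weights q/p

# Self-normalized importance-weighted estimate of the expectation.
estimate = np.average(x ** 2, weights=w)
```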
no code implementations • 26 Mar 2024 • Masanari Kimura, Ryotaro Shimizu, Yuki Hirakawa, Ryosuke Goto, Yuki Saito
From these observations, we show that Deep Sets, one of the well-known permutation-invariant neural network architectures, can be generalized in the sense of a quasi-arithmetic mean.
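The quasi-arithmetic mean itself is easy to sketch (the paper's full construction is not reproduced here; the function names are illustrative):

```python
import numpy as np

def quasi_arithmetic_mean(xs, f, f_inv):
    """M_f(x_1, ..., x_n) = f^{-1}((1/n) * sum_i f(x_i)).
    Like the sum pooling inside Deep Sets, this aggregation is
    permutation invariant; f = identity recovers the arithmetic mean,
    f = log the geometric mean."""
    return f_inv(np.mean([f(x) for x in xs]))

xs = [1.0, 2.0, 4.0]
am = quasi_arithmetic_mean(xs, lambda t: t, lambda t: t)  # arithmetic mean
gm = quasi_arithmetic_mean(xs, np.log, np.exp)            # geometric mean
```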
no code implementations • 1 May 2024 • Masanari Kimura, Hiroki Naganuma
One of the simplest techniques to tackle this task is the focal loss, a generalization of cross-entropy obtained by introducing a single positive parameter.
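A minimal sketch of the binary focal loss and its single parameter gamma (the example probabilities are illustrative):

```python
import numpy as np

def focal_loss(p, y, gamma):
    """Binary focal loss: -(1 - p_t)^gamma * log(p_t), where p_t is
    the probability assigned to the true class. gamma = 0 recovers
    cross-entropy; gamma > 0 down-weights well-classified examples."""
    p_t = np.where(y == 1, p, 1.0 - p)
    return -((1.0 - p_t) ** gamma) * np.log(p_t)

p = np.array([0.9, 0.2])   # predicted P(y = 1)
y = np.array([1, 0])
ce = focal_loss(p, y, 0.0)  # plain cross-entropy
fl = focal_loss(p, y, 2.0)  # focal loss with gamma = 2
```

Since the modulating factor $(1 - p_t)^\gamma$ is at most 1, the focal loss never exceeds the cross-entropy on the same example.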