Search Results for author: Hideki Oki

Found 2 papers, 1 paper with code

Triplet Loss for Knowledge Distillation

1 code implementation • 17 Apr 2020 • Hideki Oki, Motoshi Abe, Junichi Miyao, Takio Kurita

Metric learning reduces the differences between outputs for similar inputs; the same mechanism can be applied in knowledge distillation to reduce the differences between the outputs of the teacher model and the student model.

Knowledge Distillation · Metric Learning
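
A minimal sketch of how such a triplet-style distillation loss could look in PyTorch; the anchor/positive/negative pairing, the function name `distillation_triplet_loss`, and the `margin` default are illustrative assumptions, not necessarily the paper's exact formulation.

```python
import torch.nn.functional as F

def distillation_triplet_loss(student_out, teacher_out_same, teacher_out_diff, margin=1.0):
    # Anchor: student output for a sample (assumed pairing, for illustration).
    # Positive: teacher output for the same sample.
    # Negative: teacher output for a sample of a different class.
    d_pos = F.pairwise_distance(student_out, teacher_out_same)  # pull student toward teacher
    d_neg = F.pairwise_distance(student_out, teacher_out_diff)  # push away from mismatched output
    return F.relu(d_pos - d_neg + margin).mean()                # standard triplet hinge loss
```

This keeps the usual triplet hinge form, so the student is drawn toward the teacher's output on matching inputs and pushed away from the teacher's output on non-matching ones.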

Mixup of Feature Maps in a Hidden Layer for Training of Convolutional Neural Network

no code implementations • 24 Jun 2019 • Hideki Oki, Takio Kurita

However, the recognition accuracy of a trained deep CNN decreases drastically for samples that fall outside the regions covered by the training samples.

Data Augmentation · Image Classification +1
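
A minimal sketch of mixing feature maps at a hidden layer, assuming one-hot (or soft) labels and a Beta-distributed mixing coefficient as in standard mixup; the function name `mixup_hidden_features` and the `alpha` default are hypothetical, and the paper's exact choice of layer and mixing scheme may differ.

```python
import numpy as np

def mixup_hidden_features(feat_a, feat_b, labels_a, labels_b, alpha=0.2):
    # Draw a mixing coefficient from Beta(alpha, alpha), as in standard mixup.
    lam = float(np.random.beta(alpha, alpha))
    # Convex combination of the two batches' hidden-layer feature maps...
    mixed_feat = lam * feat_a + (1.0 - lam) * feat_b
    # ...and the same combination of their (one-hot or soft) labels.
    mixed_labels = lam * labels_a + (1.0 - lam) * labels_b
    return mixed_feat, mixed_labels
```

Interpolating at a hidden layer rather than on raw input images yields augmented samples in feature space, which is the regime the abstract's concern about out-of-region samples points at.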
