Search Results for author: Yoshihiro Nagano

Found 6 papers, 2 papers with code

On the Surrogate Gap between Contrastive and Supervised Losses

1 code implementation • 6 Oct 2021 • Han Bao, Yoshihiro Nagano, Kento Nozawa

Recent theoretical studies have attempted to explain the benefit of a large negative sample size by upper-bounding the downstream classification loss with the contrastive loss.

Classification • Data Augmentation • +1
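The abstract snippet above refers to bounds involving the contrastive loss with many negatives. A minimal numpy sketch of a standard InfoNCE-style contrastive loss (not the paper's specific surrogate bound; the function name and temperature value are illustrative):

```python
import numpy as np

def info_nce_loss(anchor, positive, negatives, temperature=0.1):
    """Contrastive loss for one anchor: pull the positive close,
    push the negatives away (all vectors assumed L2-normalized)."""
    pos_sim = anchor @ positive / temperature
    neg_sims = negatives @ anchor / temperature
    logits = np.concatenate([[pos_sim], neg_sims])
    # cross-entropy with the positive at index 0
    return -pos_sim + np.log(np.sum(np.exp(logits)))

rng = np.random.default_rng(0)
def unit(v):
    return v / np.linalg.norm(v)

a = unit(rng.normal(size=8))
p = unit(a + 0.1 * rng.normal(size=8))                          # positive: near the anchor
negs = np.stack([unit(rng.normal(size=8)) for _ in range(16)])  # 16 negatives
print(info_nce_loss(a, p, negs))
```

The loss decreases as the anchor-positive similarity grows, which is why a larger pool of negatives tightens the normalizing term in the log-sum-exp.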

Statistical Mechanical Analysis of Catastrophic Forgetting in Continual Learning with Teacher and Student Networks

no code implementations • 16 May 2021 • Haruka Asanuma, Shiro Takagi, Yoshihiro Nagano, Yuki Yoshida, Yasuhiko Igarashi, Masato Okada

Teacher-student learning is a framework with two neural networks: one serves as the target function in supervised learning, and the other is the network that learns to reproduce it.

Continual Learning
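The teacher-student setup described above can be sketched with a single-unit network in numpy; this is an illustrative toy version (the architecture, step count, and learning rate are assumptions, not the paper's setting):

```python
import numpy as np

rng = np.random.default_rng(0)
D = 20  # input dimension

# Teacher: the fixed target network; student: same architecture, learned weights.
w_teacher = rng.normal(size=D) / np.sqrt(D)
w_student = rng.normal(size=D) / np.sqrt(D)

def forward(w, x):
    return np.tanh(x @ w)

def gen_error(w):
    """Generalization error of the student against the teacher on fresh inputs."""
    X = rng.normal(size=(1000, D))
    return np.mean((forward(w, X) - forward(w_teacher, X)) ** 2)

err_before = gen_error(w_student)
lr = 0.05
for _ in range(5000):                 # online SGD on teacher-generated labels
    x = rng.normal(size=D)
    y_t = forward(w_teacher, x)
    y_s = forward(w_student, x)
    w_student -= lr * (y_s - y_t) * (1 - y_s ** 2) * x  # chain rule through tanh
err_after = gen_error(w_student)
print(err_before, err_after)
```

The same scaffold extends to the continual-learning case by switching the teacher mid-training and watching the error on the first teacher grow, which is the forgetting phenomenon the paper analyzes.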

Localized Generations with Deep Neural Networks for Multi-Scale Structured Datasets

no code implementations • 25 Sep 2019 • Yoshihiro Nagano, Shiro Takagi, Yuki Yoshida, Masato Okada

The local learning approach extracts semantic representations for these datasets by training the embedding model from scratch on each local neighborhood.

Meta-Learning
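The local-learning idea above, a fresh model per neighborhood rather than one global model, can be illustrated with a linear model standing in for the deep embedding (the function name and neighborhood size are illustrative assumptions):

```python
import numpy as np

def local_fit_predict(X, y, x_query, k=10):
    """Fit a separate model from scratch on the k nearest neighbors of each
    query point, rather than training one global model."""
    d = np.linalg.norm(X - x_query, axis=1)
    idx = np.argsort(d)[:k]
    Xl, yl = X[idx], y[idx]
    A = np.column_stack([Xl, np.ones(k)])       # local design matrix with bias
    w, *_ = np.linalg.lstsq(A, yl, rcond=None)  # train only on the neighborhood
    return np.append(x_query, 1.0) @ w

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0])                             # globally nonlinear target
pred = local_fit_predict(X, y, np.array([1.0]))
print(pred)                                     # ≈ sin(1)
```

A simple local model can track a globally nonlinear function because each fit only needs to be accurate within its own neighborhood, the same motivation behind training embeddings per neighborhood.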

A Wrapped Normal Distribution on Hyperbolic Space for Gradient-Based Learning

1 code implementation • 8 Feb 2019 • Yoshihiro Nagano, Shoichiro Yamaguchi, Yasuhiro Fujita, Masanori Koyama

Hyperbolic space is a geometry that is known to be well-suited for representation learning of data with an underlying hierarchical structure.

Representation Learning
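One standard construction of a wrapped normal on hyperbolic space (in the Lorentz model) samples a Gaussian in the tangent space at the origin, parallel-transports it to the mean point, and applies the exponential map. A minimal numpy sketch of that construction, with illustrative function names, not the paper's reference implementation:

```python
import numpy as np

def minkowski_dot(x, y):
    """Lorentzian inner product <x, y> = -x0*y0 + sum_i xi*yi."""
    return -x[0] * y[0] + x[1:] @ y[1:]

def exp_map(mu, u):
    """Exponential map on the Lorentz model: shoot tangent vector u from mu."""
    norm = np.sqrt(np.clip(minkowski_dot(u, u), 1e-12, None))
    return np.cosh(norm) * mu + np.sinh(norm) * u / norm

def parallel_transport(v, src, dst):
    """Transport a tangent vector v from T_src to T_dst along the geodesic."""
    alpha = -minkowski_dot(src, dst)
    return v + minkowski_dot(dst - alpha * src, v) / (alpha + 1) * (src + dst)

def sample_wrapped_normal(mu, sigma, rng):
    n = mu.shape[0] - 1
    origin = np.zeros(n + 1)
    origin[0] = 1.0
    v = np.concatenate([[0.0], sigma * rng.normal(size=n)])  # tangent at origin
    u = parallel_transport(v, origin, mu)
    return exp_map(mu, u)

rng = np.random.default_rng(0)
origin = np.array([1.0, 0.0, 0.0])
mu = exp_map(origin, np.array([0.0, 0.3, -0.2]))  # a point on the hyperboloid
z = sample_wrapped_normal(mu, 0.5, rng)
print(z, minkowski_dot(z, z))  # <z, z> = -1: z stays on the hyperboloid
```

Because every step (transport, exponential map) is differentiable, the resulting density admits the reparameterized, gradient-based learning the title refers to.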

Concept Formation and Dynamics of Repeated Inference in Deep Generative Models

no code implementations • 12 Dec 2017 • Yoshihiro Nagano, Ryo Karakida, Masato Okada

Our study demonstrated that the transient dynamics of inference first approach a concept and then move close to a memory.

Image Generation
