Search Results for author: Andrey Vasnev

Found 4 papers, 1 paper with code

Adaptive Hierarchical Hyper-gradient Descent

no code implementations17 Aug 2020 Renlong Jie, Junbin Gao, Andrey Vasnev, Minh-Ngoc Tran

In this study, we investigate learning rate adaptation at different levels within the hyper-gradient descent framework and propose a method that adaptively learns the optimizer parameters by combining multiple levels of learning rates in a hierarchical structure.

Meta-Learning
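The abstract above builds on the hyper-gradient descent idea, in which the learning rate is itself updated by gradient descent using the inner product of successive gradients. A minimal single-level sketch (the paper's contribution is combining several such levels hierarchically, e.g. global and layer-wise, which is not shown here; the function name and parameterisation are illustrative):

```python
import numpy as np

def hypergrad_sgd(grad_fn, w, alpha=0.01, beta=1e-4, steps=100):
    """SGD with a hyper-gradient-descent learning rate: alpha is
    nudged by the dot product of the current and previous gradients,
    so it grows while gradients stay aligned and shrinks when they
    start to oppose each other."""
    g_prev = np.zeros_like(w)
    for _ in range(steps):
        g = grad_fn(w)
        alpha += beta * np.dot(g, g_prev)  # hyper-gradient update of alpha
        w = w - alpha * g                  # ordinary SGD step
        g_prev = g
    return w, alpha
```

On a simple quadratic such as `f(w) = ||w||^2`, the learning rate ramps up early, when successive gradients point the same way, and convergence is noticeably faster than with a fixed `alpha`.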

Regularized Flexible Activation Function Combinations for Deep Neural Networks

no code implementations26 Jul 2020 Renlong Jie, Junbin Gao, Andrey Vasnev, Minh-Ngoc Tran

Based on this, we implement a novel family of flexible activation functions that can replace sigmoid or tanh in LSTM cells, as well as a new family that combines ReLU and ELU.

Image Compression, Philosophy, +2
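One way to realise a "flexible activation combination" as described above is a convex blend of two base activations with a trainable scalar weight, plus a penalty that regularises that weight; the exact parameterisation and regulariser in the paper may differ, so the sketch below is illustrative only:

```python
import numpy as np

def combined_activation(x, p):
    """Trainable convex combination of ReLU and ELU: p is a learnable
    scalar in [0, 1] that interpolates between the two base functions."""
    relu = np.maximum(x, 0.0)
    elu = np.where(x > 0, x, np.exp(x) - 1.0)
    return p * relu + (1.0 - p) * elu

def coef_penalty(p, lam=1e-3):
    """Illustrative regulariser pulling the mixing weight toward an
    even blend, added to the training loss (hypothetical form)."""
    return lam * (p - 0.5) ** 2
```

For positive inputs both base functions are the identity, so the blend is too; the two only differ on the negative half-line, where `p` controls how much leakage the unit allows.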

Combined Flexible Activation Functions for Deep Neural Networks

no code implementations25 Sep 2019 Renlong Jie, Junbin Gao, Andrey Vasnev, Minh-Ngoc Tran

Based on this, we develop two novel flexible activation functions that can be implemented in LSTM cells and auto-encoder layers.

Image Classification, Philosophy, +2
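The abstract mentions using flexible activations inside auto-encoder layers. A minimal sketch of such a layer, assuming a hypothetical blend of tanh and the identity as the flexible unit (the paper's actual functions are not specified here):

```python
import numpy as np

rng = np.random.default_rng(0)

def flex_act(x, p=0.5):
    # hypothetical flexible unit: trainable blend of tanh and identity
    return p * np.tanh(x) + (1.0 - p) * x

def autoencoder_forward(x, W_enc, W_dec, p):
    """One-hidden-layer auto-encoder: the flexible activation sits on
    the encoder, the decoder is linear."""
    h = flex_act(x @ W_enc, p)  # encoder layer with flexible activation
    return h @ W_dec            # linear reconstruction

x = rng.normal(size=(4, 8))
W_enc = rng.normal(size=(8, 3)) * 0.1
W_dec = rng.normal(size=(3, 8)) * 0.1
recon = autoencoder_forward(x, W_enc, W_dec, 0.5)
```

In training, `p` would be optimised jointly with the weights by backpropagation, which is what makes the activation "flexible" rather than fixed.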
