no code implementations • 10 Aug 2019 • Yang Liu, Jianpeng Zhang, Chao GAO, Jinghua Qu, Lixin Ji
Activation functions play a key role in the remarkable performance of deep neural networks, and the rectified linear unit (ReLU) is one of the most widely used activation functions.
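For context, ReLU simply zeroes out negative inputs, computing f(x) = max(0, x) elementwise; a minimal NumPy sketch (the function name `relu` here is illustrative, not taken from the paper):

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    """Rectified linear unit: passes positive values through, zeroes negatives."""
    return np.maximum(0.0, x)

# Negative activations are clipped to zero; positives are unchanged.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))  # [0.  0.  0.  1.5 3. ]
```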
no code implementations • 17 Aug 2019 • Yang Liu, Jianpeng Zhang, Chao GAO, Jinghua Qu, Lixin Ji
In this paper, we investigate the effect of individual hyperparameters, as well as different combinations of hyperparameter settings, on the performance of Attention-Gated Convolutional Neural Networks (AGCNNs), e.g., the kernel window size, the number of feature maps, the keep rate of the dropout layer, and the activation function.
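As a rough illustration of the kind of search this abstract describes, here is a hedged Python sketch of a grid over the four hyperparameters it names; all candidate values and the `train_and_evaluate` hook are hypothetical placeholders, not the settings actually explored in the paper:

```python
from itertools import product

# Hypothetical candidate values for each AGCNN hyperparameter named in the
# abstract; the actual ranges studied in the paper are not reproduced here.
grid = {
    "kernel_window_size": [3, 4, 5],
    "num_feature_maps": [64, 100, 128],
    "dropout_keep_rate": [0.5, 0.7, 0.9],
    "activation": ["relu", "tanh", "sigmoid"],
}

# Enumerate every combination of settings and hand each one to a trainer.
for values in product(*grid.values()):
    config = dict(zip(grid.keys(), values))
    # train_and_evaluate(config)  # placeholder for the actual training run
    print(config)
```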