A Sensitivity Analysis of Attention-Gated Convolutional Neural Networks for Sentence Classification

17 Aug 2019 · Yang Liu, Jianpeng Zhang, Chao Gao, Jinghua Qu, Lixin Ji

In this paper, we investigate the effect of individual hyperparameters, as well as combinations of hyperparameter settings, on the performance of Attention-Gated Convolutional Neural Networks (AGCNNs): the kernel window size, the number of feature maps, the keep rate of the dropout layer, and the activation function. From a wide range of empirical results, we distill practical advice, and through the sensitivity analysis we further improve the hyperparameter settings of AGCNNs. Experiments show that our proposals achieve average improvements of 0.81% and 0.67% on AGCNN-NLReLU-rand and AGCNN-SELU-rand, respectively, and average improvements of 0.47% and 0.45% on AGCNN-NLReLU-static and AGCNN-SELU-static, respectively.
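To make the tuned hyperparameters concrete, the sketch below shows a minimal convolutional sentence classifier that exposes the four knobs studied in the paper. This is not the authors' AGCNN implementation (the attention-gating mechanism is omitted), and the class and parameter names are hypothetical; the NLReLU activation is written as ln(1 + β·max(0, x)), following the standard definition.

```python
# Minimal sketch (assumed, not the authors' code) of a CNN text classifier
# exposing the hyperparameters analyzed in the paper: kernel window size,
# number of feature maps, dropout keep rate, and activation function.
import torch
import torch.nn as nn
import torch.nn.functional as F


def nl_relu(x, beta=1.0):
    """Natural-Logarithm ReLU: ln(1 + beta * max(0, x))."""
    return torch.log(1.0 + beta * F.relu(x))


class TextCNN(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, num_classes=2,
                 window_sizes=(3, 4, 5),   # kernel window sizes
                 num_feature_maps=100,     # feature maps per window size
                 keep_rate=0.5,            # dropout keep rate
                 activation=nl_relu):      # e.g., nl_relu or F.selu
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_feature_maps, kernel_size=w)
            for w in window_sizes
        )
        self.activation = activation
        # nn.Dropout takes a drop probability, hence 1 - keep_rate.
        self.dropout = nn.Dropout(p=1.0 - keep_rate)
        self.fc = nn.Linear(num_feature_maps * len(window_sizes), num_classes)

    def forward(self, token_ids):                  # (batch, seq_len)
        x = self.embed(token_ids).transpose(1, 2)  # (batch, embed_dim, seq_len)
        # Convolve, activate, then max-pool over time for each window size.
        pooled = [self.activation(conv(x)).max(dim=2).values
                  for conv in self.convs]
        return self.fc(self.dropout(torch.cat(pooled, dim=1)))


# Usage: swapping activation=F.selu mirrors the NLReLU-vs-SELU comparison.
model = TextCNN(vocab_size=20000, activation=nl_relu)
logits = model(torch.randint(0, 20000, (8, 40)))   # 8 sentences, 40 tokens each
```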
