Search Results for author: Jinghua Qu

Found 2 papers, 0 papers with code

A Sensitivity Analysis of Attention-Gated Convolutional Neural Networks for Sentence Classification

no code implementations • 17 Aug 2019 • Yang Liu, Jianpeng Zhang, Chao GAO, Jinghua Qu, Lixin Ji

In this paper, we investigate the effect of individual hyperparameters, as well as combinations of hyperparameter settings, on the performance of Attention-Gated Convolutional Neural Networks (AGCNNs), e.g., the kernel window size, the number of feature maps, the keep rate of the dropout layer, and the activation function.

Tasks: General Classification • Sentence • +1
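The abstract enumerates four hyperparameters under study. As a rough illustration only (not the authors' code, and omitting the attention-gating mechanism itself), the PyTorch sketch below shows a plain text-CNN sentence classifier with those four hyperparameters exposed as constructor arguments; all names and default values are illustrative assumptions.

```python
# Minimal sketch of a text-CNN sentence classifier exposing the four
# hyperparameters studied in the paper. The attention-gating mechanism
# of AGCNNs is NOT implemented here; this only shows where each
# hyperparameter enters a basic convolutional sentence classifier.
import torch
import torch.nn as nn

class TextCNN(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=128, num_classes=2,
                 kernel_size=3,          # kernel window size
                 num_feature_maps=100,   # number of feature maps
                 keep_rate=0.5,          # keep rate of the dropout layer
                 activation=nn.ReLU()):  # activation function
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.conv = nn.Conv1d(embed_dim, num_feature_maps, kernel_size)
        self.act = activation
        # nn.Dropout takes a drop probability, so convert from the keep rate.
        self.drop = nn.Dropout(p=1.0 - keep_rate)
        self.fc = nn.Linear(num_feature_maps, num_classes)

    def forward(self, tokens):                  # tokens: (batch, seq_len)
        x = self.embed(tokens).transpose(1, 2)  # (batch, embed_dim, seq_len)
        x = self.act(self.conv(x))              # (batch, maps, seq_len - k + 1)
        x = x.max(dim=2).values                 # max-over-time pooling
        return self.fc(self.drop(x))

logits = TextCNN()(torch.randint(0, 10000, (4, 20)))  # shape: (4, 2)
```

A sensitivity analysis in this style would sweep each of the four constructor arguments while holding the others fixed and compare validation accuracy across settings.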

Natural-Logarithm-Rectified Activation Function in Convolutional Neural Networks

no code implementations • 10 Aug 2019 • Yang Liu, Jianpeng Zhang, Chao GAO, Jinghua Qu, Lixin Ji

Activation functions play a key role in the performance of deep neural networks, and the rectified linear unit (ReLU) is one of the most widely used.
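The truncated abstract names the activation but not its formula. A minimal sketch follows, assuming the natural-logarithm-rectified form NLReLU(x) = ln(beta * max(0, x) + 1), where beta is a scale hyperparameter; this definition is an assumption inferred from the title, not quoted from the paper.

```python
# Sketch of a natural-logarithm-rectified activation, assuming the form
# NLReLU(x) = ln(beta * max(0, x) + 1). The beta parameter and its default
# are assumptions for illustration.
import torch

def nl_relu(x: torch.Tensor, beta: float = 1.0) -> torch.Tensor:
    # log1p(z) computes ln(1 + z) and is numerically stable near zero.
    return torch.log1p(beta * torch.relu(x))

x = torch.tensor([-2.0, 0.0, 1.0, 5.0])
print(nl_relu(x))  # tensor([0.0000, 0.0000, 0.6931, 1.7918])
```

Like ReLU, this form is zero for negative inputs, but the logarithm compresses large positive activations instead of letting them grow linearly.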
