Search Results for author: Shanbo Chu

Found 4 papers, 1 paper with code

Cold-Start and Interpretability: Turning Regular Expressions into Trainable Recurrent Neural Networks

no code implementations EMNLP 2020 Chengyue Jiang, Yinggong Zhao, Shanbo Chu, Libin Shen, Kewei Tu

On the other hand, symbolic rules such as regular expressions are interpretable, require no training, and often achieve decent accuracy; but rules cannot benefit from labeled data when available and hence underperform neural networks in rich-resource scenarios.

Text Classification
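A minimal sketch of the idea behind the paper above, under assumed details: a regular-expression rule is compiled into a finite automaton whose transition tensor is run as a simple recurrence, so it behaves like an (initially untrained) RNN that could later be fine-tuned. The vocabulary, the "refund" rule, and all function names here are hypothetical, not the authors' implementation.

```python
# Sketch only: a rule automaton encoded as a transition tensor and run as a
# linear recurrence over the sentence, mimicking an RNN whose parameters
# could be relaxed and trained once labeled data is available.
import numpy as np

# Hypothetical toy setup: tiny vocabulary and a rule that fires when the
# word "refund" appears anywhere in the sentence.
vocab = {"i": 0, "want": 1, "a": 2, "refund": 3, "please": 4}
V, S = len(vocab), 2                   # vocabulary size, number of automaton states

# T[w] is the state-transition matrix applied when word w is read.
T = np.zeros((V, S, S))
T[:, 0, 0] = 1.0                       # by default, stay in the start state
T[:, 1, 1] = 1.0                       # the accept state is absorbing
T[vocab["refund"], 0, :] = [0.0, 1.0]  # reading "refund" moves start -> accept

def rule_rnn_score(words):
    """Run the automaton as the recurrence h_t = h_{t-1} @ T[w_t]."""
    h = np.array([1.0, 0.0])           # indicator of the start state
    for w in words:
        h = h @ T[vocab[w]]
    return h[1]                        # probability mass on the accept state

print(rule_rnn_score("i want a refund please".split()))  # 1.0 (rule fires)
print(rule_rnn_score("i want a please".split()))         # 0.0
# Because T is just a parameter tensor, it can be treated as trainable weights
# and adjusted with gradient descent in a rich-resource setting.
```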

Learning Numeral Embedding

1 code implementation Findings of the Association for Computational Linguistics 2020 Chengyue Jiang, Zhonglin Nian, Kaihao Guo, Shanbo Chu, Yinggong Zhao, Libin Shen, Kewei Tu

Numeral embeddings represented in this manner can be plugged into existing word embedding learning approaches such as skip-gram for training.

Word Similarity
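A minimal sketch of one way numerals can get embeddings that plug into a skip-gram style learner, as described in the entry above: a number's vector is a similarity-weighted mixture of a few trainable prototype vectors. The prototype values, dimensions, and function names are illustrative assumptions, not the paper's implementation.

```python
# Sketch only: prototype-style numeral embeddings that can stand in for the
# usual embedding lookup whenever a skip-gram center/context token is a number.
import numpy as np

rng = np.random.default_rng(0)
dim = 8
prototypes = np.array([0.0, 1.0, 10.0, 100.0, 1000.0])  # hypothetical prototype numerals
proto_vecs = rng.normal(size=(len(prototypes), dim))     # trainable prototype embeddings

def numeral_embedding(x, temperature=1.0):
    """Embed numeral x as a distance-weighted mixture of prototype vectors."""
    # compare magnitudes in log space so 10 vs 100 and 100 vs 1000 look alike
    d = np.abs(np.log1p(np.abs(x)) - np.log1p(np.abs(prototypes)))
    w = np.exp(-d / temperature)
    w /= w.sum()
    return w @ proto_vecs

cos = lambda a, b: a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
vec_7, vec_9, vec_900 = map(numeral_embedding, (7, 9, 900))
# nearby numerals should receive more similar vectors than distant ones
print(cos(vec_7, vec_9), cos(vec_7, vec_900))
# In skip-gram training, this function would replace the embedding lookup for
# numeral tokens, while ordinary words keep their usual embedding rows.
```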

Learning Numeral Embeddings

no code implementations 28 Dec 2019 Chengyue Jiang, Zhonglin Nian, Kaihao Guo, Shanbo Chu, Yinggong Zhao, Libin Shen, Kewei Tu

Numeral embeddings represented in this manner can be plugged into existing word embedding learning approaches such as skip-gram for training.

Word Similarity

Latent Dependency Forest Models

no code implementations 8 Sep 2016 Shanbo Chu, Yong Jiang, Kewei Tu

Probabilistic modeling is one of the foundations of modern machine learning and artificial intelligence.
