no code implementations • 21 Apr 2015 • Lizhen Qu, Gabriela Ferraro, Liyuan Zhou, Weiwei Hou, Nathan Schneider, Timothy Baldwin
Word embeddings -- distributed word representations that can be learned from unlabelled data -- have been shown to have high utility in many natural language processing applications.
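The core idea above — words represented as dense vectors so that semantically related words lie close together — can be sketched with toy vectors. This is a minimal, hypothetical illustration (4-dimensional hand-written vectors), not the embeddings or training procedure used in the paper; real embeddings such as word2vec or GloVe are typically 100–300 dimensional and learned from large unlabelled corpora.

```python
import numpy as np

# Toy "pretrained" embeddings (hypothetical vectors for illustration only).
embeddings = {
    "king":  np.array([0.8, 0.1, 0.7, 0.2]),
    "queen": np.array([0.7, 0.2, 0.8, 0.3]),
    "apple": np.array([0.1, 0.9, 0.0, 0.8]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

sim_royal = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_fruit = cosine_similarity(embeddings["king"], embeddings["apple"])
print(sim_royal > sim_fruit)  # related words are closer in the vector space
```

Downstream NLP models consume these vectors as input features, which is what gives embeddings their utility across many applications.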
no code implementations • EMNLP 2016 • Lizhen Qu, Gabriela Ferraro, Liyuan Zhou, Weiwei Hou, Timothy Baldwin
In named entity recognition, we often lack a large in-domain training corpus or a knowledge base with adequate coverage to train a model directly.

no code implementations • 23 Sep 2020 • Weiwei Hou, Hanna Suominen, Piotr Koniusz, Sabrina Caldwell, Tom Gedeon
Sentence compression is a Natural Language Processing (NLP) task that aims to shorten sentences while preserving their key information.
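The simplest form of sentence compression is deletion-based: remove tokens judged uninformative and keep the rest. The sketch below is a hypothetical, rule-based illustration using a fixed stopword list; it is not the method of the paper, where systems learn which tokens to delete.

```python
# Hypothetical deletion-based compression: drop function words, keep content words.
STOPWORDS = {"a", "an", "the", "is", "are", "was", "were", "that", "which",
             "of", "to", "in", "on", "at", "and", "very", "really"}

def compress(sentence: str) -> str:
    """Return the sentence with stopword tokens deleted (toy heuristic)."""
    tokens = sentence.split()
    kept = [t for t in tokens if t.lower().strip(".,") not in STOPWORDS]
    return " ".join(kept)

print(compress("The cat was sitting on the very old mat."))
# keeps only the content-bearing tokens
```

Learned approaches replace the fixed list with a per-token keep/delete decision trained on compression corpora, but the input/output contract is the same.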