Named entity recognition is a challenging task that has traditionally required large amounts of knowledge in the form of feature engineering and lexicons to achieve high performance.
One direction is to define a richer composite representation for questions and answers by combining a convolutional neural network with the basic recurrent framework.
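One plausible instantiation of such a composite encoder (a minimal sketch of the general idea, assuming PyTorch; the layer choices, dimensions, and names here are illustrative assumptions, not the cited work's exact architecture): a convolutional layer over the hidden states of a bidirectional LSTM, max-pooled over time into a fixed-size vector that can encode either a question or a candidate answer.

```python
import torch
import torch.nn as nn

class ComposedEncoder(nn.Module):
    """Sketch: CNN over BiLSTM hidden states, max-pooled over time."""
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_filters, kernel_size=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        self.conv = nn.Conv1d(2 * hidden_dim, num_filters, kernel_size, padding=1)

    def forward(self, token_ids):                     # (batch, seq_len)
        h, _ = self.lstm(self.embed(token_ids))       # (batch, seq_len, 2*hidden)
        c = torch.relu(self.conv(h.transpose(1, 2)))  # (batch, filters, seq_len)
        return c.max(dim=2).values                    # (batch, filters)

# Usage: encode a toy question and answer, compare with cosine similarity.
enc = ComposedEncoder(vocab_size=1000, embed_dim=32, hidden_dim=64, num_filters=128)
q = enc(torch.randint(0, 1000, (1, 12)))
a = enc(torch.randint(0, 1000, (1, 30)))
print(torch.cosine_similarity(q, a).item())
```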
Recurrent neural networks (RNNs), such as long short-term memory networks (LSTMs), serve as a fundamental building block for many sequence learning tasks, including machine translation, language modeling, and question answering.
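For concreteness, here is a minimal sketch of that building block (assuming PyTorch; the class name and dimensions are illustrative, not taken from any of the papers above): an embedding layer feeds an LSTM whose per-step hidden states are projected to per-token output scores, the common skeleton behind tagging and language-modeling setups.

```python
import torch
import torch.nn as nn

class LSTMSequenceModel(nn.Module):
    """Embed tokens, run an LSTM, project each hidden state to scores."""
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_outputs):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.proj = nn.Linear(hidden_dim, num_outputs)

    def forward(self, token_ids):     # (batch, seq_len)
        x = self.embed(token_ids)     # (batch, seq_len, embed_dim)
        h, _ = self.lstm(x)           # (batch, seq_len, hidden_dim)
        return self.proj(h)           # (batch, seq_len, num_outputs)

# Usage: score a toy batch of 3 sequences of length 5.
model = LSTMSequenceModel(vocab_size=1000, embed_dim=32, hidden_dim=64, num_outputs=10)
scores = model(torch.randint(0, 1000, (3, 5)))
print(scores.shape)  # torch.Size([3, 5, 10])
```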
State-of-the-art sequence labeling systems traditionally require large amounts of task-specific knowledge in the form of hand-crafted features and data pre-processing.
We investigate a lattice-structured LSTM model for Chinese NER, which encodes a sequence of input characters as well as all potential words that match a lexicon.
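The lattice construction begins with plain lexicon matching: every character span that forms a lexicon word adds an extra path alongside the character-level chain. Below is a small Python sketch of that matching step only (the lexicon is a toy example; the gated lattice cells that consume these spans are not reproduced here).

```python
def lexicon_matches(chars, lexicon):
    """Enumerate every (start, end, word) span of the character sequence
    that appears in the lexicon -- the word paths a lattice model would
    add on top of the character sequence."""
    max_len = max(len(w) for w in lexicon)
    spans = []
    for i in range(len(chars)):
        for j in range(i + 1, min(i + max_len, len(chars)) + 1):
            word = "".join(chars[i:j])
            if word in lexicon:
                spans.append((i, j, word))
    return spans

# Toy example with a tiny illustrative lexicon.
lexicon = {"南京", "南京市", "市长", "长江", "长江大桥", "大桥"}
print(lexicon_matches(list("南京市长江大桥"), lexicon))
# [(0, 2, '南京'), (0, 3, '南京市'), (2, 4, '市长'), (3, 5, '长江'), ...]
```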
In this paper we show that reporting a single performance score is insufficient to compare non-deterministic approaches.
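In that spirit, a hedged sketch of seed-averaged reporting (`train_and_eval` is a hypothetical stand-in for a full training-plus-evaluation run; only the reporting logic is shown):

```python
import statistics

def report_over_seeds(train_and_eval, seeds):
    """Run a non-deterministic training procedure once per seed and
    report the score distribution rather than a single number."""
    scores = [train_and_eval(seed) for seed in seeds]
    return {
        "mean": statistics.mean(scores),
        "stdev": statistics.stdev(scores),
        "min": min(scores),
        "max": max(scores),
    }

# Usage with a simulated scoring function (replace with real training).
import random
def fake_train_and_eval(seed):
    random.seed(seed)
    return 90.0 + random.gauss(0, 0.4)  # simulated F1 with seed variance

print(report_over_seeds(fake_train_and_eval, seeds=range(10)))
```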