Recurrent neural networks (RNNs), such as long short-term memory networks (LSTMs), serve as a fundamental building block for many sequence learning tasks, including machine translation, language modeling, and question answering.
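As a concrete illustration of this recurrent building block, the following is a minimal NumPy sketch of a single LSTM cell stepped over a short input sequence. The shapes, gate ordering (input, forget, output, candidate), and random toy data are assumptions of this sketch, not a reference implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step. W: (4H, D), U: (4H, H), b: (4H,)."""
    H = h.shape[0]
    z = W @ x + U @ h + b
    i = sigmoid(z[0*H:1*H])      # input gate
    f = sigmoid(z[1*H:2*H])      # forget gate
    o = sigmoid(z[2*H:3*H])      # output gate
    g = np.tanh(z[3*H:4*H])      # candidate cell state
    c_new = f * c + i * g        # update the memory cell
    h_new = o * np.tanh(c_new)   # gated hidden state
    return h_new, c_new

rng = np.random.default_rng(0)
D, H, T = 5, 4, 6                # input size, hidden size, sequence length
W = rng.normal(0, 0.1, (4 * H, D))
U = rng.normal(0, 0.1, (4 * H, H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.normal(size=(T, D)):   # run the cell over the sequence
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)                      # hidden state summarizing the sequence
```

The final hidden state `h` is the fixed-size summary that downstream tasks such as translation or language modeling consume.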
Named entity recognition is a challenging task that has traditionally required large amounts of knowledge in the form of feature engineering and lexicons to achieve high performance.
One direction is to build a richer, composite representation of questions and answers by combining a convolutional neural network with the basic framework.
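As a sketch of what such a convolutional representation might look like, the following NumPy fragment slides filters over token embeddings and max-pools each feature map into a fixed-size vector, then scores a question-answer pair by cosine similarity. The filter width, embedding size, and scoring function are illustrative assumptions, not the method of any particular paper.

```python
import numpy as np

def conv_max_pool(E, filters):
    """E: (T, D) token embeddings; filters: (K, w, D).
    Returns a fixed-size (K,) vector via max-over-time pooling."""
    T, D = E.shape
    K, w, _ = filters.shape
    feats = np.full(K, -np.inf)
    for k in range(K):
        for t in range(T - w + 1):  # slide filter k over time
            feats[k] = max(feats[k], np.tanh(np.sum(filters[k] * E[t:t + w])))
    return feats

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(1)
D, K, w = 8, 6, 3                      # embedding dim, num filters, filter width
filters = rng.normal(0, 0.5, (K, w, D))
question = rng.normal(size=(5, D))     # 5 question tokens (toy data)
answer = rng.normal(size=(7, D))       # 7 answer tokens (toy data)
q_vec = conv_max_pool(question, filters)
a_vec = conv_max_pool(answer, filters)
score = cosine(q_vec, a_vec)           # match score in [-1, 1]
print(q_vec.shape, round(score, 3))
```

Max-over-time pooling makes the representation length-independent, so questions and answers of different lengths map into the same vector space.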
Named Entity Recognition (NER) is one of the most common tasks in natural language processing.
State-of-the-art sequence labeling systems traditionally require large amounts of task-specific knowledge in the form of hand-crafted features and data pre-processing.
The model can also exploit sentence-level tag information thanks to a CRF layer.
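To show how a CRF layer injects sentence-level tag information at decoding time, here is a minimal NumPy Viterbi decoder over per-token emission scores and a tag-transition matrix. The toy scores are made up for illustration and stand in for what a trained BiLSTM-CRF would produce.

```python
import numpy as np

def viterbi(emissions, transitions):
    """emissions: (T, K) per-token tag scores; transitions: (K, K) score
    of moving from tag i to tag j. Returns the highest-scoring tag path."""
    T, K = emissions.shape
    dp = emissions[0].copy()            # best score ending in each tag
    back = np.zeros((T, K), dtype=int)  # backpointers
    for t in range(1, T):
        scores = dp[:, None] + transitions + emissions[t][None, :]
        back[t] = scores.argmax(axis=0)
        dp = scores.max(axis=0)
    path = [int(dp.argmax())]
    for t in range(T - 1, 0, -1):       # follow backpointers
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# With neutral transitions, decoding simply follows the emissions.
em = np.array([[2., 0.], [0., 2.], [2., 0.]])
print(viterbi(em, np.zeros((2, 2))))    # -> [0, 1, 0]

# A strong penalty on the 0 -> 0 transition overrides the local
# preference for tag 0, illustrating a sentence-level constraint.
trans = np.array([[-10., 0.], [0., 0.]])
em2 = np.array([[2., 0.], [1., 0.]])
print(viterbi(em2, trans))              # -> [0, 1]
```

The second example is the point of the CRF: even though both tokens locally prefer tag 0, the transition score rules out the 0 → 0 sequence, which is how a BiLSTM-CRF enforces constraints such as "I-PER cannot follow B-ORG" in NER tag schemes.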