Bidirectional LSTM

A Bidirectional LSTM (biLSTM) is a sequence-processing model that consists of two LSTMs: one processes the input in the forward direction and the other in the backward direction. Combining both directions increases the amount of information available to the network, giving each position context from both sides of the sequence (e.g. knowing which words immediately precede and follow a word in a sentence).

Image Source: Modelling Radiological Language with Bidirectional Long Short-Term Memory Networks, Cornegruta et al.
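
To make the two-directional computation concrete, below is a minimal sketch of a biLSTM text classifier in PyTorch. The class name BiLSTMClassifier, the layer sizes, and the dummy batch are illustrative assumptions, not taken from this page or from any particular paper.

# Minimal sketch of a biLSTM text classifier in PyTorch.
# All sizes (vocab_size, embed_dim, hidden_dim, num_classes) are
# placeholder assumptions for illustration.
import torch
import torch.nn as nn

class BiLSTMClassifier(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=128, hidden_dim=256, num_classes=5):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # bidirectional=True runs one LSTM left-to-right and a second LSTM
        # right-to-left over the same sequence.
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        # Forward and backward states are concatenated, so the classifier
        # layer sees 2 * hidden_dim features.
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) tensor of token indices
        embedded = self.embedding(token_ids)        # (batch, seq_len, embed_dim)
        outputs, (h_n, _) = self.lstm(embedded)     # outputs: (batch, seq_len, 2 * hidden_dim)
        # h_n holds the final hidden state of each direction:
        # h_n[-2] is the forward pass, h_n[-1] is the backward pass.
        final = torch.cat([h_n[-2], h_n[-1]], dim=1)
        return self.fc(final)

# Usage with a dummy batch: 4 sequences of 20 token ids each.
model = BiLSTMClassifier()
logits = model(torch.randint(0, 10000, (4, 20)))    # shape: (4, 5)

The last forward and backward hidden states are concatenated before classification, which is why the output layer takes an input of size 2 * hidden_dim.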

Tasks

Task                              Papers   Share
Language Modelling                    41   4.82%
Sentiment Analysis                    40   4.71%
Named Entity Recognition (NER)        36   4.24%
General Classification                33   3.88%
NER                                   32   3.76%
Text Classification                   29   3.41%
Classification                        23   2.71%
Question Answering                    20   2.35%
Dependency Parsing                    18   2.12%

Components

Component   Type
LSTM        Recurrent Neural Networks