A Bidirectional LSTM (BiLSTM) is a sequence-processing model that consists of two LSTMs: one processing the input in the forward direction, the other in the backward direction. BiLSTMs effectively increase the amount of information available to the network, giving it richer context at each step (e.g., knowing which words immediately precede and follow a given word in a sentence).
Image Source: Modelling Radiological Language with Bidirectional Long Short-Term Memory Networks, Cornegruta et al.
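The sketch below shows one common way to build a BiLSTM, using PyTorch's `bidirectional=True` flag on `nn.LSTM`; the vocabulary size, embedding width, and hidden size are illustrative assumptions, not values taken from any particular paper.

```python
import torch
import torch.nn as nn

# Assumed, illustrative dimensions.
vocab_size, embed_dim, hidden_dim = 10_000, 128, 256

embedding = nn.Embedding(vocab_size, embed_dim)
bilstm = nn.LSTM(
    input_size=embed_dim,
    hidden_size=hidden_dim,
    batch_first=True,
    bidirectional=True,  # adds a second LSTM run over the reversed sequence
)

tokens = torch.randint(0, vocab_size, (4, 20))   # (batch, seq_len)
outputs, (h_n, c_n) = bilstm(embedding(tokens))

# Each timestep concatenates the forward and backward hidden states,
# so every position sees both preceding and following context.
print(outputs.shape)  # torch.Size([4, 20, 512]) == (batch, seq_len, 2 * hidden_dim)
```

Because the output at each position is the concatenation of both directions' hidden states, a downstream layer (e.g., a tagger or classifier) should expect an input width of `2 * hidden_dim` rather than `hidden_dim`.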
| Task | Papers | Share |
|---|---|---|
| Sentence | 69 | 7.23% |
| Sentiment Analysis | 42 | 4.40% |
| Named Entity Recognition (NER) | 39 | 4.08% |
| Language Modelling | 34 | 3.56% |
| NER | 34 | 3.56% |
| Text Classification | 31 | 3.25% |
| General Classification | 28 | 2.93% |
| Classification | 25 | 2.62% |
| Question Answering | 18 | 1.88% |