A Bidirectional LSTM, or biLSTM, is a sequence processing model that consists of two LSTMs: one taking the input in the forward direction, and the other in the backward direction. The hidden states of the two LSTMs are typically concatenated at each time step, so the representation of every position encodes both left and right context. This effectively increases the amount of information available to the network (e.g. knowing which words immediately precede and follow a word in a sentence).
Image source: *Modelling Radiological Language with Bidirectional Long Short-Term Memory Networks*, Cornegruta et al.
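Below is a minimal sketch of this idea using PyTorch's `nn.LSTM` with `bidirectional=True`; the class name and hyperparameters are illustrative, not taken from the paper above.

```python
import torch
import torch.nn as nn

class BiLSTMEncoder(nn.Module):
    """Illustrative biLSTM encoder; sizes are placeholder assumptions."""

    def __init__(self, vocab_size=10000, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # bidirectional=True runs one LSTM left-to-right and a second
        # LSTM right-to-left over the same input sequence.
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer tensor
        embedded = self.embedding(token_ids)
        # outputs: (batch, seq_len, 2 * hidden_dim) -- the forward and
        # backward hidden states are concatenated at each time step, so
        # every position sees both its left and right context.
        outputs, (h_n, c_n) = self.lstm(embedded)
        return outputs

# Usage example: encode a batch of two 5-token sequences.
encoder = BiLSTMEncoder()
tokens = torch.randint(0, 10000, (2, 5))
context = encoder(tokens)
print(context.shape)  # torch.Size([2, 5, 512])
```

The concatenated per-position outputs are what downstream layers (e.g. a tagging head for NER) consume; tasks that only need a whole-sequence summary instead pool over these outputs or use the final hidden states.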
Common tasks for papers using biLSTMs, by share of papers:

Task | Papers | Share
---|---|---
Sentence | 62 | 5.99% |
Sentiment Analysis | 44 | 4.25% |
Named Entity Recognition (NER) | 38 | 3.67% |
NER | 34 | 3.29% |
Text Classification | 30 | 2.90% |
Language Modelling | 29 | 2.80% |
Language Modeling | 26 | 2.51% |
Classification | 22 | 2.13% |
General Classification | 22 | 2.13% |