A CNN BiLSTM is a hybrid architecture combining a bidirectional LSTM with a CNN. In the original formulation, applied to named entity recognition, it learns both character-level and word-level features. The CNN component induces the character-level features: for each word, the model applies a convolution followed by a max-pooling layer to extract a new feature vector from the per-character feature vectors, such as character embeddings and (optionally) character type features.
Source: Named Entity Recognition with Bidirectional LSTM-CNNs
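The character-level CNN can be sketched as a convolution over each word's character embeddings followed by max pooling across the character positions. The snippet below is a minimal illustration in PyTorch, not the authors' code; the embedding dimension, filter count, and kernel width are illustrative assumptions rather than the paper's hyperparameters.

```python
import torch
import torch.nn as nn


class CharCNN(nn.Module):
    """Character-level feature extractor: convolution + max pooling over
    per-character embeddings, producing one feature vector per word."""

    def __init__(self, char_vocab_size=100, char_emb_dim=25,
                 num_filters=30, kernel_size=3, pad_idx=0):
        super().__init__()
        self.char_emb = nn.Embedding(char_vocab_size, char_emb_dim,
                                     padding_idx=pad_idx)
        # Convolve along the character dimension of each word.
        self.conv = nn.Conv1d(char_emb_dim, num_filters,
                              kernel_size, padding=kernel_size // 2)

    def forward(self, char_ids):
        # char_ids: (batch, seq_len, max_word_len) integer character indices
        b, s, w = char_ids.shape
        x = self.char_emb(char_ids)               # (b, s, w, emb)
        x = x.view(b * s, w, -1).transpose(1, 2)  # (b*s, emb, w)
        x = torch.relu(self.conv(x))              # (b*s, filters, w)
        x = x.max(dim=2).values                   # max pool over characters
        return x.view(b, s, -1)                   # per-word character features


if __name__ == "__main__":
    model = CharCNN()
    chars = torch.randint(0, 100, (2, 7, 12))  # 2 sentences, 7 words, 12 chars
    print(model(chars).shape)                  # torch.Size([2, 7, 30])
```

The per-word character features produced this way would typically be concatenated with word embeddings and fed to the bidirectional LSTM.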
| Task | Papers | Share |
|---|---|---|
| Named Entity Recognition (NER) | 6 | 13.04% |
| Dependency Parsing | 2 | 4.35% |
| Feature Engineering | 2 | 4.35% |
| Graph Representation Learning | 1 | 2.17% |
| Link Prediction | 1 | 2.17% |
| Language Modelling | 1 | 2.17% |
| Semantic Textual Similarity | 1 | 2.17% |
| Sentence Embedding | 1 | 2.17% |
| Transportation Mode Detection | 1 | 2.17% |