Neural Metaphor Detection with Visibility Embeddings
We present new results for metaphor detection framed as sequence labeling, using the recently developed Visibility Embeddings. We show that concatenating these embeddings to the input of a BiLSTM yields consistent and significant improvements at almost no additional cost, and we report further gains when visibility embeddings are combined with BERT.
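As a concrete illustration of the first claim, the sketch below concatenates per-token visibility embeddings with word embeddings at the input of a BiLSTM sequence labeler. This is a minimal PyTorch sketch, not the authors' implementation: the class name VisibilityBiLSTMTagger and the dimensions word_dim, vis_dim, and hidden_dim are hypothetical, and the visibility embeddings are assumed to be precomputed per-token vectors.

import torch
import torch.nn as nn

class VisibilityBiLSTMTagger(nn.Module):
    # Hypothetical sketch: concatenate precomputed per-token visibility
    # embeddings with word embeddings, then run a BiLSTM and a per-token
    # binary (metaphor / literal) classifier.
    def __init__(self, vocab_size, word_dim=300, vis_dim=50, hidden_dim=256):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        self.bilstm = nn.LSTM(word_dim + vis_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, 2)

    def forward(self, token_ids, vis_emb):
        # token_ids: (batch, seq_len); vis_emb: (batch, seq_len, vis_dim)
        x = torch.cat([self.word_emb(token_ids), vis_emb], dim=-1)
        h, _ = self.bilstm(x)        # (batch, seq_len, 2 * hidden_dim)
        return self.classifier(h)    # per-token logits

# Example call with random inputs:
model = VisibilityBiLSTMTagger(vocab_size=30000)
logits = model(torch.randint(0, 30000, (4, 20)), torch.randn(4, 20, 50))  # (4, 20, 2)

Since the only extra parameters come from a slightly wider BiLSTM input layer, this design is consistent with the abstract's "almost no cost" framing.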
Methods
Adam, Attention Dropout, BERT, BiLSTM, Dense Connections, Dropout, GELU, Layer Normalization, Linear Layer, Linear Warmup With Linear Decay, LSTM, Multi-Head Attention, Residual Connection, Scaled Dot-Product Attention, Sigmoid Activation, Softmax, Tanh Activation, Weight Decay, WordPiece
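The abstract's second claim combines visibility embeddings with BERT, but it does not say where the two signals meet. One plausible instance, sketched below under the assumption that the visibility embeddings are concatenated with BERT's final hidden states (the class name BertVisibilityTagger and vis_dim are hypothetical), uses the Hugging Face transformers API:

import torch
import torch.nn as nn
from transformers import AutoModel

class BertVisibilityTagger(nn.Module):
    # Hypothetical sketch: concatenate per-token visibility embeddings
    # with BERT's final hidden states before a token-level classifier.
    def __init__(self, vis_dim=50, model_name="bert-base-uncased"):
        super().__init__()
        self.bert = AutoModel.from_pretrained(model_name)
        self.classifier = nn.Linear(self.bert.config.hidden_size + vis_dim, 2)

    def forward(self, input_ids, attention_mask, vis_emb):
        # vis_emb is assumed to be pre-aligned to WordPiece positions,
        # e.g. by repeating each word's vector across its subword pieces.
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        h = torch.cat([out.last_hidden_state, vis_emb], dim=-1)
        return self.classifier(h)  # per-WordPiece logits

Concatenating at the output (rather than adding to the input embeddings) leaves the pretrained BERT weights untouched; whether the paper fuses the signals earlier is not stated in the abstract.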