CogALex-VI Shared Task: Transrelation - A Robust Multilingual Language Model for Multilingual Relation Identification

We describe our submission to the CogALex-VI shared task on the identification of multilingual paradigmatic relations, building on XLM-RoBERTa (XLM-R), a robustly optimized multilingual BERT model. Despite several experiments with data augmentation, data addition, and ensemble methods with a Siamese Triple Net, Transrelation, the XLM-R model with a linear classifier adapted to this specific task, performed best in testing and achieved the best results in the final evaluation of the shared task, even for a previously unseen language...
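The winning system is described as XLM-R with a linear classifier on top. A minimal sketch of such a classification head is shown below; a random vector stands in for the encoder's pooled sentence-pair embedding, and the label set, hidden size, and function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Illustrative label set: CogALex-VI covers paradigmatic relations
# (e.g. synonymy, antonymy, hypernymy) plus a "random" class.
LABELS = ["SYN", "ANT", "HYP", "RANDOM"]

rng = np.random.default_rng(0)
HIDDEN = 16  # stand-in for the encoder's hidden size (768 in XLM-R base)

# Linear classifier head: logits = h @ W + b, probabilities via softmax.
W = rng.normal(scale=0.1, size=(HIDDEN, len(LABELS)))
b = np.zeros(len(LABELS))

def classify(h):
    """Map a sentence-pair embedding h to a relation label and probabilities."""
    logits = h @ W + b
    exp = np.exp(logits - logits.max())  # stabilized softmax
    probs = exp / exp.sum()
    return LABELS[int(probs.argmax())], probs

# In the real system, h would come from XLM-R; here it is random.
h = rng.normal(size=HIDDEN)
label, probs = classify(h)
```

In practice the head and the XLM-R encoder would be fine-tuned jointly with a cross-entropy loss; this sketch only shows the shape of the final classification step.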



Methods used in the Paper


Method: Type
WordPiece: Subword Segmentation
Softmax: Output Functions
GELU: Activation Functions
Linear Warmup With Linear Decay: Learning Rate Schedules
Adam: Stochastic Optimization
Layer Normalization: Normalization
Scaled Dot-Product Attention: Attention Mechanisms
Dropout: Regularization
Weight Decay: Regularization
Dense Connections: Feedforward Networks
Multi-Head Attention: Attention Modules
Attention Dropout: Regularization
Residual Connection: Skip Connections
BERT: Language Models