Search Results for author: Àlex R. Atrio

Found 3 papers, 0 papers with code

The IICT-Yverdon System for the WMT 2021 Unsupervised MT and Very Low Resource Supervised MT Task

no code implementations • WMT (EMNLP) 2021 • Àlex R. Atrio, Gabriel Luthier, Axel Fahy, Giorgos Vernikos, Andrei Popescu-Belis, Ljiljana Dolamic

We present the application of this system to the 2021 task for low-resource supervised translation between Upper Sorbian (HSB) and German, in both directions.

Translation

Small Batch Sizes Improve Training of Low-Resource Neural MT

no code implementations • ICON 2021 • Àlex R. Atrio, Andrei Popescu-Belis

We study the role of an essential hyper-parameter that governs the training of Transformers for neural machine translation in a low-resource setting: the batch size.

Machine Translation • Translation
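
The batch size studied in the paper above is simply the number of sentence pairs grouped into each optimizer step. The following is a minimal sketch, not the authors' code, of where that hyper-parameter sits in a generic PyTorch Transformer training loop; all module names, shapes, and values are illustrative assumptions, and target shifting/masking is omitted for brevity.

```python
# Illustrative sketch only: shows where BATCH_SIZE enters Transformer NMT training.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

VOCAB, SEQ_LEN, D_MODEL = 100, 32, 64
src = torch.randint(0, VOCAB, (1000, SEQ_LEN))   # toy source token ids
tgt = torch.randint(0, VOCAB, (1000, SEQ_LEN))   # toy target token ids

BATCH_SIZE = 16  # the hyper-parameter under study: sentence pairs per optimizer step
loader = DataLoader(TensorDataset(src, tgt), batch_size=BATCH_SIZE, shuffle=True)

embed = nn.Embedding(VOCAB, D_MODEL)
transformer = nn.Transformer(d_model=D_MODEL, nhead=4, num_encoder_layers=2,
                             num_decoder_layers=2, batch_first=True)
proj = nn.Linear(D_MODEL, VOCAB)
params = list(embed.parameters()) + list(transformer.parameters()) + list(proj.parameters())
opt = torch.optim.Adam(params, lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

for src_batch, tgt_batch in loader:
    # One optimizer step per batch: a smaller BATCH_SIZE yields noisier,
    # more frequent gradient updates.
    out = transformer(embed(src_batch), embed(tgt_batch))
    loss = loss_fn(proj(out).reshape(-1, VOCAB), tgt_batch.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
    break  # one step is enough for the sketch
```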

On the Effect of Word Order on Cross-lingual Sentiment Analysis

no code implementations • 13 Jun 2019 • Àlex R. Atrio, Toni Badia, Jeremy Barnes

Current state-of-the-art models for sentiment analysis make use of word order either explicitly, by pre-training on a language modeling objective, or implicitly, by using recurrent neural networks (RNNs) or convolutional neural networks (CNNs).

Cross-Lingual Sentiment Classification • General Classification • +4
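
To make the distinction in the abstract above concrete, here is a minimal sketch, not taken from the paper, contrasting an order-insensitive bag-of-words average with an order-sensitive RNN encoder over the same embeddings; all module names, sizes, and token ids are illustrative assumptions.

```python
# Illustrative sketch only: word order is ignored by averaging, used by the RNN.
import torch
from torch import nn

VOCAB, D = 100, 16
embed = nn.Embedding(VOCAB, D)
rnn = nn.GRU(D, D, batch_first=True)

sentence = torch.tensor([[5, 7, 9, 3]])   # toy token ids
shuffled = torch.tensor([[3, 9, 5, 7]])   # same tokens, different order

# Bag-of-words average: identical for both orders, so word order plays no role.
bow = embed(sentence).mean(dim=1)
bow_shuf = embed(shuffled).mean(dim=1)
print(torch.allclose(bow, bow_shuf))      # True

# RNN final hidden state: generally differs, so word order is used implicitly.
_, h = rnn(embed(sentence))
_, h_shuf = rnn(embed(shuffled))
print(torch.allclose(h, h_shuf))          # usually False
```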
