Strong Baselines for Neural Semi-supervised Learning under Domain Shift

ACL 2018 · Sebastian Ruder, Barbara Plank

Novel neural models have been proposed in recent years for learning under domain shift. Most models, however, only evaluate on a single task, on proprietary datasets, or compare to weak baselines, which makes comparison of models difficult. In this paper, we re-evaluate classic general-purpose bootstrapping approaches in the context of neural networks under domain shifts vs. recent neural approaches and propose a novel multi-task tri-training method that reduces the time and space complexity of classic tri-training. Extensive experiments on two benchmarks are negative: while our novel method establishes a new state-of-the-art for sentiment analysis, it does not fare consistently the best. More importantly, we arrive at the somewhat surprising conclusion that classic tri-training, with some additions, outperforms the state of the art. We conclude that classic approaches constitute an important and strong baseline.
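The classic tri-training procedure the paper re-evaluates (Zhou & Li, 2005) trains three learners on bootstrap samples of the labeled data, then iteratively pseudo-labels unlabeled points for each learner whenever the other two agree. A minimal sketch of that loop, using a toy nearest-centroid base learner in place of the paper's neural models (all names and the simplified stopping criterion are illustrative, not from the authors' code):

```python
import numpy as np

class CentroidClassifier:
    """Toy base learner: predicts the class of the nearest class centroid."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.stack([X[y == c].mean(axis=0) for c in self.classes_])
        return self
    def predict(self, X):
        # Squared distance from each point to each class centroid.
        d = ((X[:, None, :] - self.centroids_[None, :, :]) ** 2).sum(-1)
        return self.classes_[d.argmin(axis=1)]

def tri_train(X_l, y_l, X_u, rounds=5, seed=0):
    rng = np.random.default_rng(seed)
    n = len(X_l)
    # Each learner starts from its own bootstrap sample of the labeled set.
    models = []
    for _ in range(3):
        idx = rng.integers(0, n, n)
        models.append(CentroidClassifier().fit(X_l[idx], y_l[idx]))
    for _ in range(rounds):
        preds = [m.predict(X_u) for m in models]
        for i in range(3):
            j, k = [t for t in range(3) if t != i]
            # Pseudo-label unlabeled points where the other two learners agree,
            # and retrain learner i on labeled + pseudo-labeled data.
            agree = preds[j] == preds[k]
            if agree.any():
                X_aug = np.vstack([X_l, X_u[agree]])
                y_aug = np.concatenate([y_l, preds[j][agree]])
                models[i] = CentroidClassifier().fit(X_aug, y_aug)
    return models

def predict_vote(models, X):
    # Final prediction: majority vote over the three learners.
    P = np.stack([m.predict(X) for m in models])
    return np.array([np.bincount(col).argmax() for col in P.T])
```

The paper's multi-task tri-training variant reduces the time and space cost of this loop by sharing parameters across the three learners; the sketch above shows only the classic, fully independent version.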

Results

Task: Sentiment Analysis
Dataset: Multi-Domain Sentiment Dataset
Model: Multi-task tri-training

Metric        Value   Global Rank
DVD           78.14   # 3
Books         74.86   # 3
Electronics   81.45   # 2
Kitchen       82.14   # 5
Average       79.15   # 3
