Global Autoregressive Models for Data-Efficient Sequence Learning

CONLL 2019 · Tetiana Parshakova, Jean-Marc Andreoli, Marc Dymetman

Standard autoregressive seq2seq models are easily trained by max-likelihood, but tend to show poor results under small-data conditions. We introduce a class of seq2seq models, GAMs (Global Autoregressive Models), which combine an autoregressive component with a log-linear component, allowing the use of global *a priori* features to compensate for lack of data...
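A minimal sketch of the form this description suggests, assuming the usual hybrid of an autoregressive base model with a log-linear reweighting (the symbols $r_\eta$, $\phi$, and $\lambda$ are illustrative notation, not taken from the paper's text above):

$$
p_{\eta,\lambda}(x) \;\propto\; r_\eta(x)\,\exp\big(\langle \lambda, \phi(x)\rangle\big)
$$

Here $r_\eta(x)$ is the autoregressive component, $\phi(x)$ is a vector of global *a priori* features of the whole sequence $x$, and $\lambda$ are the log-linear weights. The normalizer $Z_{\eta,\lambda}=\sum_x r_\eta(x)\exp\langle\lambda,\phi(x)\rangle$ is generally intractable, which is what makes training and sampling with such a global component nontrivial.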

