Search Results for author: Alex Rush

Found 15 papers, 3 papers with code

Debugging Sequence-to-Sequence Models with Seq2Seq-Vis

no code implementations WS 2018 Hendrik Strobelt, Sebastian Gehrmann, Michael Behrisch, Adam Perer, Hanspeter Pfister, Alexander Rush

Neural attention-based sequence-to-sequence models (seq2seq) (Sutskever et al., 2014; Bahdanau et al., 2014) have proven to be accurate and robust for many sequence prediction tasks.
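As context for the abstract above, the attention step in such seq2seq models can be sketched minimally (an illustration only, not code from the paper; all names are hypothetical):

```python
import numpy as np

def attention(query, keys, values):
    """Minimal dot-product attention step (hypothetical sketch).

    query:  (d,)    current decoder state
    keys:   (T, d)  encoder states used for scoring
    values: (T, d)  encoder states to be averaged
    """
    scores = keys @ query                    # (T,) alignment scores
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    return weights @ values                  # (d,) context vector

# toy usage: two encoder positions, equal scores -> equal weighting
ctx = attention(np.ones(3), np.ones((2, 3)),
                np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]))
print(ctx)  # [0.5 0.5 0. ]
```

Tools like Seq2Seq-Vis visualize exactly these attention weights to debug model decisions.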

Attribute Translation

The Annotated Transformer

1 code implementation WS 2018 Alexander Rush

A major goal of open-source NLP is to quickly and accurately reproduce the results of new work, in a manner that the community can easily use and modify.

Coarse-to-Fine Attention Models for Document Summarization

no code implementations WS 2017 Jeffrey Ling, Alexander Rush

Sequence-to-sequence models with attention have been successful for a variety of NLP problems, but their speed does not scale well for tasks with long source sequences such as document summarization.
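To illustrate the scaling idea the abstract describes, a coarse-to-fine attention pass can first attend over one summary vector per chunk (e.g. per sentence), then attend only within the selected chunk, so cost grows with n_chunks + chunk_len rather than total document length. This is a hypothetical, simplified sketch with hard chunk selection, not the paper's exact method:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def coarse_to_fine_attention(query, chunks):
    """Hypothetical sketch of coarse-to-fine attention.

    query:  (d,) decoder state
    chunks: list of (T_i, d) arrays, e.g. one array per sentence.
    Coarse step scores one mean vector per chunk; the fine step
    attends only inside the highest-scoring chunk.
    """
    coarse_keys = np.stack([c.mean(axis=0) for c in chunks])  # (n_chunks, d)
    coarse_w = softmax(coarse_keys @ query)
    best = int(coarse_w.argmax())      # hard selection for simplicity
    fine = chunks[best]
    fine_w = softmax(fine @ query)     # attention within one chunk only
    return fine_w @ fine, best
```

The published model may use soft or hierarchical weighting instead of a hard argmax; the hard selection here only makes the cost reduction easy to see.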

Document Summarization, Machine Translation, +1
