Exact Hard Monotonic Attention for Character-Level Transduction

ACL 2019 · Shijie Wu, Ryan Cotterell

Many common character-level, string-to-string transduction tasks, e.g. grapheme-to-phoneme conversion and morphological inflection, consist almost exclusively of monotonic transduction. Neural sequence-to-sequence models with soft attention, which are non-monotonic, often outperform popular monotonic models.
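To make the contrast concrete, a hard monotonic attention model treats the source position attended to at each target step as a latent variable whose alignments may only move left to right, and the likelihood can be marginalized exactly with a forward-style dynamic program. The sketch below is a minimal illustration under simplifying assumptions (per-position emission scores `emit[t][j]` standing in for p(y_t | x_j), and an unweighted non-decreasing-alignment constraint); it is not the paper's exact parameterization.

```python
# Illustrative sketch: exact marginalization over hard monotonic alignments.
# Assumption: emit[t][j] plays the role of p(y_t | x_j); alignments must be
# non-decreasing (a_1 <= a_2 <= ... <= a_T). This is a toy forward recursion,
# not the paper's model.

def monotonic_marginal(emit):
    """Sum of prod_t emit[t][a_t] over all non-decreasing alignments."""
    T = len(emit)      # number of target positions
    J = len(emit[0])   # number of source positions
    # alpha[j]: total weight of alignment prefixes ending at source position j.
    alpha = [emit[0][j] for j in range(J)]
    for t in range(1, T):
        prefix = 0.0
        new = [0.0] * J
        for j in range(J):
            prefix += alpha[j]        # running sum over positions j' <= j
            new[j] = emit[t][j] * prefix
        alpha = new
    return sum(alpha)

# Two target steps, two source positions: the three monotonic alignments
# (0,0), (0,1), (1,1) contribute 0.5*0.25 + 0.5*0.75 + 0.5*0.75 = 0.875.
print(monotonic_marginal([[0.5, 0.5], [0.25, 0.75]]))  # → 0.875
```

The running prefix sum keeps the recursion O(T·J) instead of O(T·J²), which is what makes exact marginalization over all monotonic alignments cheap enough to train with.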

