Character-Level Translation with Self-attention

ACL 2020 · Yingqiang Gao, Nikola I. Nikolov, Yuhuang Hu, Richard H. R. Hahnloser

We explore the suitability of self-attention models for character-level neural machine translation. We test the standard transformer model, as well as a novel variant in which the encoder block combines information from nearby characters using convolutions...
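The abstract's core idea, mixing local character context with a convolution before global self-attention, can be illustrated with a short sketch. The following PyTorch block is a minimal illustration under stated assumptions, not the paper's exact architecture: the class name `ConvCharEncoderBlock`, the kernel size, and the placement of the convolution relative to attention are all illustrative choices.

```python
# A minimal sketch (PyTorch) of an encoder block that mixes local character
# context via a 1D convolution before standard multi-head self-attention.
# The class name, kernel size, and exact placement of the convolution are
# illustrative assumptions, not the paper's specification.
import torch
import torch.nn as nn


class ConvCharEncoderBlock(nn.Module):
    def __init__(self, d_model=256, n_heads=4, kernel_size=5,
                 d_ff=1024, dropout=0.1):
        super().__init__()
        # 1D convolution over the character axis; "same" padding keeps
        # the sequence length unchanged.
        self.conv = nn.Conv1d(d_model, d_model, kernel_size,
                              padding=kernel_size // 2)
        self.attn = nn.MultiheadAttention(d_model, n_heads,
                                          dropout=dropout, batch_first=True)
        self.ff = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.norm3 = nn.LayerNorm(d_model)
        self.drop = nn.Dropout(dropout)

    def forward(self, x, key_padding_mask=None):
        # x: (batch, seq_len, d_model) character embeddings.
        # 1) Local mixing: convolve nearby characters (residual connection).
        c = self.conv(x.transpose(1, 2)).transpose(1, 2)
        x = self.norm1(x + self.drop(c))
        # 2) Global mixing: standard self-attention over all positions.
        a, _ = self.attn(x, x, x, key_padding_mask=key_padding_mask)
        x = self.norm2(x + self.drop(a))
        # 3) Position-wise feed-forward sublayer.
        x = self.norm3(x + self.drop(self.ff(x)))
        return x


# Usage: a batch of 2 sequences of 50 character embeddings.
block = ConvCharEncoderBlock()
out = block(torch.randn(2, 50, 256))
print(out.shape)  # torch.Size([2, 50, 256])
```

The design choice sketched here is that the convolution supplies short-range character-to-character interactions cheaply, so the attention sublayer can focus on longer-range dependencies; whether the convolution replaces or precedes attention inside the block is an assumption of this sketch.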



