ICLR 2018 • Rui Liu, Wei Wei, Weiguang Mao, Maria Chikina
Attention models have been intensively studied to improve NLP tasks such as machine comprehension, via both question-aware passage attention models and self-matching attention models.
Ranked #31 on Question Answering on SQuAD1.1 dev
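As a rough illustration of the self-matching attention mentioned in the abstract (a generic dot-product sketch in NumPy, not the authors' implementation): each passage position attends over the entire passage, producing a context vector that aggregates evidence from all other positions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_matching_attention(P):
    """P: (seq_len, d) passage representations.
    Returns (seq_len, d) context vectors where each position
    attends over the whole passage, including itself."""
    scores = P @ P.T                  # (seq_len, seq_len) similarity
    weights = softmax(scores, axis=-1)  # rows sum to 1
    return weights @ P                # weighted sum of passage vectors

rng = np.random.default_rng(0)
P = rng.normal(size=(5, 8))   # toy passage: 5 tokens, dim 8
C = self_matching_attention(P)  # C.shape == (5, 8)
```

Real models (e.g. gated self-matching networks) typically add learned projections and a gate on top of this basic pattern; the sketch shows only the core attention computation.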