Search Results for author: Maria Chikina

Found 2 papers, 0 papers with code

Phase Conductor on Multi-layered Attentions for Machine Comprehension

no code implementations • ICLR 2018 • Rui Liu, Wei Wei, Weiguang Mao, Maria Chikina

Attention models have been intensively studied to improve NLP tasks such as machine comprehension, via both question-aware passage attention models and self-matching attention models.

Question Answering · Reading Comprehension
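Since the abstract names self-matching attention, here is a minimal sketch of that general mechanism: each passage position attends over all other positions of the same passage. This is an illustration of the idea only, with made-up dimensions, and is not the paper's model.

```python
import numpy as np

def self_matching_attention(H):
    """Self-matching attention sketch: H is a (seq_len, d) matrix of
    passage token representations; returns (seq_len, d) context vectors."""
    scores = H @ H.T / np.sqrt(H.shape[1])    # scaled dot-product scores
    # Mask the diagonal so a token does not trivially attend to itself.
    np.fill_diagonal(scores, -np.inf)
    # Row-wise softmax over the remaining positions.
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ H

rng = np.random.default_rng(0)
H = rng.normal(size=(5, 8))                   # toy passage: 5 tokens, dim 8
ctx = self_matching_attention(H)
print(ctx.shape)  # (5, 8)
```

Each output row is a convex combination of the other tokens' representations, which is what lets the passage "match against itself" for long-range evidence.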

Modeling strict age-targeted mitigation strategies for COVID-19

no code implementations • 8 Apr 2020 • Maria Chikina, Wesley Pegden

We use a simple SIR-like epidemic model that integrates known age-contact patterns for the United States to model the effect of age-targeted mitigation strategies in a COVID-19-like epidemic.
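The setup described above can be sketched as an age-structured SIR model: groups are coupled through a contact matrix, and age-targeted mitigation scales down one group's contact rates. The two-group split, contact matrix, and all parameter values below are hypothetical illustrations, not the paper's calibrated model.

```python
import numpy as np

def simulate_sir(C, beta, gamma, S0, I0, days, dt=0.1):
    """Forward-Euler age-structured SIR. C[i, j] is the daily contact rate
    of a person in group i with members of group j; beta is per-contact
    transmission probability, gamma the recovery rate."""
    S, I = np.array(S0, float), np.array(I0, float)
    R = np.zeros(len(S0))
    N = S + I + R
    for _ in range(int(days / dt)):
        lam = beta * C.dot(I / N)      # force of infection per group
        new_inf = lam * S * dt
        new_rec = gamma * I * dt
        S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
    return S, I, R

# Hypothetical contact matrix: group 0 = "young", group 1 = "old".
C = np.array([[10.0, 2.0],
              [2.0, 3.0]])
# Age-targeted mitigation: old group cuts its contact rates by 75%.
C_mitigated = C.copy()
C_mitigated[1, :] *= 0.25

S0, I0 = [8000.0, 2000.0], [10.0, 10.0]
for label, mat in [("baseline", C), ("age-targeted", C_mitigated)]:
    S, I, R = simulate_sir(mat, beta=0.03, gamma=0.2, S0=S0, I0=I0, days=180)
    print(label, "final attack rate (old):", round(R[1] / 2010.0, 3))
```

Comparing the two runs shows the qualitative effect the paper studies: reducing contacts for one age group lowers that group's final attack rate even while the rest of the population mixes normally.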
