Language model pretraining has led to significant performance gains, but careful comparison between different approaches is challenging.
Unlike previously proposed automated text simplification (TS) systems, our neural text simplification (NTS) systems are able to simultaneously perform lexical simplification and content reduction.
Our adversarial post-specialization method propagates external lexical knowledge to the full distributional space.
Lexical simplification (LS) aims to replace complex words in a given sentence with their simpler alternatives of equivalent meaning.
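As a minimal illustration of the LS task, the sketch below replaces a complex word with a more frequent synonym of equivalent meaning. The synonym and frequency tables are toy data invented for this example, not a real lexical resource; a practical system would draw candidates from a thesaurus or embedding space and rank them by corpus frequency.

```python
# Toy substitution lexicon: complex word -> candidate simpler alternatives.
# These entries are illustrative only.
SYNONYMS = {
    "utilize": ["use", "employ"],
    "commence": ["begin", "start"],
}

# Hypothetical word frequencies (higher = more common, hence "simpler").
FREQ = {"use": 9.0, "employ": 5.5, "begin": 8.0, "start": 8.5,
        "utilize": 4.0, "commence": 3.0}

def simplify(sentence: str) -> str:
    """Replace each word with its most frequent simpler synonym, if any."""
    out = []
    for tok in sentence.split():
        candidates = SYNONYMS.get(tok.lower(), [])
        # Substitute only when a candidate is more frequent than the original.
        better = [c for c in candidates
                  if FREQ.get(c, 0.0) > FREQ.get(tok.lower(), 0.0)]
        out.append(max(better, key=FREQ.get) if better else tok)
    return " ".join(out)
```

For example, `simplify("we utilize data")` substitutes "use" for "utilize" because it is the most frequent candidate of equivalent meaning.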