no code implementations • LREC 2020 • Yudai Kishimoto, Yugo Murawaki, Sadao Kurohashi
BERT, a neural network-based language model pre-trained on large corpora, is a breakthrough in natural language processing, significantly outperforming previous state-of-the-art models in numerous tasks.
Tasks: General Classification, Implicit Discourse Relation Classification, +3 more
no code implementations • COLING 2018 • Yudai Kishimoto, Yugo Murawaki, Sadao Kurohashi
Identifying discourse relations that are not overtly marked by discourse connectives remains a challenging problem.
Tasks: General Classification, Implicit Discourse Relation Classification, +4 more