EACL (GWC) 2021 • Arkadiusz Janz, Maciej Piasecki, Piotr Wątorski
Neural language models, including transformer-based models pre-trained on very large corpora, have become a common way to represent text in various tasks, including the recognition of textual semantic relations, e.g. in Cross-document Structure Theory.