2 code implementations • ACL 2021 • Rachit Bansal, Himanshu Choudhary, Ravneet Punia, Niko Schenk, Jacob L Dahl, Émilie Pagé-Perron
Despite the recent advances of attention-based deep learning architectures across a majority of Natural Language Processing tasks, their application remains limited in low-resource settings because of the lack of pre-trained models for such languages.
no code implementations • COLING 2020 • Ravneet Punia, Niko Schenk, Christian Chiarcos, Émilie Pagé-Perron
The Sumerian cuneiform script was invented more than 5,000 years ago and represents one of the oldest writing systems in history.
no code implementations • LREC 2020 • Christian Chiarcos, Niko Schenk, Christian Fäth
We describe an approach to translation inference based on symbolic methods, the propagation of concepts over a graph of interconnected dictionaries: given a mapping from source language words to lexical concepts (e.g., synsets) as a seed, we use bilingual dictionaries to extrapolate a mapping of pivot and target language words to these lexical concepts.
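The propagation idea can be illustrated with a minimal sketch. This is a hypothetical toy implementation, not the authors' actual system: the dictionaries, words, and synset IDs below are invented for illustration, and each bilingual dictionary is modeled as a plain word-to-translations map.

```python
def propagate_concepts(seed, dictionaries):
    """Propagate lexical concepts (e.g. synset IDs) from a seed mapping
    along a chain of bilingual dictionaries (source -> pivot -> ... -> target).

    seed: dict mapping source words to sets of concept IDs.
    dictionaries: list of dicts, each mapping a word in one language to a
    list of its translations in the next language.
    """
    current = seed
    for bidict in dictionaries:
        nxt = {}
        # Every translation of a word inherits that word's concepts.
        for word, concepts in current.items():
            for translation in bidict.get(word, []):
                nxt.setdefault(translation, set()).update(concepts)
        current = nxt
    return current

# Toy seed: English words annotated with (WordNet-style) synset IDs.
seed = {"dog": {"animal.n.01"}, "house": {"building.n.01"}}
# Toy bilingual dictionaries: English -> German (pivot), German -> French (target).
en_de = {"dog": ["Hund"], "house": ["Haus"]}
de_fr = {"Hund": ["chien"], "Haus": ["maison"]}

target_map = propagate_concepts(seed, [en_de, de_fr])
# target_map now maps French words to the propagated concept sets.
```

Because concepts are accumulated in sets, ambiguous dictionary entries simply yield multiple candidate concepts per target word, which matches the graph-propagation view of the method.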
no code implementations • ACL 2017 • Samuel Rönnqvist, Niko Schenk, Christian Chiarcos
We introduce an attention-based Bi-LSTM for Chinese implicit discourse relations and demonstrate that modeling argument pairs as a joint sequence can outperform word order-agnostic approaches.
no code implementations • WS 2017 • Niko Schenk, Christian Chiarcos
We present a resource-lean neural recognizer for modeling coherence in commonsense stories.