no code implementations • GermEval 2021 • Mina Schütz, Christoph Demus, Jonas Pitz, Nadine Probol, Melanie Siegel, Dirk Labudde
For this binary task, we propose three models: a German BERT transformer model; a multilayer perceptron that first processes the textual input and 14 additional linguistic features in parallel and then concatenates the two representations in an additional layer; and a multilayer perceptron that takes both feature types as a single input.
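A minimal PyTorch sketch of the two-branch design described above may make the architecture concrete. Everything beyond the parallel processing of the textual input and the 14 linguistic features followed by a concatenation layer is an illustrative assumption (class name, layer sizes such as text_dim=768 and hidden_dim=128, and the single-logit binary head), not the authors' implementation.

```python
import torch
import torch.nn as nn

class ParallelFeatureMLP(nn.Module):
    """Sketch of a two-branch MLP: one branch for the textual representation,
    one for hand-crafted linguistic features; branch outputs are concatenated
    in an additional layer before binary classification. Dimensions are illustrative."""

    def __init__(self, text_dim=768, num_linguistic_features=14, hidden_dim=128):
        super().__init__()
        # Branch processing the textual input (e.g. a pooled sentence embedding)
        self.text_branch = nn.Sequential(
            nn.Linear(text_dim, hidden_dim),
            nn.ReLU(),
        )
        # Branch processing the 14 additional linguistic features
        self.feature_branch = nn.Sequential(
            nn.Linear(num_linguistic_features, hidden_dim),
            nn.ReLU(),
        )
        # Additional layer over the concatenated representations,
        # ending in a single logit for the binary (toxic / non-toxic) task
        self.classifier = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, text_repr, linguistic_feats):
        t = self.text_branch(text_repr)
        f = self.feature_branch(linguistic_feats)
        combined = torch.cat([t, f], dim=-1)  # concatenation layer
        return self.classifier(combined)      # raw logit; apply sigmoid for a probability


# Example forward pass with random inputs (batch of 4 comments)
model = ParallelFeatureMLP()
text_repr = torch.randn(4, 768)        # e.g. pooled BERT embeddings
ling_feats = torch.randn(4, 14)        # 14 linguistic features per comment
logits = model(text_repr, ling_feats)  # shape: (4, 1)
```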
1 code implementation • NAACL (WOAH) 2022 • Christoph Demus, Jonas Pitz, Mina Schütz, Nadine Probol, Melanie Siegel, Dirk Labudde
In this work, we present a new publicly available offensive language dataset of 10,278 German social media comments collected in the first half of 2021 and annotated by a total of six annotators.