no code implementations • 2 Feb 2022 • Markus J. Hofmann, Steffen Remus, Chris Biemann, Ralph Radach, Lars Kuchinke
In recurrent neural networks (RNNs), the subsymbolic units are trained to predict the next word, given all preceding words in the sentence.
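The training objective described above can be sketched as follows. This is a minimal, illustrative vanilla RNN in NumPy with a toy vocabulary and randomly initialised weights (all names and dimensions here are assumptions, not taken from the paper): the hidden state is updated over the prefix and a softmax over the vocabulary yields the next-word distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary and hypothetical dimensions (illustrative only).
vocab = ["<s>", "the", "cat", "sat", "</s>"]
V, H = len(vocab), 8

# Randomly initialised parameters of a vanilla (Elman) RNN.
E = rng.normal(0, 0.1, (V, H))   # word embeddings
W = rng.normal(0, 0.1, (H, H))   # recurrent weights
U = rng.normal(0, 0.1, (H, V))   # output projection

def next_word_probs(prefix_ids):
    """Run the RNN over the prefix and return P(next word | prefix)."""
    h = np.zeros(H)
    for i in prefix_ids:
        h = np.tanh(E[i] + W @ h)        # update hidden state with each word
    logits = h @ U
    e = np.exp(logits - logits.max())    # numerically stable softmax
    return e / e.sum()

p = next_word_probs([0, 1, 2])           # "<s> the cat" -> distribution over V
```

In training, the cross-entropy between this distribution and the actual next word would be minimised; the sketch shows only the forward prediction step.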
no code implementations • LREC 2020 • Varvara Logacheva, Denis Teslenko, Artem Shelmanov, Steffen Remus, Dmitry Ustalov, Andrey Kutuzov, Ekaterina Artemova, Chris Biemann, Simone Paolo Ponzetto, Alexander Panchenko
We use this method to induce a collection of sense inventories for 158 languages on the basis of the original pre-trained fastText word embeddings by Grave et al. (2018), enabling WSD in these languages.
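The general idea of inducing sense inventories from static embeddings can be illustrated with a toy sketch: the neighbours of an ambiguous word are grouped into clusters, each cluster acting as one induced sense. The vectors, words, and single-link threshold below are all illustrative stand-ins, not the paper's actual method or data.

```python
import numpy as np

# Toy static word vectors standing in for pre-trained fastText embeddings
# (all words and values here are illustrative).
vecs = {
    "bank":    np.array([0.5, 0.5]),
    "money":   np.array([0.9, 0.1]),
    "deposit": np.array([0.8, 0.2]),
    "river":   np.array([0.1, 0.9]),
    "shore":   np.array([0.2, 0.8]),
}

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def induce_senses(target, threshold=0.95):
    """Cluster the target's neighbours: neighbours that are similar to
    each other (single-link) land in the same induced sense cluster."""
    neighbours = [w for w in vecs if w != target]
    clusters = []
    for w in neighbours:
        placed = False
        for c in clusters:
            if any(cos(vecs[w], vecs[u]) >= threshold for u in c):
                c.append(w)
                placed = True
                break
        if not placed:
            clusters.append([w])
    return clusters

senses = induce_senses("bank")   # e.g. a FINANCE-like and a RIVER-like cluster
```

Here the two clusters of neighbours serve as the two induced senses of "bank"; a real inventory would be built from the full nearest-neighbour graph of the pre-trained embeddings.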
1 code implementation • 23 Sep 2019 • Gregor Wiedemann, Steffen Remus, Avi Chawla, Chris Biemann
Since vectors of the same word type can vary depending on the respective context, they implicitly provide a model for word sense disambiguation (WSD).
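The WSD idea described above can be sketched with a nearest-prototype rule: each sense is represented by the mean of the contextual vectors of its labelled occurrences, and a new occurrence is assigned to the sense whose prototype is most similar. The vectors and sense labels below are hypothetical stand-ins for real contextualised (e.g. BERT) token embeddings.

```python
import numpy as np

# Hypothetical contextualised vectors for occurrences of "bank"
# (illustrative values, not real model output).
labelled = {
    "FINANCE": [np.array([0.9, 0.1, 0.0]), np.array([0.8, 0.2, 0.1])],
    "RIVER":   [np.array([0.1, 0.9, 0.2]), np.array([0.2, 0.8, 0.1])],
}

# One prototype per sense: the mean of its contextual vectors.
prototypes = {s: np.mean(vs, axis=0) for s, vs in labelled.items()}

def disambiguate(context_vec):
    """Assign an occurrence to the sense with the most similar prototype."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return max(prototypes, key=lambda s: cos(context_vec, prototypes[s]))

sense = disambiguate(np.array([0.85, 0.15, 0.05]))  # near the FINANCE region
```

Because contextual vectors of the same word type differ by context, occurrences of distinct senses fall near distinct prototypes, which is exactly the implicit WSD model the excerpt refers to.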
1 code implementation • ACL 2019 • Rami Aly, Steffen Remus, Chris Biemann
Capsule networks have demonstrated good performance on structured data in the area of visual inference.
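A defining ingredient of capsule networks is the "squash" nonlinearity, which keeps a capsule's output vector pointing in the same direction while compressing its length into [0, 1) so the length can be read as an activation probability. A minimal sketch (standard capsule formulation, not code from the paper):

```python
import numpy as np

def squash(s, eps=1e-9):
    """Capsule squash nonlinearity: preserves orientation, maps the
    length into [0, 1) so it can act as an existence probability."""
    norm2 = np.sum(s ** 2)
    return (norm2 / (1.0 + norm2)) * s / (np.sqrt(norm2) + eps)

v = squash(np.array([3.0, 4.0]))   # input length 5 -> output length 25/26
```

Long input vectors are squashed to lengths just under 1, short ones toward 0, which is what lets capsule lengths encode confidence.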
1 code implementation • NAACL 2019 • Tim Fischer, Steffen Remus, Chris Biemann
Particularly for dynamic systems, where topics are not predefined but formulated as a search query, we argue that a more informative approach is to conduct user studies that directly compare different methods in the same view.
no code implementations • RANLP 2017 • Seid Muhie Yimam, Steffen Remus, Alexander Panchenko, Andreas Holzinger, Chris Biemann
In this paper, we describe the concept of entity-centric information access for the biomedical domain.
no code implementations • SEMEVAL 2016 • Alexander Panchenko, Stefano Faralli, Eugen Ruppert, Steffen Remus, Hubert Naets, Cédrick Fairon, Simone Paolo Ponzetto, Chris Biemann
no code implementations • LREC 2016 • Steffen Remus, Chris Biemann
This work presents a straightforward method for extending or creating in-domain web corpora by focused web crawling.
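The core loop of focused crawling can be sketched as a breadth-first traversal that only harvests and expands pages judged to be in-domain. The link graph, page texts, term list, and relevance test below are all toy assumptions standing in for real fetched pages and a trained relevance classifier.

```python
from collections import deque

# Toy in-memory link graph standing in for the web (all URLs hypothetical).
links = {
    "seed.org": ["bio.org/genome", "news.org/sports"],
    "bio.org/genome": ["bio.org/protein", "news.org/weather"],
    "bio.org/protein": [],
    "news.org/sports": ["news.org/weather"],
    "news.org/weather": [],
}
page_text = {
    "seed.org": "genome protein cell",
    "bio.org/genome": "genome sequencing cell",
    "bio.org/protein": "protein folding",
    "news.org/sports": "football scores",
    "news.org/weather": "rain forecast",
}
domain_terms = {"genome", "protein", "cell", "sequencing", "folding"}

def in_domain(url, min_hits=1):
    """Crude relevance test: count in-domain terms on the page."""
    return sum(t in domain_terms for t in page_text[url].split()) >= min_hits

def focused_crawl(seed):
    """BFS that only harvests and expands in-domain pages."""
    corpus, queue, seen = [], deque([seed]), {seed}
    while queue:
        url = queue.popleft()
        if not in_domain(url):
            continue                 # off-topic: neither harvest nor expand
        corpus.append(url)
        for nxt in links[url]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return corpus

corpus = focused_crawl("seed.org")
```

Starting from a seed page, only the in-domain pages end up in the corpus, and off-topic branches of the link graph are never expanded, which is how the crawl stays focused on the target domain.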