Search Results for author: Simon Ostermann

Found 12 papers, 0 papers with code

Where exactly does contextualization in a PLM happen?

no code implementations • 11 Dec 2023 • Soniya Vijayakumar, Tanja Bäumel, Simon Ostermann, Josef van Genabith

Pre-trained Language Models (PLMs) have been shown to be consistently successful in a plethora of NLP tasks due to their ability to learn contextualized representations of words (Ethayarajh, 2019).

Language Modelling · Sentence +1

VM Image Repository and Distribution Models for Federated Clouds: State of the Art, Possible Directions and Open Issues

no code implementations • 21 Jun 2019 • Nishant Saurabh, Dragi Kimovski, Simon Ostermann, Radu Prodan

The emerging trend of Federated Cloud models relies on virtualization as a key concept to offer a large-scale, distributed Infrastructure-as-a-Service collaborative paradigm to end users.

Distributed, Parallel, and Cluster Computing

MCScript2.0: A Machine Comprehension Corpus Focused on Script Events and Participants

no code implementations • SEMEVAL 2019 • Simon Ostermann, Michael Roth, Manfred Pinkal

Half of the questions cannot be answered from the reading texts, but require the use of commonsense and, in particular, script knowledge.

Reading Comprehension

MCScript: A Novel Dataset for Assessing Machine Comprehension Using Script Knowledge

no code implementations • LREC 2018 • Simon Ostermann, Ashutosh Modi, Michael Roth, Stefan Thater, Manfred Pinkal

We introduce a large dataset of narrative texts and questions about these texts, intended to be used in a machine comprehension task that requires reasoning using commonsense knowledge.

Natural Language Understanding · Reading Comprehension

Aligning Script Events with Narrative Texts

no code implementations • SEMEVAL 2017 • Simon Ostermann, Michael Roth, Stefan Thater, Manfred Pinkal

Script knowledge plays a central role in text understanding and is relevant for a variety of downstream tasks.
