Search Results for author: Bernd Möbius

Found 7 papers, 4 papers with code

Modeling the Impact of Syntactic Distance and Surprisal on Cross-Slavic Text Comprehension

no code implementations · LREC 2022 · Irina Stenger, Philip Georgis, Tania Avgustinova, Bernd Möbius, Dietrich Klakow

We focus on the syntactic variation and measure syntactic distances between nine Slavic languages (Belarusian, Bulgarian, Croatian, Czech, Polish, Slovak, Slovene, Russian, and Ukrainian) using symmetric measures of insertion, deletion and movement of syntactic units in the parallel sentences of the fable “The North Wind and the Sun”.

Tasks: Cloze Test, Reading Comprehension
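The insertion/deletion/movement measure described in the abstract can be approximated with a toy sketch. The snippet below computes a symmetric, normalised insertion/deletion distance over sequences of syntactic chunk labels via the longest common subsequence; movement handling is omitted, and the chunk labels are hypothetical, not the paper's actual annotation scheme.

```python
def lcs_length(a, b):
    """Length of the longest common subsequence of two sequences."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            dp[i][j] = dp[i - 1][j - 1] + 1 if x == y else max(dp[i - 1][j], dp[i][j - 1])
    return dp[len(a)][len(b)]

def syntactic_distance(units_a, units_b):
    """Symmetric insertion/deletion distance between two sequences of
    syntactic units, normalised to [0, 1] (0 = identical order and content)."""
    lcs = lcs_length(units_a, units_b)
    total = len(units_a) + len(units_b)
    return (total - 2 * lcs) / total if total else 0.0

# Hypothetical chunk-label sequences for one aligned sentence pair
seq_ru = ["NP", "VP", "PP", "NP"]
seq_pl = ["NP", "PP", "VP", "NP"]
print(syntactic_distance(seq_ru, seq_pl))  # → 0.25
```

Because the measure is built from a shared subsequence, swapping the two arguments gives the same value, matching the symmetry requirement mentioned in the abstract.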

incom.py 2.0 - Calculating Linguistic Distances and Asymmetries in Auditory Perception of Closely Related Languages

no code implementations · RANLP 2021 · Marius Mosbach, Irina Stenger, Tania Avgustinova, Bernd Möbius, Dietrich Klakow

We present an extended version of a tool developed for calculating linguistic distances and asymmetries in auditory perception of closely related languages.

Tasks: Regression
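incom.py itself is not reproduced here; as a minimal sketch of the kind of measure such a tool computes, normalised Levenshtein distance over orthographic word pairs is a common baseline for linguistic distance between closely related languages. The Czech/Slovak word pair below is illustrative only.

```python
def levenshtein(s, t):
    """Classic edit distance (insertions, deletions, substitutions)."""
    prev = list(range(len(t) + 1))
    for i, cs in enumerate(s, 1):
        cur = [i]
        for j, ct in enumerate(t, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[j - 1] + 1,         # insertion
                           prev[j - 1] + (cs != ct)))  # substitution
        prev = cur
    return prev[-1]

def normalised_distance(s, t):
    """Orthographic distance in [0, 1]: 0 = identical, 1 = fully disjoint."""
    return levenshtein(s, t) / max(len(s), len(t))

# Illustrative Slovak/Czech cognate pair ("town"): one character differs
print(normalised_distance("mesto", "město"))  # → 0.2
```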

Integrating Form and Meaning: A Multi-Task Learning Model for Acoustic Word Embeddings

1 code implementation · 14 Sep 2022 · Badr M. Abdullah, Bernd Möbius, Dietrich Klakow

Models of acoustic word embeddings (AWEs) learn to map variable-length spoken word segments onto fixed-dimensionality vector representations such that different acoustic exemplars of the same word are projected nearby in the embedding space.

Tasks: Multi-Task Learning, Word Embeddings
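As a naive illustration of the fixed-dimensionality property the abstract describes (the paper itself learns AWEs with a multi-task neural encoder, not by pooling), simply mean-pooling a variable-length sequence of acoustic frames already yields same-sized vectors for different exemplars of a word:

```python
def mean_pool_awe(frames):
    """Collapse a variable-length list of D-dimensional acoustic frames
    into one fixed D-dimensional embedding by averaging over time."""
    dim = len(frames[0])
    return [sum(f[d] for f in frames) / len(frames) for d in range(dim)]

# Two exemplars of the "same word" with different durations (fake features)
word_a = [[0.1, 0.2, 0.3]] * 4   # 4 frames, 3-dim features
word_b = [[0.1, 0.2, 0.3]] * 7   # 7 frames, same feature dimensionality

emb_a, emb_b = mean_pool_awe(word_a), mean_pool_awe(word_b)
print(len(emb_a), len(emb_b))  # → 3 3  (both embeddings are fixed 3-dim)
```

A learned encoder replaces the averaging step with a trained network, but the input/output contract, variable-length segment in, fixed-length vector out, is the same.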

Do Acoustic Word Embeddings Capture Phonological Similarity? An Empirical Study

1 code implementation · 16 Jun 2021 · Badr M. Abdullah, Marius Mosbach, Iuliia Zaitova, Bernd Möbius, Dietrich Klakow

Our experiments show that (1) the distance in the embedding space in the best cases only moderately correlates with phonological distance, and (2) improving the performance on the word discrimination task does not necessarily yield models that better reflect word phonological similarity.

Tasks: Word Embeddings
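Correlating embedding-space distances with phonological distances, as in the study above, is typically done with a rank correlation. The sketch below implements Spearman's rho in plain Python on made-up distance values; the paper's actual data and correlation setup are not reproduced.

```python
def rank(values):
    """Average 1-based ranks, with ties sharing the mean rank."""
    order = sorted(range(len(values)), key=values.__getitem__)
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the rank vectors."""
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical distances for five word pairs (not the paper's data)
embedding_dist    = [0.10, 0.40, 0.35, 0.80, 0.55]
phonological_dist = [0.05, 0.30, 0.45, 0.60, 0.50]
print(round(spearman(embedding_dist, phonological_dist), 2))  # → 0.9
```

A rho near 1 would mean embedding distance tracks phonological distance closely; the paper's finding is that real models reach at best a moderate correlation.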

Rediscovering the Slavic Continuum in Representations Emerging from Neural Models of Spoken Language Identification

no code implementations · VarDial (COLING) 2020 · Badr M. Abdullah, Jacek Kudera, Tania Avgustinova, Bernd Möbius, Dietrich Klakow

In this paper, we present a neural model for Slavic language identification in speech signals and analyze its emergent representations to investigate whether they reflect objective measures of language relatedness and/or non-linguists' perception of language similarity.

Tasks: Language Identification, Spoken Language Identification

Cross-Domain Adaptation of Spoken Language Identification for Related Languages: The Curious Case of Slavic Languages

1 code implementation · 2 Aug 2020 · Badr M. Abdullah, Tania Avgustinova, Bernd Möbius, Dietrich Klakow

State-of-the-art spoken language identification (LID) systems, which are based on end-to-end deep neural networks, have shown remarkable success not only in discriminating between distant languages but also between closely related languages, and even between different spoken varieties of the same language.

Tasks: Language Identification, Spoken Language Identification, +1
