1 code implementation • EMNLP (newsum) 2021 • Annette Rios, Nicolas Spring, Tannon Kew, Marek Kostrzewa, Andreas Säuberli, Mathias Müller, Sarah Ebling
The task of document-level text simplification is closely related to summarization, with the additional difficulty of reducing linguistic complexity.
no code implementations • 4 Nov 2024 • Alessandro Mastrototaro, Mathias Müller, Jimmy Olsson
General state-space models (SSMs) are widely used in statistical machine learning and are among the most classical generative models for sequential time-series data.
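For readers unfamiliar with the term, a general state-space model is conventionally defined by a latent Markov chain with conditionally independent observations (standard notation, not taken from this paper):

```latex
x_0 \sim p(x_0), \qquad
x_t \mid x_{t-1} \sim p(x_t \mid x_{t-1}), \qquad
y_t \mid x_t \sim p(y_t \mid x_t), \qquad t = 1, 2, \dots
```

Here $x_t$ is the unobserved state and $y_t$ the observation at time $t$; "general" means the transition and emission densities need not be linear or Gaussian.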
1 code implementation • 1 Jul 2024 • Zifan Jiang, Gerard Sant, Amit Moryossef, Mathias Müller, Rico Sennrich, Sarah Ebling
We present SignCLIP, which re-purposes CLIP (Contrastive Language-Image Pretraining) to project spoken language text and sign language videos, two classes of natural languages of distinct modalities, into the same space.
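The core of any CLIP-style setup is scoring text embeddings against video embeddings in a shared space. The sketch below is a minimal, hypothetical illustration of that contrastive scoring step (the encoder outputs, the `temperature` value, and all function names are stand-ins, not SignCLIP's actual API):

```python
import math

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def contrastive_scores(text_embs, video_embs, temperature=0.07):
    """Similarity matrix: rows are texts, columns are sign language videos.

    In CLIP-style training, diagonal entries (matching pairs) are pushed
    up and off-diagonal entries (mismatched pairs) are pushed down.
    """
    return [[cosine(t, v) / temperature for v in video_embs]
            for t in text_embs]
```

At retrieval time, the highest-scoring column for a text row is the predicted matching video, and vice versa.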
1 code implementation • 16 Nov 2023 • Shester Gueuwou, Sophie Siake, Colin Leong, Mathias Müller
Advancements in sign language processing have been hindered by a lack of sufficient data, impeding progress in recognition, translation, and production tasks.
1 code implementation • 21 Oct 2023 • Amit Moryossef, Zifan Jiang, Mathias Müller, Sarah Ebling, Yoav Goldberg
We find that introducing BIO tagging is necessary to model sign boundaries.
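The reason BIO tagging matters for boundaries is that a plain in/out labeling cannot separate two adjacent signs: without a B tag, back-to-back signs merge into one span. A minimal sketch of the scheme (function names and the frame-span representation are illustrative, not this paper's code):

```python
def frames_to_bio(sign_spans, num_frames):
    """Convert (start, end) sign spans (end exclusive) into per-frame BIO tags.

    B = first frame of a sign, I = inside a sign, O = outside any sign.
    """
    tags = ["O"] * num_frames
    for start, end in sign_spans:
        tags[start] = "B"
        for i in range(start + 1, end):
            tags[i] = "I"
    return tags

def bio_to_spans(tags):
    """Recover sign boundaries from a BIO tag sequence."""
    spans, start = [], None
    for i, tag in enumerate(tags):
        if tag == "B":
            if start is not None:      # a new sign begins: close the previous one
                spans.append((start, i))
            start = i
        elif tag == "O":
            if start is not None:
                spans.append((start, i))
                start = None
    if start is not None:
        spans.append((start, len(tags)))
    return spans
```

Note that two adjacent signs, e.g. spans (1, 3) and (3, 5), round-trip correctly because the second sign opens with a fresh B tag; an IO-only scheme would collapse them into a single span.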
1 code implementation • 13 Oct 2023 • Amit Moryossef, Mathias Müller, Rebecka Fahrni
The library includes a specialized file format that encapsulates various types of pose data, accommodating multiple individuals and an arbitrary number of time frames, making it useful for both image and video data.
1 code implementation • 15 Jun 2023 • Eric Egli, Noah Mamié, Eyal Liron Dolev, Mathias Müller
We then use our best model for the main goal of this project: analyzing the stance of utterances extracted from the Swiss federal voting booklet for the Swiss popular votes of September 2022.
2 code implementations • 28 May 2023 • Amit Moryossef, Mathias Müller, Anne Göhring, Zifan Jiang, Yoav Goldberg, Sarah Ebling
Sign language translation systems are complex and require many components.
1 code implementation • International Conference on Learning Representations (ICLR) 2023 • Biao Zhang, Mathias Müller, Rico Sennrich
We propose SLTUNET, a simple unified neural model designed to support multiple SLT-related tasks jointly, such as sign-to-gloss, gloss-to-text, and sign-to-text translation.
no code implementations • 28 Nov 2022 • Mathias Müller, Zifan Jiang, Amit Moryossef, Annette Rios, Sarah Ebling
Automatic sign language processing is gaining popularity in Natural Language Processing (NLP) research (Yin et al., 2021).
1 code implementation • 11 Oct 2022 • Zifan Jiang, Amit Moryossef, Mathias Müller, Sarah Ebling
This paper presents work on novel machine translation (MT) systems between spoken and signed languages, where signed languages are represented in SignWriting, a sign language writing system.
1 code implementation • ACL 2021 • Mathias Müller, Rico Sennrich
Neural Machine Translation (NMT) currently exhibits biases such as producing translations that are too short and overgenerating frequent words, and shows poor robustness to copy noise in training data or domain shift.
no code implementations • 20 Apr 2021 • Amit Moryossef, Ioannis Tsochantaridis, Joe Dinn, Necati Cihan Camgöz, Richard Bowden, Tao Jiang, Annette Rios, Mathias Müller, Sarah Ebling
Skeletal representations generalize over an individual's appearance and background, allowing us to focus on recognizing the motion itself.
no code implementations • 22 Mar 2021 • Julia Kreutzer, Isaac Caswell, Lisa Wang, Ahsan Wahab, Daan van Esch, Nasanbayar Ulzii-Orshikh, Allahsera Tapo, Nishant Subramani, Artem Sokolov, Claytone Sikasote, Monang Setyawan, Supheakmungkol Sarin, Sokhar Samb, Benoît Sagot, Clara Rivera, Annette Rios, Isabel Papadimitriou, Salomey Osei, Pedro Ortiz Suarez, Iroro Orife, Kelechi Ogueji, Andre Niyongabo Rubungo, Toan Q. Nguyen, Mathias Müller, André Müller, Shamsuddeen Hassan Muhammad, Nanda Muhammad, Ayanda Mnyakeni, Jamshidbek Mirzakhalov, Tapiwanashe Matangira, Colin Leong, Nze Lawson, Sneha Kudugunta, Yacine Jernite, Mathias Jenny, Orhan Firat, Bonaventure F. P. Dossou, Sakhile Dlamini, Nisansa de Silva, Sakine Çabuk Ballı, Stella Biderman, Alessia Battisti, Ahmed Baruwa, Ankur Bapna, Pallavi Baljekar, Israel Abebe Azime, Ayodele Awokoya, Duygu Ataman, Orevaoghene Ahia, Oghenefego Ahia, Sweta Agrawal, Mofetoluwa Adeyemi
With the success of large-scale pre-training and multilingual modeling in Natural Language Processing (NLP), recent years have seen a proliferation of large, web-mined text datasets covering hundreds of languages.
1 code implementation • WMT (EMNLP) 2020 • Annette Rios, Mathias Müller, Rico Sennrich
A recent trend in multilingual models is not to train on parallel data between all language pairs, but to use a single bridge language, e.g. English.
1 code implementation • 22 Jul 2020 • Dmitry Gordeev, Philipp Singer, Marios Michailidis, Mathias Müller, SriSatish Ambati
Our work studies the predictive performance of models at various stages of the pandemic to better understand their fundamental uncertainty and the impact of data availability on such forecasts.
2 code implementations • AMTA 2020 • Mathias Müller, Annette Rios, Rico Sennrich
Domain robustness, i.e. the generalization of models to unseen test domains, is low for both statistical (SMT) and neural machine translation (NMT).
1 code implementation • WS 2018 • Mathias Müller, Annette Rios, Elena Voita, Rico Sennrich
We show that, while gains in BLEU are moderate for those systems, they outperform baselines by a large margin in terms of accuracy on our contrastive test set.
1 code implementation • EMNLP 2018 • Gongbo Tang, Mathias Müller, Annette Rios, Rico Sennrich
Recently, non-recurrent architectures (convolutional, self-attentional) have outperformed RNNs in neural machine translation.