no code implementations • 8 Mar 2024 • Lorenzo Lupo, Paul Bose, Mahyar Habibi, Dirk Hovy, Carlo Schwarz
DADIT enables us to train and compare the performance of various state-of-the-art models for predicting the gender and age of social media users.
1 code implementation • 20 Nov 2023 • Lorenzo Lupo, Oscar Magnusson, Dirk Hovy, Elin Naurin, Lena Wängnerud
Recent advances in large language models (LLMs) like GPT-3 and GPT-4 have opened up new opportunities for text analysis in political science.
1 code implementation • 13 Feb 2023 • Lorenzo Lupo, Marco Dinarelli, Laurent Besacier
Context-aware translation can be achieved by processing a concatenation of consecutive sentences with the standard Transformer architecture.
1 code implementation • 24 Oct 2022 • Lorenzo Lupo, Marco Dinarelli, Laurent Besacier
A straightforward approach to context-aware neural machine translation consists of feeding the standard encoder-decoder architecture with a window of consecutive sentences, formed by concatenating the current sentence with a number of sentences from its context.
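The concatenation approach described in this line of work can be sketched as follows. This is a minimal illustration of how context windows might be formed before feeding a standard Transformer; the separator token, window size, and function name are assumptions for illustration, not the papers' exact setup:

```python
def make_context_windows(sentences, context_size=2, sep=" <sep> "):
    """Prefix each sentence with up to `context_size` preceding sentences,
    joined by a separator token, producing inputs suitable for a standard
    encoder-decoder architecture (hypothetical preprocessing step)."""
    windows = []
    for i, sent in enumerate(sentences):
        # Take at most `context_size` sentences immediately before position i.
        context = sentences[max(0, i - context_size):i]
        windows.append(sep.join(context + [sent]))
    return windows

doc = ["Mary had a lamb.", "Its fleece was white.", "It followed her."]
for w in make_context_windows(doc, context_size=1):
    print(w)
```

With `context_size=1`, the second sentence becomes `"Mary had a lamb. <sep> Its fleece was white."`, so the model can resolve pronouns like "Its" against the preceding sentence.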
1 code implementation • ACL 2022 • Lorenzo Lupo, Marco Dinarelli, Laurent Besacier
Multi-encoder models are a broad family of context-aware neural machine translation systems that aim to improve translation quality by encoding document-level contextual information alongside the current sentence.