no code implementations • 3 Nov 2023 • Lukas Edman, Lisa Bylinina
This paper details the work of the University of Groningen for the BabyLM Challenge.
1 code implementation • 4 Jun 2023 • Aleksey Tikhonov, Lisa Bylinina, Denis Paperno
Multimodal embeddings aim to enrich the semantic information in neural representations of language relative to text-only models.
1 code implementation • 13 Sep 2021 • Lisa Bylinina, Alexey Tikhonov, Ekaterina Garmash
We investigate a new linguistic generalization in pre-trained language models (taking BERT (Devlin et al., 2019) as a case study).
1 code implementation • ACL 2022 • Lisa Bylinina, Alexey Tikhonov
Establishing this allows us to evaluate the performance of language models more adequately, and to use language models to discover new insights into natural language grammar beyond existing linguistic theories.