no code implementations • 4 Mar 2025 • Paul Suganthan, Fedor Moiseev, Le Yan, Junru Wu, Jianmo Ni, Jay Han, Imed Zitouni, Enrique Alfonseca, Xuanhui Wang, Zhe Dong
Decoder-based transformers, while revolutionizing language modeling and scaling to immense sizes, have not completely overtaken encoder-heavy architectures in natural language processing.
no code implementations • 5 Jun 2023 • Fedor Moiseev, Gustavo Hernandez Abrego, Peter Dornbach, Imed Zitouni, Enrique Alfonseca, Zhe Dong
Dual encoders have been used for retrieval tasks and representation learning with good results.
no code implementations • NAACL 2022 • Fedor Moiseev, Zhe Dong, Enrique Alfonseca, Martin Jaggi
The models pre-trained on factual triples compete with those pre-trained on natural language sentences that contain the same knowledge.
1 code implementation • 14 Apr 2022 • Zhe Dong, Jianmo Ni, Daniel M. Bikel, Enrique Alfonseca, Yuan Wang, Chen Qu, Imed Zitouni
We further explore and explain why parameter sharing in the projection layer significantly improves the efficacy of the dual encoders, by directly probing the embedding spaces of the two encoder towers with the t-SNE algorithm.
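The idea of a shared projection layer can be sketched as follows. This is a minimal, hypothetical illustration (the function names, weights, and dimensions are assumptions, not the paper's code): both encoder towers reuse one projection matrix, so their outputs land in a single comparable embedding space.

```python
import math

def project(vec, weights):
    """Apply a linear projection: out[i] = sum_j weights[i][j] * vec[j]."""
    return [sum(w * x for w, x in zip(row, vec)) for row in weights]

def normalize(vec):
    """Scale a vector to unit length (guarding against the zero vector)."""
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

def similarity(query_emb, doc_emb, shared_projection):
    """Cosine similarity between two tower outputs that pass through
    the SAME projection weights, as in a shared-projection dual encoder."""
    q = normalize(project(query_emb, shared_projection))
    d = normalize(project(doc_emb, shared_projection))
    return sum(a * b for a, b in zip(q, d))
```

With separate (unshared) projections, the two towers can drift into incompatible subspaces; sharing the weights forces queries and documents into one space where the dot product is meaningful.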
2 code implementations • 11 Jun 2018 • Aleksandr Chuklin, Aliaksei Severyn, Johanne Trippas, Enrique Alfonseca, Hanna Silen, Damiano Spina
Many popular form factors of digital assistants, such as Amazon Echo, Apple HomePod, or Google Home, enable the user to hold a conversation with these systems based only on the speech modality.
no code implementations • 21 Apr 2018 • Ondřej Cífka, Aliaksei Severyn, Enrique Alfonseca, Katja Filippova
In this paper, we study recent neural generative models for text generation related to variational autoencoders.
no code implementations • 11 Aug 2017 • Mostafa Dehghani, Sascha Rothe, Enrique Alfonseca, Pascal Fleury
The results suggest that our model outperforms the baselines both in generating queries and in scoring candidate queries for the task of query suggestion.
no code implementations • 28 Oct 2015 • Katja Filippova, Enrique Alfonseca
A popular approach to sentence compression is to formulate the task as a constrained optimization problem and solve it with integer linear programming (ILP) tools.
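The constrained-optimization framing can be illustrated with a toy sketch: pick the highest-scoring subset of tokens subject to a length constraint. Real systems solve this with ILP solvers and richer constraints (e.g. grammaticality); the brute-force search, scores, and length limit below are illustrative assumptions only.

```python
from itertools import combinations

def compress(tokens, scores, max_len):
    """Return the subset of tokens (kept in original order) that
    maximizes the total score, keeping at most max_len tokens.
    Brute force stands in here for an ILP solver."""
    best, best_score = [], float("-inf")
    for k in range(1, max_len + 1):
        for idx in combinations(range(len(tokens)), k):
            s = sum(scores[i] for i in idx)
            if s > best_score:
                best_score = s
                best = [tokens[i] for i in idx]
    return best
```

For example, with per-token importance scores and a three-token budget, the low-scoring modifiers are dropped while the highest-scoring content words survive.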