2 code implementations • LREC 2022 • José Cañete, Sebastián Donoso, Felipe Bravo-Marquez, Andrés Carvallo, Vladimir Araujo
In this paper we present ALBETO and DistilBETO, which are versions of ALBERT and DistilBERT pre-trained exclusively on Spanish corpora.
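Below is a minimal usage sketch of how such Spanish checkpoints are typically loaded with the Hugging Face transformers library; the model identifier used here ("dccuchile/albert-base-spanish") is an assumption about how ALBETO is published on the Hub and may not match the name actually released by the authors.

```python
# Minimal sketch, assuming the ALBETO checkpoint is available on the
# Hugging Face Hub under the (assumed) id "dccuchile/albert-base-spanish".
from transformers import AutoModel, AutoTokenizer

model_id = "dccuchile/albert-base-spanish"  # assumed ALBETO identifier

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a Spanish sentence and inspect the contextual representations.
inputs = tokenizer(
    "ALBETO fue preentrenado exclusivamente con corpus en español.",
    return_tensors="pt",
)
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```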
1 code implementation • LREC 2022 • Vladimir Araujo, Andrés Carvallo, Souvik Kundu, José Cañete, Marcelo Mendoza, Robert E. Mercer, Felipe Bravo-Marquez, Marie-Francine Moens, Alvaro Soto
Due to the success of pre-trained language models, versions for languages other than English have been released in recent years.
1 code implementation • NAACL (BioNLP) 2021 • Vladimir Araujo, Andrés Carvallo, Carlos Aspillaga, Camilo Thorne, Denis Parra
The success of pretrained word embeddings has motivated their use in the biomedical domain, with contextualized embeddings yielding remarkable results in several biomedical NLP tasks.
no code implementations • 7 Jul 2021 • Iván Cantador, Andrés Carvallo, Fernando Diez, Denis Parra
Furthermore, we provide examples of applying these recommendations, using aspect opinions as explanations in a visualization dashboard that surfaces the most and least liked aspects of similar users, identified from the embeddings of an input graph.
no code implementations • 7 Jul 2021 • Iván Cantador, Andrés Carvallo, Fernando Diez
The success of neural network embeddings has entailed a renewed interest in using knowledge graphs for a wide variety of machine learning and information retrieval tasks.
no code implementations • LREC 2020 • Carlos Aspillaga, Andrés Carvallo, Vladimir Araujo
There has been significant progress in recent years in the field of Natural Language Processing thanks to the introduction of the Transformer architecture.
Natural Language Inference
Natural Language Understanding