Search Results for author: Luis Espinosa Anke

Found 18 papers, 7 papers with code

Combining BERT with Static Word Embeddings for Categorizing Social Media

no code implementations EMNLP (WNUT) 2020 Israa Alghanmi, Luis Espinosa Anke, Steven Schockaert

A particularly striking example is the performance of AraBERT, an LM for the Arabic language, which is successful in categorizing social media posts in Arabic dialects, despite only having been trained on Modern Standard Arabic.

Word Embeddings

Definition Extraction Feature Analysis: From Canonical to Naturally-Occurring Definitions

no code implementations COLING (CogALex) 2020 Mireia Roig Mirapeix, Luis Espinosa Anke, Jose Camacho-Collados

Textual definitions constitute a fundamental source of knowledge when seeking the meaning of words, and they are the cornerstone of lexical resources like glossaries, dictionaries, encyclopedias, or thesauri.

Definition Extraction

Pre-Training Language Models for Identifying Patronizing and Condescending Language: An Analysis

no code implementations LREC 2022 Carla Perez Almendros, Luis Espinosa Anke, Steven Schockaert

Patronizing and Condescending Language (PCL) is a subtle but harmful type of discourse, yet the task of recognizing PCL remains under-studied by the NLP community.

CollFrEn: Rich Bilingual English–French Collocation Resource

1 code implementation COLING (MWE) 2020 Beatriz Fisas, Luis Espinosa Anke, Joan Codina-Filbá, Leo Wanner

Collocations, in the sense of idiosyncratic lexical co-occurrences of two syntactically bound words, traditionally pose a challenge to language learners and many Natural Language Processing (NLP) applications alike.

Machine Translation · Relation Classification +3

Tweet Insights: A Visualization Platform to Extract Temporal Insights from Twitter

no code implementations 4 Aug 2023 Daniel Loureiro, Kiamehr Rezaee, Talayeh Riahi, Francesco Barbieri, Leonardo Neves, Luis Espinosa Anke, Jose Camacho-Collados

This paper introduces a large collection of time series data derived from Twitter, postprocessed using word embedding techniques, as well as specialized fine-tuned language models.

Time Series

Distilling Hypernymy Relations from Language Models: On the Effectiveness of Zero-Shot Taxonomy Induction

no code implementations *SEM (NAACL) 2022 Devansh Jain, Luis Espinosa Anke

In this paper, we analyze zero-shot taxonomy learning methods which are based on distilling knowledge from language models via prompting and sentence scoring.

Sentence

Deriving Word Vectors from Contextualized Language Models using Topic-Aware Mention Selection

1 code implementation ACL (RepL4NLP) 2021 Yixiao Wang, Zied Bouraoui, Luis Espinosa Anke, Steven Schockaert

Second, rather than learning a word vector directly, we use a topic model to partition the contexts in which words appear, and then learn different topic-specific vectors for each word.

Sentence · Word Embeddings

Evaluating language models for the retrieval and categorization of lexical collocations

1 code implementation EACL 2021 Luis Espinosa Anke, Joan Codina-Filba, Leo Wanner

We first construct a dataset of occurrences of lexical collocations in context, categorized into 17 representative semantic categories.

Retrieval

Collocation Classification with Unsupervised Relation Vectors

1 code implementation ACL 2019 Luis Espinosa Anke, Steven Schockaert, Leo Wanner

Lexical relation classification is the task of predicting whether a certain relation holds between a given pair of words.

Classification · General Classification +3

Example-based Acquisition of Fine-grained Collocation Resources

no code implementations LREC 2016 Sara Rodríguez-Fernández, Roberto Carlini, Luis Espinosa Anke, Leo Wanner

Collocations such as "heavy rain" or "make [a] decision" are combinations of two elements where one (the base) is freely chosen, while the choice of the other (collocate) is restricted, depending on the base.

Word Embeddings
