Due to the success of pre-trained language models, versions for languages other than English have been released in recent years.
Current language models are usually trained using a self-supervised scheme, where the main focus is learning representations at the word or sentence level.
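To make this concrete, below is a minimal sketch of the masked-token objective behind many such self-supervised models, in which the training labels come from the text itself rather than from human annotation. The function name, the special token string, and the 15% masking rate (a BERT-style convention) are illustrative assumptions, not drawn from the text above.

```python
import random

MASK = "[MASK]"

def mask_tokens(tokens, mask_prob=0.15, rng=random):
    """Build a self-supervised training pair from raw text alone:
    a corrupted input sequence plus the original tokens as labels."""
    inputs, labels = [], []
    for token in tokens:
        if rng.random() < mask_prob:
            inputs.append(MASK)    # hide the token in the input
            labels.append(token)   # the model must recover it
        else:
            inputs.append(token)
            labels.append(None)    # no loss computed at this position
    return inputs, labels

# Example: the sentence supervises itself; no human labels are needed.
inputs, labels = mask_tokens("the cat sat on the mat".split())
```

A model trained to predict the hidden tokens must encode information about each word's context, which is how word-level representations emerge without annotated data.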
Our approach produces a summary covering the most relevant aspects of a news item that users comment on, incorporating this social context as an additional source of information for summarizing texts in online social networks.
The field of natural language understanding has experienced rapid progress in the last few years, with impressive results across a range of tasks.
We use IDF-weighted combinations of word embeddings to represent queries, showing that these representations outperform the averaged word embeddings recently proposed in the literature.
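As an illustration of this kind of query representation, the sketch below weights each word vector by its inverse document frequency before averaging, so rare, discriminative terms dominate the query vector while frequent terms contribute little; an unweighted average, by contrast, treats all terms equally. The function names and the toy IDF computation are illustrative assumptions; in practice the embeddings would come from whatever pre-trained model is in use.

```python
import math
from collections import Counter

import numpy as np

def idf_weights(tokenized_docs):
    """Compute IDF for each term over a collection of tokenized documents."""
    n_docs = len(tokenized_docs)
    doc_freq = Counter(term for doc in tokenized_docs for term in set(doc))
    return {term: math.log(n_docs / df) for term, df in doc_freq.items()}

def idf_weighted_query_vector(query_tokens, embeddings, idf):
    """Combine per-word vectors into a single query vector.

    Each vector is weighted by its term's IDF, unlike a plain average
    in which every word contributes equally.
    """
    vectors, weights = [], []
    for token in query_tokens:
        if token in embeddings:
            vectors.append(embeddings[token])
            weights.append(idf.get(token, 0.0))
    if not vectors or sum(weights) == 0.0:
        return None  # out-of-vocabulary or all-zero-IDF query
    return np.average(np.stack(vectors), axis=0, weights=weights)
```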
The state of the art, previously dominated by pre-trained word embeddings, is now being pushed forward by large pre-trained contextual representation models.
We perform an automatic analysis of television news programs based on the closed captions that accompany them.