Search Results for author: Matej Ulčar

Found 11 papers, 2 papers with code

EMBEDDIA hackathon report: Automatic sentiment and viewpoint analysis of Slovenian news corpus on the topic of LGBTIQ+

no code implementations • EACL (Hackashop) 2021 • Matej Martinc, Nina Perger, Andraž Pelicon, Matej Ulčar, Andreja Vezovnik, Senja Pollak

We conduct automatic sentiment and viewpoint analysis of the newly created Slovenian news corpus containing articles related to the topic of LGBTIQ+, employing a state-of-the-art news sentiment classifier and a system for semantic change detection.

Tasks: Change Detection
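As an illustration of the classification step this report describes, here is a minimal sketch using the Hugging Face `transformers` pipeline. The model ID is hypothetical: the report does not name a public checkpoint for its news sentiment classifier, so substitute your own fine-tuned model.

```python
from transformers import pipeline

# Hypothetical checkpoint -- replace with your own Slovenian news
# sentiment model; the report does not publish one under a known hub ID.
clf = pipeline("text-classification", model="path/to/slovenian-news-sentiment")

articles = [
    "Primer kratkega novinarskega odstavka o obravnavani temi.",
]
for article in articles:
    # truncation=True clips inputs that exceed the model's token limit
    print(clf(article, truncation=True))
```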

Training dataset and dictionary sizes matter in BERT models: the case of Baltic languages

no code implementations • 20 Dec 2021 • Matej Ulčar, Marko Robnik-Šikonja

To analyze the importance of focusing on a single language and the importance of a large training set, we compare created models with existing monolingual and multilingual BERT models for Estonian, Latvian, and Lithuanian.

Tasks: Dependency Parsing, Named Entity Recognition, +3
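One cheap way to see why dictionary (vocabulary) size matters, as this paper argues, is to compare how many subwords different tokenizers need for the same Baltic-language sentence. A minimal sketch, assuming the paper's trilingual model is available on the Hugging Face hub as `EMBEDDIA/litlat-bert`:

```python
from transformers import AutoTokenizer

# Assumed hub ID for the trilingual Lithuanian-Latvian-English model.
trilingual = AutoTokenizer.from_pretrained("EMBEDDIA/litlat-bert")
multilingual = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")

sentence = "Šiandien Vilniuje šviečia saulė."  # Lithuanian: "The sun is shining in Vilnius today."
for name, tok in [("trilingual", trilingual), ("mBERT", multilingual)]:
    pieces = tok.tokenize(sentence)
    # A tokenizer whose dictionary covers the language well splits
    # words into fewer, more meaningful pieces.
    print(f"{name}: vocab={len(tok)}, subwords={len(pieces)}: {pieces}")
```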

Evaluation of contextual embeddings on less-resourced languages

no code implementations • 22 Jul 2021 • Matej Ulčar, Aleš Žagar, Carlos S. Armendariz, Andraž Repar, Senja Pollak, Matthew Purver, Marko Robnik-Šikonja

The current dominance of deep neural networks in natural language processing is based on contextual embeddings such as ELMo, BERT, and BERT derivatives.

Tasks: Dependency Parsing
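For readers unfamiliar with contextual embeddings, the sketch below extracts one vector per subword token from a BERT-family model; `bert-base-multilingual-cased` is just a widely available stand-in, not necessarily one of the models evaluated in the paper.

```python
import torch
from transformers import AutoModel, AutoTokenizer

name = "bert-base-multilingual-cased"  # stand-in checkpoint
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModel.from_pretrained(name)

inputs = tokenizer("A short example sentence.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Unlike static word vectors, each token's vector depends on its context:
# shape (batch=1, seq_len, hidden_size).
token_vectors = outputs.last_hidden_state
print(token_vectors.shape)
```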

Cross-lingual alignments of ELMo contextual embeddings

no code implementations • 30 Jun 2021 • Matej Ulčar, Marko Robnik-Šikonja

Building machine learning prediction models for a specific NLP task requires sufficient training data, which can be difficult to obtain for less-resourced languages.

Tasks: Dependency Parsing, Named Entity Recognition, +4
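A common way to align embedding spaces across languages is an orthogonal (Procrustes) mapping fit on pairs from a bilingual dictionary. The sketch below shows that standard technique, not necessarily this paper's exact method, which evaluates its own alignment setup for contextual ELMo embeddings.

```python
import numpy as np

def fit_orthogonal_mapping(X, Y):
    """Solve min_W ||X @ W - Y||_F over orthogonal W (Procrustes).

    X, Y: (n_pairs, dim) arrays of source/target embeddings for
    word pairs from a bilingual dictionary.
    """
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

# Toy usage with random vectors standing in for real ELMo embeddings.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 64))
W_true, _ = np.linalg.qr(rng.normal(size=(64, 64)))
Y = X @ W_true
W = fit_orthogonal_mapping(X, Y)
print(np.allclose(X @ W, Y))  # True: the mapping is recovered
```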

FinEst BERT and CroSloEngual BERT: less is more in multilingual models

no code implementations • 14 Jun 2020 • Matej Ulčar, Marko Robnik-Šikonja

Large pretrained masked language models have become state-of-the-art solutions for many NLP problems.

Tasks: Dependency Parsing, NER, +3
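To probe a pretrained masked language model like these, a fill-mask query is the quickest check. A minimal sketch, assuming the Croatian-Slovenian-English model is published on the Hugging Face hub as `EMBEDDIA/crosloengual-bert`:

```python
from transformers import pipeline

# Assumed hub ID for the trilingual model.
fill = pipeline("fill-mask", model="EMBEDDIA/crosloengual-bert")

# Use the tokenizer's own mask token rather than hard-coding [MASK]/<mask>.
sentence = f"Ljubljana je glavno mesto {fill.tokenizer.mask_token}."
for pred in fill(sentence):
    print(f"{pred['token_str']!r}: {pred['score']:.3f}")
```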

High Quality ELMo Embeddings for Seven Less-Resourced Languages

no code implementations • 22 Nov 2019 • Matej Ulčar, Marko Robnik-Šikonja

Recent results show that deep neural networks using contextual embeddings significantly outperform non-contextual embeddings on a majority of text classification tasks.

Tasks: NER, Text Classification, +2
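Pretrained ELMo weights such as these are typically loaded through the (now legacy) AllenNLP library. A minimal sketch, with hypothetical local file paths standing in for the per-language option and weight files distributed by the authors:

```python
from allennlp.modules.elmo import Elmo, batch_to_ids  # allennlp <= 2.x

# Hypothetical paths -- download the per-language options/weights separately.
options_file = "slovenian_elmo_options.json"
weight_file = "slovenian_elmo_weights.hdf5"

elmo = Elmo(options_file, weight_file, num_output_representations=1, dropout=0.0)
character_ids = batch_to_ids([["To", "je", "kratek", "stavek", "."]])
output = elmo(character_ids)
# One contextual vector per token: shape (1, num_tokens, 1024).
print(output["elmo_representations"][0].shape)
```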
