Search Results for author: Rudolf Rosa

Found 39 papers, 6 papers with code

Eyes on the Parse: Using Gaze Features in Syntactic Parsing

no code implementations LANTERN (COLING) 2020 Abhishek Agrawal, Rudolf Rosa

We also augment a graph-based parser with eye-tracking features and parse the Dundee Corpus to corroborate our findings from the sequence labelling parser.

Dependency Parsing
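
As a rough illustration of feeding gaze signals to a parser, the sketch below concatenates a few eye-tracking measures with word embeddings before a BiLSTM encoder; the feature set, dimensions, and architecture are assumptions for illustration, not the paper's model.

```python
# Hypothetical sketch: enriching token representations with gaze features
# before scoring dependency arcs. Feature names and sizes are illustrative.
import torch
import torch.nn as nn

class GazeAugmentedEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, n_gaze_feats=4, hidden=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # BiLSTM over [word embedding ; gaze features] for each token
        self.lstm = nn.LSTM(emb_dim + n_gaze_feats, hidden,
                            batch_first=True, bidirectional=True)

    def forward(self, word_ids, gaze_feats):
        # word_ids: (batch, seq_len)
        # gaze_feats: (batch, seq_len, n_gaze_feats), e.g. first-fixation
        # duration, total reading time, fixation count, regression count
        x = torch.cat([self.embed(word_ids), gaze_feats], dim=-1)
        out, _ = self.lstm(x)
        return out  # contextual states to score dependency arcs or labels
```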

TEAM UFAL @ CreativeSumm 2022: BART and SamSum based few-shot approach for creative Summarization

no code implementations COLING (CreativeSumm) 2022 Rishu Kumar, Rudolf Rosa

This system description paper details TEAM UFAL’s approach for the SummScreen, TVMegasite subtask of the CreativeSumm shared task.

Few-Shot Learning
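
A minimal sketch of few-shot fine-tuning a BART summarizer with the transformers library, in the spirit of the described system; the checkpoint name and toy example are placeholders (the actual submission builds on a SamSum-adapted BART and TVMegaSite scripts).

```python
# Few-shot fine-tuning of a BART summarizer on a handful of (transcript, summary)
# pairs. Checkpoint and data are stand-ins, not the system's actual setup.
import torch
from transformers import BartTokenizer, BartForConditionalGeneration

tok = BartTokenizer.from_pretrained("facebook/bart-large-cnn")  # assumed checkpoint
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large-cnn")

few_shot = [
    ("JOHN: Hi, how have you been? ... (long episode transcript)",
     "John catches up with an old friend and hears about the wedding."),
]

optim = torch.optim.AdamW(model.parameters(), lr=3e-5)
model.train()
for _ in range(3):  # a few passes over the tiny training set
    for src, tgt in few_shot:
        enc = tok(src, truncation=True, max_length=1024, return_tensors="pt")
        labels = tok(tgt, truncation=True, max_length=128, return_tensors="pt").input_ids
        loss = model(**enc, labels=labels).loss
        loss.backward()
        optim.step()
        optim.zero_grad()
```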

Measuring Memorization Effect in Word-Level Neural Networks Probing

no code implementations 29 Jun 2020 Rudolf Rosa, Tomáš Musil, David Mareček

In classical probing, a classifier is trained on the representations to extract the target linguistic information.

Machine Translation · Memorization +1
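
For the classical probing setup mentioned above, a minimal sketch: a linear classifier is fit on frozen representations to predict a linguistic label. The random arrays stand in for real word-level representations and tags; they are not the paper's data.

```python
# Classical probing sketch: train a linear probe on frozen word representations.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 768))    # stand-in for word-level network representations
y = rng.integers(0, 17, size=1000)  # stand-in for 17 UPOS tags

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
probe = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
# On random data this stays near chance; a high score on real representations
# suggests the information is encoded (or, as the paper asks, merely memorized).
print("probe accuracy:", probe.score(X_te, y_te))
```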

Universal Dependencies according to BERT: both more specific and more general

2 code implementations Findings of the Association for Computational Linguistics 2020 Tomasz Limisiewicz, Rudolf Rosa, David Mareček

This work focuses on analyzing the form and extent of syntactic abstraction captured by BERT by extracting labeled dependency trees from self-attentions.

Relation
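
A bare-bones sketch of reading head-dependent structure off BERT self-attentions: average one layer's heads and attach each token to the token it attends to most. The layer choice and the greedy attachment are simplifications of the paper's extraction procedure, not a reproduction of it.

```python
# Greedy "dependency" extraction from averaged BERT self-attention weights.
import torch
from transformers import BertTokenizer, BertModel

tok = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)

sent = "the quick brown fox jumps over the lazy dog"
enc = tok(sent, return_tensors="pt")
with torch.no_grad():
    attn = model(**enc).attentions  # one (batch, heads, seq, seq) tensor per layer

layer = attn[7][0].mean(dim=0)      # layer 8, averaged over heads (arbitrary choice)
tokens = tok.convert_ids_to_tokens(enc["input_ids"][0])
for i in range(1, len(tokens) - 1):                     # skip [CLS] and [SEP]
    head = layer[i, 1:len(tokens) - 1].argmax().item() + 1
    print(f"{tokens[i]:>6s} -> {tokens[head]}")
```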

How Language-Neutral is Multilingual BERT?

1 code implementation 8 Nov 2019 Jindřich Libovický, Rudolf Rosa, Alexander Fraser

Multilingual BERT (mBERT) provides sentence representations for 104 languages, which are useful for many multi-lingual tasks.

Retrieval · Sentence +2
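
A small sketch of obtaining mBERT sentence representations by mean-pooling the final hidden layer and comparing sentences across languages; the pooling strategy is an assumption made here, not necessarily the one analyzed in the paper.

```python
# Mean-pooled mBERT sentence representations and a cross-lingual similarity check.
import torch
from transformers import BertTokenizer, BertModel

tok = BertTokenizer.from_pretrained("bert-base-multilingual-cased")
model = BertModel.from_pretrained("bert-base-multilingual-cased")

def embed(sentence):
    enc = tok(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]  # (seq_len, 768)
    return hidden.mean(dim=0)                       # mean-pool over subwords

en = embed("The cat is sleeping on the sofa.")
cs = embed("Kočka spí na pohovce.")
print(torch.cosine_similarity(en, cs, dim=0).item())
```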

Unsupervised Lemmatization as Embeddings-Based Word Clustering

1 code implementation 22 Aug 2019 Rudolf Rosa, Zdeněk Žabokrtský

We focus on the task of unsupervised lemmatization, i.e., grouping together inflected forms of one word under one label (a lemma) without the use of annotated training data.

Clustering · LEMMA +1
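
To illustrate the clustering view of lemmatization, the toy sketch below embeds word forms with word2vec and groups them with agglomerative clustering. The corpus is far too small to yield meaningful clusters, and the embedding and cluster settings are arbitrary stand-ins for the paper's setup.

```python
# Toy sketch: embed word forms and cluster them, hoping inflected forms of one
# word (dog/dogs, chase/chased/chases, ...) land in the same cluster.
from gensim.models import Word2Vec
from sklearn.cluster import AgglomerativeClustering

corpus = [
    ["dogs", "chase", "cats"],
    ["a", "dog", "chased", "a", "cat"],
    ["the", "cat", "chases", "the", "dog"],
    ["cats", "and", "dogs", "play"],
]
w2v = Word2Vec(corpus, vector_size=50, window=2, min_count=1, epochs=200, seed=1)

words = sorted(w2v.wv.key_to_index)
vecs = [w2v.wv[w] for w in words]
labels = AgglomerativeClustering(n_clusters=6).fit_predict(vecs)
for cluster in sorted(set(labels)):
    print(cluster, [w for w, l in zip(words, labels) if l == cluster])
```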

Inducing Syntactic Trees from BERT Representations

no code implementations 27 Jun 2019 Rudolf Rosa, David Mareček

We use the English model of BERT and explore how a deletion of one word in a sentence changes representations of other words.

Language Modelling · Sentence
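
A simplified sketch of the deletion probe described above: compare each remaining word's BERT representation before and after removing one word from the sentence. It assumes every word maps to a single WordPiece so that alignment stays trivial; the example sentence and distance measure are choices made here.

```python
# Measure how much each remaining word's representation moves when one word
# is deleted from the sentence.
import torch
from transformers import BertTokenizer, BertModel

tok = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

def reps(words):
    enc = tok(" ".join(words), return_tensors="pt")
    with torch.no_grad():
        return model(**enc).last_hidden_state[0, 1:-1]  # drop [CLS]/[SEP]

words = ["the", "old", "dog", "ran", "away"]  # each word is a single WordPiece
full = reps(words)
deleted = 1                                   # remove "old"
reduced = reps(words[:deleted] + words[deleted + 1:])

kept = [i for i in range(len(words)) if i != deleted]
for new_i, orig_i in enumerate(kept):
    shift = 1 - torch.cosine_similarity(full[orig_i], reduced[new_i], dim=0).item()
    print(f"{words[orig_i]:>5s}: representation shift {shift:.3f}")
```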

From Balustrades to Pierre Vinken: Looking for Syntax in Transformer Self-Attentions

no code implementations WS 2019 David Mareček, Rudolf Rosa

We inspect the multi-head self-attention in Transformer NMT encoders for three source languages, looking for patterns that could have a syntactic interpretation.

NMT · Position

Extracting Syntactic Trees from Transformer Encoder Self-Attentions

no code implementations WS 2018 David Mareček, Rudolf Rosa

This is work in progress on extracting sentence tree structures from the encoder's self-attention weights when translating into another language using the Transformer neural network architecture.

Machine Translation · Sentence

Slavic Forest, Norwegian Wood

no code implementations WS 2017 Rudolf Rosa, Daniel Zeman, David Mareček, Zdeněk Žabokrtský

We once had a corp, or should we say, it once had us
They showed us its tags, isn't it great, unified tags
They asked us to parse and they told us to use everything
So we looked around and we noticed there was near nothing
We took other langs, bitext aligned: words one-to-one
We played for two weeks, and then they said, here is the test
The parser kept training till morning, just until deadline
So we had to wait and hope what we get would be just fine
And, when we awoke, the results were done, we saw we'd won
So, we wrote this paper, isn't it good, Norwegian wood.

Dependency Parsing · Machine Translation +1

Parsing Natural Language Sentences by Semi-supervised Methods

no code implementations 16 Jun 2015 Rudolf Rosa

We present our work on semi-supervised parsing of natural language sentences, focusing on multi-source crosslingual transfer of delexicalized dependency parsers.
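
A minimal illustration of the delexicalization step underlying this kind of cross-lingual transfer: word forms and lemmas in a CoNLL-U treebank are replaced by their UPOS tags, so a parser trained on source-language treebanks operates on language-neutral input. The file paths are hypothetical, and this is only the preprocessing step, not the transfer itself.

```python
# Delexicalize a CoNLL-U treebank: FORM and LEMMA columns are overwritten
# with the UPOS tag, keeping everything else (heads, deprels) intact.
def delexicalize_conllu(in_path, out_path):
    with open(in_path, encoding="utf-8") as fin, \
         open(out_path, "w", encoding="utf-8") as fout:
        for line in fin:
            if line.startswith("#") or not line.strip():
                fout.write(line)          # keep comments and sentence breaks
                continue
            cols = line.rstrip("\n").split("\t")
            upos = cols[3]
            cols[1] = upos                # FORM  -> UPOS
            cols[2] = upos                # LEMMA -> UPOS
            fout.write("\t".join(cols) + "\n")

# delexicalize_conllu("source-train.conllu", "source-train.delex.conllu")  # hypothetical paths
```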
