Search Results for author: Velimir Mihelčić

Found 1 paper, 0 papers with code

Speeding Up Transformer Training By Using Dataset Subsampling - An Exploratory Analysis

no code implementations • EMNLP (sustainlp) 2021 • Lovre Torbarina, Velimir Mihelčić, Bruno Šarlija, Lukasz Roguski, Željko Kraljević

Transformer-based models have greatly advanced progress in natural language processing, and while they achieve state-of-the-art results on a wide range of tasks, their large parameter counts make them cumbersome to train.
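The paper's specific subsampling strategy is not reproduced here, but the general idea of training on a random subset of the data can be sketched with the Hugging Face `datasets` library; the dataset name and the 10% fraction below are arbitrary placeholders, not the paper's setup.

```python
from datasets import load_dataset

# Load the training split of a text-classification dataset
# (AG News is an arbitrary example, not necessarily the paper's benchmark).
train = load_dataset("ag_news", split="train")

# Keep a random 10% subsample; the fraction is a hypothetical
# hyperparameter chosen only for illustration.
fraction = 0.10
subsample = train.shuffle(seed=42).select(range(int(fraction * len(train))))

print(f"Training on {len(subsample)} of {len(train)} examples")
```

The smaller `subsample` would then be fed to a standard training loop in place of the full dataset; fewer examples per epoch directly reduce wall-clock training time, at some potential cost in accuracy.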

Tasks: Text Classification
