Search Results for author: Lukasz Roguski

Found 5 papers, 1 paper with code

Speeding Up Transformer Training By Using Dataset Subsampling - An Exploratory Analysis

no code implementations · EMNLP (sustainlp) 2021 · Lovre Torbarina, Velimir Mihelčić, Bruno Šarlija, Lukasz Roguski, Željko Kraljević

Transformer-based models have greatly advanced the field of natural language processing, and while they achieve state-of-the-art results on a wide range of tasks, they are cumbersome in parameter size.

Text Classification

Challenges and Opportunities of Using Transformer-Based Multi-Task Learning in NLP Through ML Lifecycle: A Survey

no code implementations · 16 Aug 2023 · Lovre Torbarina, Tin Ferkovic, Lukasz Roguski, Velimir Mihelcic, Bruno Sarlija, Zeljko Kraljevic

The increasing adoption of natural language processing (NLP) models across industries has led to practitioners' need for machine learning systems to handle these models efficiently, from training to serving them in production.

Continual Learning · Multi-Task Learning

Deployment of a Free-Text Analytics Platform at a UK National Health Service Research Hospital: CogStack at University College London Hospitals

no code implementations · 15 Aug 2021 · Kawsar Noor, Lukasz Roguski, Alex Handy, Roman Klapaukh, Amos Folarin, Luis Romao, Joshua Matteson, Nathan Lea, Leilei Zhu, Wai Keong Wong, Anoop Shah, Richard J Dobson

To tackle this problem at University College London Hospitals, we have deployed an enhanced version of the CogStack platform: an information retrieval platform with natural language processing capabilities, which we have configured to process the hospital's existing and legacy records.

Information Retrieval
