Search Results for author: Danilo Vucetic

Found 3 papers, 1 paper with code

Efficient Fine-Tuning of Compressed Language Models with Learners

no code implementations · 3 Aug 2022 · Danilo Vucetic, Mohammadreza Tayaranian, Maryam Ziaeefard, James J. Clark, Brett H. Meyer, Warren J. Gross

We introduce Learner modules and priming, novel fine-tuning methods that exploit the overparameterization of pre-trained language models to improve convergence speed and resource utilization.

CoLA
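Since no code is released for this paper, the following is only a minimal sketch of the general idea the abstract describes: keep the pretrained weights frozen and train a small added module, so far fewer parameters receive gradient updates. The low-rank form, the names (`A`, `B`, `forward`), and the toy objective are all assumptions for illustration, not the paper's actual Learner or priming formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, rank = 8, 4, 2

W = rng.normal(size=(d_out, d_in))   # pretrained weight, kept frozen
A = np.zeros((d_out, rank))          # small trainable module (assumed low-rank)
B = rng.normal(size=(rank, d_in))

def forward(x):
    # frozen pretrained path plus the small trainable path
    return W @ x + A @ (B @ x)

# toy single-sample regression, updating only A and B
x = rng.normal(size=(d_in,))
y_target = rng.normal(size=(d_out,))
init_loss = float(np.sum((forward(x) - y_target) ** 2))

lr = 0.01
for _ in range(500):
    err = forward(x) - y_target
    # gradients w.r.t. the trainable module only; W is never touched
    grad_A = np.outer(err, B @ x)
    grad_B = np.outer(A.T @ err, x)
    A -= lr * grad_A
    B -= lr * grad_B

trainable = A.size + B.size
total = W.size + trainable
```

The point of the sketch is the parameter accounting: only `A.size + B.size` parameters are read, updated, and written each step, while the frozen `W` incurs no optimizer state or gradient traffic.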

Efficient Fine-Tuning of BERT Models on the Edge

no code implementations · 3 May 2022 · Danilo Vucetic, Mohammadreza Tayaranian, Maryam Ziaeefard, James J. Clark, Brett H. Meyer, Warren J. Gross

FAR reduces fine-tuning time on the DistilBERT model and CoLA dataset by 30%, and time spent on memory operations by 47%.

CoLA
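This paper also has no released code, so the snippet below is only a hedged sketch of the freezing idea behind the reported savings: if a fraction of parameter groups is frozen during fine-tuning, those groups need no gradient computation or optimizer read/write traffic. The selection heuristic (freeze the groups with the lowest importance score) and all names here are illustrative assumptions, not the paper's actual FAR procedure.

```python
def far_style_freeze(param_groups, freeze_fraction):
    """Return the names of groups left trainable.

    param_groups: dict mapping name -> (num_params, importance_score).
    Groups with the lowest score are frozen, so each training step
    touches fewer parameters in memory (hypothetical heuristic).
    """
    ranked = sorted(param_groups, key=lambda n: param_groups[n][1])
    n_freeze = int(len(ranked) * freeze_fraction)
    frozen = set(ranked[:n_freeze])
    return [n for n in param_groups if n not in frozen]

# toy parameter groups with made-up sizes and importance scores
groups = {
    "layer0.ffn": (1000, 0.2),
    "layer1.ffn": (1000, 0.9),
    "layer2.ffn": (1000, 0.1),
    "classifier": (100, 1.0),
}
trainable = far_style_freeze(groups, freeze_fraction=0.5)
params_saved = sum(groups[n][0] for n in groups if n not in trainable)
```

Freezing half of the (equally sized) encoder groups here skips gradient and optimizer memory operations for 2000 of the 3100 parameters, which is the same mechanism by which the paper attributes its reduction in time spent on memory operations.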
