Search Results for author: Maryam Ziaeefard

Found 5 papers, 2 papers with code

CES-KD: Curriculum-based Expert Selection for Guided Knowledge Distillation

no code implementations • 15 Sep 2022 • Ibtihel Amara, Maryam Ziaeefard, Brett H. Meyer, Warren Gross, James J. Clark

Knowledge distillation (KD) is an effective tool for compressing deep classification models for edge devices.

Knowledge Distillation
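
As background for the snippet above, here is a minimal PyTorch sketch of a standard distillation loss (softened teacher/student KL plus hard-label cross-entropy). It illustrates plain KD only, not the paper's curriculum-based expert selection; the temperature and alpha values are illustrative assumptions.

# Minimal sketch of the standard KD loss; not CES-KD itself.
# `temperature` and `alpha` are illustrative assumptions.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, temperature=4.0, alpha=0.5):
    # Soft targets: match the teacher's softened output distribution.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Hard targets: ordinary cross-entropy on the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard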

Efficient Fine-Tuning of Compressed Language Models with Learners

no code implementations • 3 Aug 2022 • Danilo Vucetic, Mohammadreza Tayaranian, Maryam Ziaeefard, James J. Clark, Brett H. Meyer, Warren J. Gross

We introduce Learner modules and priming, novel fine-tuning methods that exploit the overparameterization of pre-trained language models to improve convergence speed and resource utilization.

CoLA
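
The snippet above does not spell out the Learner module architecture, so the following is only a hedged, generic illustration of the underlying pattern: training a small number of added parameters on top of a frozen pre-trained backbone. The bottleneck shape and hidden size are assumptions, and the paper's priming procedure is not reproduced.

# Generic adapter-style sketch, not the paper's Learner module.
import torch
import torch.nn as nn

class BottleneckModule(nn.Module):
    """Small trainable bottleneck applied to a hidden state."""
    def __init__(self, hidden=768, bottleneck=64):
        super().__init__()
        self.down = nn.Linear(hidden, bottleneck)
        self.up = nn.Linear(bottleneck, hidden)

    def forward(self, x):
        # Residual connection preserves the frozen backbone's features.
        return x + self.up(torch.relu(self.down(x)))

def freeze_backbone(model: nn.Module):
    # Freeze every pre-trained weight; only added modules train.
    for p in model.parameters():
        p.requires_grad = False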

Efficient Fine-Tuning of BERT Models on the Edge

no code implementations • 3 May 2022 • Danilo Vucetic, Mohammadreza Tayaranian, Maryam Ziaeefard, James J. Clark, Brett H. Meyer, Warren J. Gross

FAR (Freeze And Reconfigure) reduces fine-tuning time on the DistilBERT model and CoLA dataset by 30%, and time spent on memory operations by 47%.

CoLA
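
FAR's gains come from freezing parameters so they need no gradients or optimizer state during fine-tuning. Below is a minimal sketch of that general idea using Hugging Face Transformers; freezing everything except DistilBERT's classification head is an illustrative choice, not the paper's actual parameter-selection scheme.

# Minimal parameter-freezing sketch; FAR's dynamic selection of
# which FFN weights stay active is not reproduced here.
from transformers import DistilBertForSequenceClassification

model = DistilBertForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)
for name, param in model.named_parameters():
    # Train only the classification head; freeze the backbone.
    param.requires_grad = name.startswith(("pre_classifier", "classifier"))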

Towards Knowledge-Augmented Visual Question Answering

1 code implementation • COLING 2020 • Maryam Ziaeefard, Freddy Lecue

We propose a model that captures the interactions between objects in a visual scene and entities in an external knowledge source.

General Knowledge, Graph Attention
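
The abstract describes interactions between scene objects and external knowledge entities; one plausible ingredient is cross-attention from visual object features to knowledge-graph entity embeddings. The sketch below uses PyTorch's nn.MultiheadAttention as an assumed stand-in; the dimensions and module choice are not taken from the paper.

# Hedged sketch of object-to-entity cross-attention; the actual
# model architecture in the paper may differ.
import torch
import torch.nn as nn

class ObjectEntityAttention(nn.Module):
    def __init__(self, dim=512, heads=8):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, objects, entities):
        # objects:  (batch, num_objects, dim)  visual features
        # entities: (batch, num_entities, dim) KG entity embeddings
        # Each object attends over the candidate knowledge entities.
        fused, weights = self.attn(query=objects, key=entities, value=entities)
        return fused, weights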
