Search Results for author: Mohammadmahdi Nouriborji

Found 5 papers, 5 papers with code

MiniALBERT: Model Distillation via Parameter-Efficient Recursive Transformers

1 code implementation • 12 Oct 2022 • Mohammadmahdi Nouriborji, Omid Rohanian, Samaneh Kouchaki, David A. Clifton

Different strategies have been proposed in the literature to alleviate these problems, aiming to create effective compact models that nearly match the performance of their larger counterparts.
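The paper's title points to parameter-efficient recursive transformers, i.e. reusing one set of layer weights across depth (as in ALBERT-style cross-layer sharing). The snippet does not describe MiniALBERT's actual architecture, so the sketch below is only a generic illustration of the idea, with a toy feed-forward layer standing in for a transformer block; all names and sizes are made up for the example.

```python
import numpy as np

def layer(x, W, b):
    """Toy stand-in for a transformer block: linear map plus ReLU."""
    return np.maximum(0.0, x @ W + b)

def recursive_encoder(x, W, b, depth):
    """Apply the SAME weights `depth` times (cross-layer sharing),
    so the parameter count does not grow with depth."""
    for _ in range(depth):
        x = layer(x, W, b)
    return x

rng = np.random.default_rng(0)
d = 4                                   # hypothetical hidden size
W = rng.normal(scale=0.1, size=(d, d))  # one shared weight matrix
b = np.zeros(d)
x = rng.normal(size=(1, d))

deep = recursive_encoder(x, W, b, depth=6)
# A 6-step "stack" uses the same d*d + d parameters as a 1-layer one.
n_params = W.size + b.size
```

The point of the sketch: unrolling the shared layer six times changes compute but not memory footprint, which is one route to the compact models the snippet describes.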

On the Effectiveness of Compact Biomedical Transformers

1 code implementation • 7 Sep 2022 • Omid Rohanian, Mohammadmahdi Nouriborji, Samaneh Kouchaki, David A. Clifton

Language models pre-trained on biomedical corpora, such as BioBERT, have recently shown promising results on downstream biomedical tasks.

Continual Learning • Knowledge Distillation • +1
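One of the tags above is Knowledge Distillation, the standard route to the compact biomedical models this paper studies. As a reference point (not the paper's specific training setup), the classic soft-target loss from Hinton et al. compares temperature-softened teacher and student distributions; the logits below are arbitrary example values.

```python
import numpy as np

def softmax(z, T=1.0):
    """Numerically stable softmax with temperature T."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on T-softened distributions, scaled by
    T^2 so gradients stay comparable across temperatures."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))

teacher = [2.0, 1.0, 0.1]                        # example logits
loss_same = distillation_loss(teacher, teacher)  # identical logits -> 0
loss_diff = distillation_loss([0.0, 0.0, 3.0], teacher)
```

A higher temperature flattens both distributions, exposing the teacher's relative preferences among wrong classes, which is the extra signal distillation gives a small student.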

Nowruz at SemEval-2022 Task 7: Tackling Cloze Tests with Transformers and Ordinal Regression

1 code implementation • SemEval (NAACL) 2022 • Mohammadmahdi Nouriborji, Omid Rohanian, David Clifton

This paper outlines the system with which team Nowruz participated in SemEval 2022 Task 7: Identifying Plausible Clarifications of Implicit and Underspecified Phrases, covering both subtasks A and B.

Multi-Task Learning • regression
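The title mentions ordinal regression, a natural fit when labels are ordered ratings (as in the task's plausibility scale). The snippet does not give the paper's exact formulation, so here is only a generic cumulative-link sketch: a single real-valued score is cut by ordered thresholds into class probabilities; the threshold values are invented for illustration.

```python
import numpy as np

def ordinal_probs(score, thresholds):
    """Cumulative-link ordinal model: P(class <= k) = sigmoid(t_k - score),
    so K ordered thresholds yield K+1 class probabilities."""
    t = np.asarray(thresholds, dtype=float)
    cdf_inner = 1.0 / (1.0 + np.exp(-(t - score)))  # increasing in t
    cdf = np.concatenate(([0.0], cdf_inner, [1.0]))
    return np.diff(cdf)  # per-class probabilities

# Hypothetical: 4 cuts -> 5 ordinal classes (e.g. a 1-5 plausibility scale).
thresholds = [-1.0, 0.0, 1.0, 2.0]
probs = ordinal_probs(0.3, thresholds)
predicted_class = int(np.argmax(probs))
```

Because the thresholds are ordered, the probabilities are guaranteed non-negative and sum to one, and moving the score up shifts mass toward higher classes, which plain multi-class softmax does not enforce.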
