Search Results for author: Omid Rohanian

Found 16 papers, 7 papers with code

MiniALBERT: Model Distillation via Parameter-Efficient Recursive Transformers

1 code implementation • 12 Oct 2022 • Mohammadmahdi Nouriborji, Omid Rohanian, Samaneh Kouchaki, David A. Clifton

Different strategies have been proposed in the literature to alleviate these problems, with the aim of creating effective compact models that match the performance of their larger counterparts with negligible losses.

On the Effectiveness of Compact Biomedical Transformers

1 code implementation • 7 Sep 2022 • Omid Rohanian, Mohammadmahdi Nouriborji, Samaneh Kouchaki, David A. Clifton

Language models pre-trained on biomedical corpora, such as BioBERT, have recently shown promising results on downstream biomedical tasks.

Continual Learning • Knowledge Distillation +1

Nowruz at SemEval-2022 Task 7: Tackling Cloze Tests with Transformers and Ordinal Regression

1 code implementation • SemEval (NAACL) 2022 • Mohammadmahdi Nouriborji, Omid Rohanian, David Clifton

This paper outlines the system with which team Nowruz participated in SemEval 2022 Task 7, "Identifying Plausible Clarifications of Implicit and Underspecified Phrases", for both subtasks A and B.

Multi-Task Learning • regression

Privacy-aware Early Detection of COVID-19 through Adversarial Training

no code implementations • 9 Jan 2022 • Omid Rohanian, Samaneh Kouchaki, Andrew Soltan, Jenny Yang, Morteza Rohanian, Yang Yang, David Clifton

One of our main contributions is that we specifically target the development of effective COVID-19 detection models with built-in mechanisms to selectively protect sensitive attributes against adversarial attacks.

Cross-lingual Transfer Learning and Multitask Learning for Capturing Multiword Expressions

no code implementations • WS 2019 • Shiva Taslimipoor, Omid Rohanian, Le An Ha

Recent developments in deep learning have prompted a surge of interest in the application of multitask and transfer learning to NLP problems.

Cross-Lingual Transfer • Dependency Parsing +1

Bridging the Gap: Attending to Discontinuity in Identification of Multiword Expressions

2 code implementations • NAACL 2019 • Omid Rohanian, Shiva Taslimipoor, Samaneh Kouchaki, Le An Ha, Ruslan Mitkov

We introduce a new method to tag Multiword Expressions (MWEs) using a linguistically interpretable language-independent deep learning architecture.

TAG

SHOMA at Parseme Shared Task on Automatic Identification of VMWEs: Neural Multiword Expression Tagging with High Generalisation

1 code implementation • 9 Sep 2018 • Shiva Taslimipoor, Omid Rohanian

This paper presents a language-independent deep learning architecture adapted to the task of multiword expression (MWE) identification.

Word Embeddings

WLV at SemEval-2018 Task 3: Dissecting Tweets in Search of Irony

no code implementations • SEMEVAL 2018 • Omid Rohanian, Shiva Taslimipoor, Richard Evans, Ruslan Mitkov

This paper describes the systems submitted to SemEval 2018 Task 3, "Irony detection in English tweets", for both subtasks A and B.

Sentiment Analysis

Combining Multiple Corpora for Readability Assessment for People with Cognitive Disabilities

no code implementations • WS 2017 • Victoria Yaneva, Constantin Orăsan, Richard Evans, Omid Rohanian

Given the lack of large user-evaluated corpora in disability-related NLP research (e.g. text simplification or readability assessment for people with cognitive disabilities), the question of choosing suitable training data for NLP models is not straightforward.

Text Simplification

Using Gaze Data to Predict Multiword Expressions

no code implementations • RANLP 2017 • Omid Rohanian, Shiva Taslimipoor, Victoria Yaneva, Le An Ha

In recent years, gaze data has been increasingly used to improve and evaluate NLP models, since it carries information about the cognitive processing of linguistic phenomena.

Part-Of-Speech Tagging • POS +1
