Search Results for author: Ashkan Esmaeili

Found 15 papers, 3 papers with code

CNLL: A Semi-supervised Approach For Continual Noisy Label Learning

1 code implementation 21 Apr 2022 Nazmul Karim, Umar Khalid, Ashkan Esmaeili, Nazanin Rahnavard

After purification, we perform fine-tuning in a semi-supervised fashion that ensures the participation of all available samples.

Continual Learning
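
The entry above only states that, after purification, fine-tuning proceeds in a semi-supervised fashion that uses every available sample. A minimal pseudo-labeling step is one common way to realize such a scheme; the model, optimizer, threshold, and batch structure below are hypothetical illustrations, not taken from the CNLL code.

```python
import torch
import torch.nn.functional as F

def semi_supervised_step(model, optimizer, labeled_batch, unlabeled_batch, tau=0.95):
    """One fine-tuning step that involves all samples: purified (labeled) samples
    give a supervised loss, remaining samples give a confidence-masked
    pseudo-label loss. Hypothetical sketch, not the CNLL implementation."""
    x_l, y_l = labeled_batch
    x_u, = unlabeled_batch

    loss_sup = F.cross_entropy(model(x_l), y_l)

    with torch.no_grad():
        probs_u = F.softmax(model(x_u), dim=1)
        conf, pseudo = probs_u.max(dim=1)
        mask = (conf >= tau).float()          # only trust confident pseudo-labels

    loss_unsup = (F.cross_entropy(model(x_u), pseudo, reduction="none") * mask).mean()

    loss = loss_sup + loss_unsup
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```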

RODD: A Self-Supervised Approach for Robust Out-of-Distribution Detection

1 code implementation 6 Apr 2022 Umar Khalid, Ashkan Esmaeili, Nazmul Karim, Nazanin Rahnavard

The method proposed in this work, referred to as RODD, surpasses SOTA detection performance on an extensive suite of benchmark datasets for OOD detection tasks.

Ranked #1 on Out-of-Distribution Detection on CIFAR-100 (using extra training data)

Contrastive Learning Out-of-Distribution Detection +1

Generative Model Adversarial Training for Deep Compressed Sensing

no code implementations 20 Jun 2021 Ashkan Esmaeili

Deep compressed sensing assumes the data has a sparse representation in a latent space, i.e., it is intrinsically low-dimensional.
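
The snippet above only states the modeling assumption (the signal lies near the low-dimensional range of a generative model); a common way to exploit it is to search the latent space so that the generated signal matches the compressed measurements. The generator G, measurement matrix A, and dimensions below are placeholders, not the paper's trained model or training procedure.

```python
import torch

def recover_from_measurements(G, A, y, latent_dim=64, steps=500, lr=0.05):
    """Recover x ~= G(z*) from measurements y = A x + noise by optimizing
    the latent code z of a pre-trained generator G (placeholder objects)."""
    z = torch.randn(1, latent_dim, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        x_hat = G(z)                                   # candidate signal on the generator's range
        loss = ((A @ x_hat.flatten() - y) ** 2).sum()  # measurement-consistency loss
        opt.zero_grad()
        loss.backward()
        opt.step()
    return G(z).detach()
```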

LSDAT: Low-Rank and Sparse Decomposition for Decision-based Adversarial Attack

no code implementations 19 Mar 2021 Ashkan Esmaeili, Marzieh Edraki, Nazanin Rahnavard, Mubarak Shah, Ajmal Mian

The proposed sparse perturbation is shown to be the sparse perturbation most aligned with the shortest path from the input sample to the decision boundary, given some initial adversarial sample; that is, it is the best sparse approximation of the shortest path and is therefore likely to fool the model.

Adversarial Attack Computational Efficiency +1
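
The claim in the entry above concerns the sparse approximation of the direction from the input to an initial adversarial example. Keeping only the largest-magnitude components of that direction is the simplest version of such an approximation; this is an illustration of the idea only, not the LSDAT attack itself, which additionally exploits a low-rank and sparse decomposition of the image.

```python
import numpy as np

def sparse_step_toward_adversary(x, x_adv0, k):
    """Best k-sparse approximation (in the L2 sense) of the direction from a
    clean input x toward an initial adversarial sample x_adv0."""
    direction = (x_adv0 - x).ravel()
    idx = np.argsort(np.abs(direction))[-k:]   # keep the k largest-magnitude entries
    sparse_dir = np.zeros_like(direction)
    sparse_dir[idx] = direction[idx]
    return (x.ravel() + sparse_dir).reshape(x.shape)
```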

Asymptotic Optimality of Self-Representative Low-Rank Approximation and Its Applications

no code implementations 1 Jan 2021 Saeed Vahidian, Mohsen Joneidi, Ashkan Esmaeili, Siavash Khodadadeh, Sharare Zehtabian, Ladislau Boloni, Nazanin Rahnavard, Bill Lin, Mubarak Shah

The approach is based on the concept of self-rank, defined as the minimum number of samples needed to reconstruct all samples with an accuracy proportional to that of the rank-K approximation.
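
Under the definition quoted above, a self-rank estimate can be obtained greedily: keep adding the actual column that carries the largest residual until the selected columns reconstruct the data about as well as the rank-K truncated SVD. The numpy sketch below is a rough reading of that definition, not the paper's algorithm; the greedy selection rule and tolerance handling are assumptions.

```python
import numpy as np

def estimate_self_rank(X, K):
    """Approximate the smallest number of actual columns of X whose span
    reconstructs X roughly as well as the best rank-K approximation."""
    s = np.linalg.svd(X, compute_uv=False)
    target_err = np.sqrt((s[K:] ** 2).sum())             # Frobenius error of the rank-K truncation

    selected = []
    residual = X.copy()
    for _ in range(X.shape[1]):
        if np.linalg.norm(residual) <= target_err:
            break
        j = int(np.argmax(np.linalg.norm(residual, axis=0)))  # column with the largest residual
        selected.append(j)
        B = X[:, selected]
        proj = B @ np.linalg.lstsq(B, X, rcond=None)[0]       # project X onto the chosen columns
        residual = X - proj
    return len(selected), selected
```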

A Novel Approach to Quantized Matrix Completion Using Huber Loss Measure

no code implementations 29 Oct 2018 Ashkan Esmaeili, Farokh Marvasti

Next, we form an unconstrained optimization problem by regularizing the rank function with the Huber loss.

Matrix Completion Quantization
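
The exact role of the Huber loss in the paper's formulation is not spelled out in the snippet above; one plausible reading is a Huber function applied to the singular values as a smooth rank surrogate, added to a data-fit term on the observed quantized entries. The sketch below illustrates that reading only, with all parameter names being assumptions.

```python
import numpy as np

def huber(t, delta=1.0):
    """Standard Huber function: quadratic near zero, linear in the tails."""
    t = np.abs(t)
    return np.where(t <= delta, 0.5 * t ** 2, delta * (t - 0.5 * delta))

def huber_rank_surrogate(X, delta=1.0):
    """Smooth surrogate for rank(X): sum of the Huber function over singular values."""
    s = np.linalg.svd(X, compute_uv=False)
    return huber(s, delta).sum()

def quantized_mc_objective(X, observed_mask, Y_quantized, lam=0.1, delta=1.0):
    """Unconstrained objective: fit to the observed (quantized) entries plus
    the Huber-smoothed rank surrogate. Illustrative reading, not the paper's exact model."""
    fit = 0.5 * np.sum(((X - Y_quantized) * observed_mask) ** 2)
    return fit + lam * huber_rank_surrogate(X, delta)
```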

Transduction with Matrix Completion Using Smoothed Rank Function

no code implementations 19 May 2018 Ashkan Esmaeili, Kayhan Behdin, Mohammad Amin Fakharian, Farokh Marvasti

In this paper, we propose two new algorithms for transduction via Matrix Completion (MC).

Matrix Completion

OBTAIN: Real-Time Beat Tracking in Audio Signals

1 code implementation 7 Apr 2017 Ali Mottaghi, Kayhan Behdin, Ashkan Esmaeili, Mohammadreza Heydari, Farokh Marvasti

In this paper, we design a system that performs real-time beat tracking on an audio signal.

Online Beat Tracking
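
OBTAIN's own real-time pipeline is not reproduced here; as a point of reference for the task in the entry above, librosa's standard offline onset-strength plus dynamic-programming beat tracker runs in a few lines. The file path is a placeholder.

```python
import librosa

# Placeholder path; any mono audio file works for this illustration.
y, sr = librosa.load("track.wav")

# Onset strength envelope followed by dynamic-programming beat tracking
# (an offline baseline, not the real-time OBTAIN system).
onset_env = librosa.onset.onset_strength(y=y, sr=sr)
tempo, beat_frames = librosa.beat.beat_track(onset_envelope=onset_env, sr=sr)
beat_times = librosa.frames_to_time(beat_frames, sr=sr)

print(f"Estimated tempo: {float(tempo):.1f} BPM")
print("First beat times (s):", beat_times[:10])
```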

Fast Methods for Recovering Sparse Parameters in Linear Low Rank Models

no code implementations 26 Jun 2016 Ashkan Esmaeili, Arash Amini, Farokh Marvasti

In this paper, we investigate the recovery of a sparse weight vector (parameter vector) from a set of noisy linear combinations.

Matrix Completion
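
As a concrete instance of the setup in the entry above (recovering a sparse parameter vector w from noisy linear combinations y = Aw + n), iterative hard thresholding is a standard baseline; it is shown here only to make the problem setting concrete, and the step size and sparsity level are assumptions, not the paper's methods.

```python
import numpy as np

def iterative_hard_thresholding(A, y, k, steps=200, step_size=None):
    """Recover a k-sparse w from y ~= A @ w + noise.
    Standard IHT baseline, not the specific fast methods of the paper."""
    m, n = A.shape
    if step_size is None:
        step_size = 1.0 / np.linalg.norm(A, 2) ** 2       # safe step for the least-squares gradient
    w = np.zeros(n)
    for _ in range(steps):
        w = w + step_size * A.T @ (y - A @ w)              # gradient step on ||y - Aw||^2
        keep = np.argsort(np.abs(w))[-k:]                  # hard threshold: keep k largest entries
        mask = np.zeros(n, dtype=bool)
        mask[keep] = True
        w[~mask] = 0.0
    return w
```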

Comparison of Several Sparse Recovery Methods for Low Rank Matrices with Random Samples

no code implementations 12 Jun 2016 Ashkan Esmaeili, Farokh Marvasti

This paper focuses on comparing IMAT with LASSO in terms of their power to reconstruct the desired sparse signal.

BIG-bench Machine Learning
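
IMAT itself is not reimplemented here; for the LASSO side of the comparison mentioned in the entry above, scikit-learn's Lasso provides the standard reference point. The problem sizes, noise level, and regularization strength below are arbitrary choices for illustration.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n_samples, n_features, sparsity = 80, 200, 5

# Synthetic sparse signal observed through random linear measurements with noise.
w_true = np.zeros(n_features)
w_true[rng.choice(n_features, sparsity, replace=False)] = rng.normal(size=sparsity)
A = rng.normal(size=(n_samples, n_features))
y = A @ w_true + 0.01 * rng.normal(size=n_samples)

# LASSO reference point for the comparison (IMAT would be the other contender).
lasso = Lasso(alpha=0.01, max_iter=10000)
lasso.fit(A, y)
print("Recovery error:", np.linalg.norm(lasso.coef_ - w_true))
```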
