Search Results for author: Seyed Iman Mirzadeh

Found 6 papers, 3 papers with code

Continual Learning Beyond a Single Model

no code implementations · 20 Feb 2022 · Thang Doan, Seyed Iman Mirzadeh, Mehrdad Farajtabar

A growing body of research in continual learning focuses on the catastrophic forgetting problem.

Continual Learning

Architecture Matters in Continual Learning

no code implementations · 1 Feb 2022 · Seyed Iman Mirzadeh, Arslan Chaudhry, Dong Yin, Timothy Nguyen, Razvan Pascanu, Dilan Gorur, Mehrdad Farajtabar

However, in this work, we show that the choice of architecture can significantly impact the continual learning performance, and different architectures lead to different trade-offs between the ability to remember previous tasks and learning new ones.

Continual Learning

Wide Neural Networks Forget Less Catastrophically

no code implementations · 21 Oct 2021 · Seyed Iman Mirzadeh, Arslan Chaudhry, Dong Yin, Huiyi Hu, Razvan Pascanu, Dilan Gorur, Mehrdad Farajtabar

A primary focus area in continual learning research is alleviating the "catastrophic forgetting" problem in neural networks by designing new algorithms that are more robust to distribution shifts.

Continual Learning

Inter-Beat Interval Estimation with Tiramisu Model: A Novel Approach with Reduced Error

1 code implementation · 1 Jul 2021 · Asiful Arefeen, Ali Akbari, Seyed Iman Mirzadeh, Roozbeh Jafari, Behrooz A. Shirazi, Hassan Ghasemzadeh

However, extracting IBIs from noisy signals is challenging, since the morphology of the signal is distorted in the presence of noise.

Denoising · Heart Rate Variability
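For context on what this paper estimates, the inter-beat-interval (IBI) computation itself can be sketched as the time gaps between successive heartbeat peaks. The paper's contribution is the Tiramisu denoising model, which is not reproduced here; the sampling rate, synthetic signal, and peak-detection thresholds below are illustrative assumptions using SciPy's generic peak finder:

```python
import numpy as np
from scipy.signal import find_peaks

# Illustrative sketch only: the paper's Tiramisu model denoises the
# signal first; here we only show the IBI extraction step on a clean,
# synthetic signal with an assumed sampling rate.
fs = 250  # assumed sampling rate in Hz

# Synthetic signal: a sharp peak once per second (60 bpm).
t = np.arange(0, 10, 1 / fs)
signal = np.exp(-((t % 1.0 - 0.5) ** 2) / 0.001)

# Detect heartbeat peaks, then take differences between peak times.
peaks, _ = find_peaks(signal, height=0.5, distance=fs // 2)
ibis_ms = np.diff(peaks) / fs * 1000  # inter-beat intervals in ms

print(ibis_ms)  # each interval is ~1000 ms for this 60 bpm signal
```

On real, noisy wearable data the peaks are far less clean, which is exactly the gap the paper's denoising model targets.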

Linear Mode Connectivity in Multitask and Continual Learning

1 code implementation · ICLR 2021 · Seyed Iman Mirzadeh, Mehrdad Farajtabar, Dilan Gorur, Razvan Pascanu, Hassan Ghasemzadeh

Continual (sequential) training and multitask (simultaneous) training often attempt to solve the same overall objective: to find a solution that performs well on all considered tasks.

Continual Learning · Linear Mode Connectivity
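The linear mode connectivity check this paper builds on can be sketched as: given two sets of weights (e.g. a multitask solution and a sequentially trained one), evaluate the loss along the straight line between them; if the loss stays low for every interpolation coefficient, the solutions are linearly mode connected. The model, data, and weights below are toy stand-ins, not the paper's networks:

```python
import numpy as np

# Toy least-squares problem; real networks are nonconvex, so a loss
# barrier can appear along the path, which is what the paper studies.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
true_w = rng.normal(size=5)
y = X @ true_w

def loss(w):
    return np.mean((X @ w - y) ** 2)

# Two hypothetical solutions near the optimum.
w1 = true_w + 0.01 * rng.normal(size=5)
w2 = true_w + 0.01 * rng.normal(size=5)

# Evaluate the loss along w(alpha) = (1 - alpha) * w1 + alpha * w2.
alphas = np.linspace(0.0, 1.0, 11)
path_losses = [loss((1 - a) * w1 + a * w2) for a in alphas]
barrier = max(path_losses) - max(loss(w1), loss(w2))
print(f"loss barrier along the path: {barrier:.6f}")
```

For this convex toy problem the barrier is zero by construction; the interesting case is when it is not, which separates connected from disconnected solutions.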

Understanding the Role of Training Regimes in Continual Learning

4 code implementations · NeurIPS 2020 · Seyed Iman Mirzadeh, Mehrdad Farajtabar, Razvan Pascanu, Hassan Ghasemzadeh

However, there has been limited prior work extensively analyzing the impact that different training regimes (learning rate, batch size, regularization method) can have on forgetting.

Continual Learning
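The kind of comparison this paper makes can be sketched in miniature: train a model sequentially on two tasks under different training regimes (only the learning rate varies here) and measure how much task-1 accuracy drops after task-2 training. The tasks, model, and hyperparameters below are illustrative stand-ins, not the paper's experimental setup:

```python
import numpy as np

# Toy continual-learning setup: two linearly separable tasks sharing
# one logistic-regression model, trained one after the other.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y_task1 = (X[:, 0] > 0).astype(float)  # task 1: sign of feature 0
y_task2 = (X[:, 1] > 0).astype(float)  # task 2: sign of feature 1

def accuracy(w, y):
    return float(np.mean((X @ w > 0) == (y > 0.5)))

def train(w, y, lr, steps=200):
    """Full-batch gradient descent on logistic loss, starting from w."""
    w = w.copy()
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def forgetting(lr):
    w = train(np.zeros(2), y_task1, lr)
    acc_before = accuracy(w, y_task1)
    w = train(w, y_task2, lr)  # continue sequentially on task 2
    return acc_before - accuracy(w, y_task1)

for lr in (0.05, 1.0):
    print(f"lr={lr}: task-1 forgetting = {forgetting(lr):.3f}")
```

The same harness extends to the other regime knobs the abstract lists (batch size, regularization), which is the axis of analysis the paper pursues at scale.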
