no code implementations • 8 Dec 2023 • Saeejith Nair, Mohammad Javad Shafiee, Alexander Wong
We present DARLEI, a framework that combines evolutionary algorithms with parallelized reinforcement learning for efficiently training and evolving populations of UNIMAL agents.
no code implementations • 26 Sep 2023 • Amir Nazemi, Mohammad Javad Shafiee, Zahra Gharaee, Paul Fieguth
We propose two novel techniques to reduce the memory requirement of Online VOS methods while improving modeling accuracy and generalization on long videos.
no code implementations • 25 Sep 2023 • Saeejith Nair, Yuhao Chen, Mohammad Javad Shafiee, Alexander Wong
Thus, there is a need to dynamically optimize the neural network component of NeRFs to achieve a balance between computational complexity and specific targets for synthesis quality.
no code implementations • 21 Apr 2023 • Alexander Wong, Yifan Wu, Saad Abbasi, Saeejith Nair, Yuhao Chen, Mohammad Javad Shafiee
As such, the design of highly efficient multi-task deep neural network architectures tailored for computer vision tasks for robotic grasping on the edge is highly desired for widespread adoption in manufacturing environments.
no code implementations • 20 Dec 2022 • Carol Xu, Mahmoud Famouri, Gautam Bathla, Mohammad Javad Shafiee, Alexander Wong
As such, the proposed deep learning-driven workflow, integrated with the aforementioned LightDefectNet neural network, is highly suited for high-throughput, high-performance light plate surface VQI within real-world manufacturing environments.
no code implementations • 15 Aug 2022 • Alexander Wong, Mohammad Javad Shafiee, Saad Abbasi, Saeejith Nair, Mahmoud Famouri
With the growing adoption of deep learning for on-device TinyML applications, there has been an ever-increasing demand for efficient neural network backbones optimized for the edge.
no code implementations • 25 May 2022 • Saad Abbasi, Alexander Wong, Mohammad Javad Shafiee
Deep neural network (DNN) latency characterization is a time-consuming process and adds significant cost to Neural Architecture Search (NAS) processes when searching for efficient convolutional neural networks for embedded vision applications.
no code implementations • 27 Apr 2022 • Saeejith Nair, Saad Abbasi, Alexander Wong, Mohammad Javad Shafiee
Neural Architecture Search (NAS) has enabled automatic discovery of more efficient neural network architectures, especially for mobile and embedded vision applications.
no code implementations • 25 Apr 2022 • Carol Xu, Mahmoud Famouri, Gautam Bathla, Mohammad Javad Shafiee, Alexander Wong
Light guide plates are essential optical components widely used in a diverse range of applications ranging from medical lighting fixtures to back-lit TV displays.
no code implementations • 25 Apr 2022 • Carol Xu, Mahmoud Famouri, Gautam Bathla, Saeejith Nair, Mohammad Javad Shafiee, Alexander Wong
Photovoltaic cells are electronic devices that convert light energy to electricity, forming the backbone of solar energy harvesting systems.
1 code implementation • 24 Apr 2022 • Hossein Aboutalebi, Maya Pavlova, Mohammad Javad Shafiee, Adrian Florea, Andrew Hryniowski, Alexander Wong
Since the World Health Organization declared COVID-19 a pandemic in 2020, the global community has faced ongoing challenges in controlling and mitigating the transmission of the SARS-CoV-2 virus, as well as its evolving subvariants and recombinants.
no code implementations • 30 Nov 2021 • Saad Abbasi, Alexander Wong, Mohammad Javad Shafiee
Using this quantitative strategy as the hardware descriptor, MAPLE can generalize to new hardware via a few-shot adaptation strategy: with as few as 3 samples it exhibits a 6% improvement over state-of-the-art methods requiring as many as 10 samples.
no code implementations • 29 Nov 2021 • Mohammad Javad Shafiee, Mahmoud Famouri, Gautam Bathla, Francis Li, Alexander Wong
A critical aspect in the manufacturing process is the visual quality inspection of manufactured components for defects and flaws.
no code implementations • 12 Oct 2021 • Hossein Aboutalebi, Maya Pavlova, Hayden Gunraj, Mohammad Javad Shafiee, Ali Sabri, Amer Alaref, Alexander Wong
In this work, we explore the concept of self-attention for tackling such subtleties in and between diseases.
no code implementations • 8 Jul 2021 • Saad Abbasi, Mohammad Javad Shafiee, Ellick Chan, Alexander Wong
In this study, a comprehensive empirical exploration is conducted to investigate the impact of deep neural network architecture design on the degree of inference speedup that can be achieved via hardware-specific acceleration.
no code implementations • 18 Jun 2021 • Hossein Aboutalebi, Mohammad Javad Shafiee, Michelle Karg, Christian Scharfenberger, Alexander Wong
Motivated by this, this study presents the concept of residual error, a new performance measure that not only assesses the adversarial robustness of a deep neural network at the individual sample level, but can also be used to differentiate between adversarial and non-adversarial examples, thereby facilitating adversarial example detection.
no code implementations • 4 May 2021 • Hossein Aboutalebi, Saad Abbasi, Mohammad Javad Shafiee, Alexander Wong
The health and socioeconomic difficulties caused by the COVID-19 pandemic continue to cause enormous tensions around the world.
no code implementations • 1 May 2021 • Hossein Aboutalebi, Maya Pavlova, Mohammad Javad Shafiee, Ali Sabri, Amer Alaref, Alexander Wong
More specifically, we leveraged transfer learning to transfer representational knowledge gained from over 16,000 CXR images from a multinational cohort of over 15,000 patient cases into a custom network architecture for severity assessment.
no code implementations • 31 Mar 2021 • Saad Abbasi, Mahmoud Famouri, Mohammad Javad Shafiee, Alexander Wong
Human operators often diagnose industrial machinery via anomalous sounds.
no code implementations • 25 Dec 2020 • Ahmadreza Jeddi, Mohammad Javad Shafiee, Alexander Wong
Adversarial Training (AT) with Projected Gradient Descent (PGD) is an effective approach for improving the robustness of deep neural networks.
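The PGD attack underlying this form of adversarial training is well documented in the literature; as a minimal sketch of the inner loop, the NumPy snippet below runs PGD against a logistic-regression model (the model, step sizes, and epsilon here are illustrative assumptions, not details from the paper):

```python
import numpy as np

def pgd_attack(x, y, w, b, eps=0.1, alpha=0.02, steps=10):
    """Projected Gradient Descent on a logistic-regression model.

    Repeatedly steps in the direction of the sign of the loss gradient,
    then projects back into the L-infinity ball of radius eps around x.
    """
    x_adv = x.copy()
    for _ in range(steps):
        z = x_adv @ w + b
        p = 1.0 / (1.0 + np.exp(-z))              # sigmoid prediction
        grad = (p - y) * w                         # d(BCE loss)/d(input)
        x_adv = x_adv + alpha * np.sign(grad)      # ascent step on the loss
        x_adv = np.clip(x_adv, x - eps, x + eps)   # project into the eps-ball
    return x_adv

# Adversarial training would then fit the model on these perturbed inputs.
rng = np.random.default_rng(0)
w, b = rng.normal(size=3), 0.0
x, y = rng.normal(size=3), 1.0
x_adv = pgd_attack(x, y, w, b)
```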
no code implementations • 30 Sep 2020 • Alexander Wong, Mahmoud Famouri, Mohammad Javad Shafiee
Based on these promising results, AttendNets illustrate the effectiveness of visual attention condensers as building blocks for enabling various on-device visual perception tasks for TinyML applications.
no code implementations • 1 Aug 2020 • Hossein Aboutalebi, Mohammad Javad Shafiee, Michelle Karg, Christian Scharfenberger, Alexander Wong
In this study, we investigate the effect of adversarial machine learning on the bias and variance of a trained deep neural network and analyze how adversarial perturbations can affect the generalization of a network.
no code implementations • 4 Mar 2020 • Mohammad Javad Shafiee, Ahmadreza Jeddi, Amir Nazemi, Paul Fieguth, Alexander Wong
This paper analyzes the robustness of deep learning models in autonomous driving applications and discusses practical solutions to address these vulnerabilities.
1 code implementation • CVPR 2020 • Ahmadreza Jeddi, Mohammad Javad Shafiee, Michelle Karg, Christian Scharfenberger, Alexander Wong
In this study, we introduce Learn2Perturb, an end-to-end feature perturbation learning approach for improving the adversarial robustness of deep neural networks.
no code implementations • 16 Oct 2019 • Zhong Qiu Lin, Mohammad Javad Shafiee, Stanislav Bochkarev, Michael St. Jules, Xiao Yu Wang, Alexander Wong
A comprehensive analysis using this approach was conducted on several state-of-the-art explainability methods (LIME, SHAP, Expected Gradients, GSInquire) on a ResNet-50 deep convolutional neural network using a subset of ImageNet for the task of image classification.
no code implementations • 15 Oct 2019 • Mohammad Javad Shafiee, Andrew Hryniowski, Francis Li, Zhong Qiu Lin, Alexander Wong
A particularly interesting class of compact architecture search algorithms are those that are guided by baseline network architectures.
4 code implementations • 3 Oct 2019 • Alexander Wong, Mahmoud Famouri, Mohammad Javad Shafiee, Francis Li, Brendan Chwyl, Jonathan Chung
As such, there has been growing research interest in the design of efficient deep neural network architectures catered for edge and mobile usage.
no code implementations • 12 Sep 2019 • Mohammad Javad Shafiee, Mirko Nentwig, Yohannes Kassahun, Francis Li, Stanislav Bochkarev, Akif Kamal, David Dolson, Secil Altintas, Arif Virani, Alexander Wong
The findings of this case study showed that GenSynth is easy to use and can be effective at accelerating the design and production of compact, customized deep neural networks.
no code implementations • 5 Nov 2018 • Mohammad Saeed Shafiee, Mohammad Javad Shafiee, Alexander Wong
The proposed d-gate modules can be integrated with any deep neural network and reduce the average computational cost of the network while maintaining modeling accuracy.
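The general idea of confidence-gated early exits can be illustrated generically; the sketch below attaches an intermediate classifier to each stage of a network and exits once its prediction is confident enough. The stage shapes, the 0.9 threshold, and all function names are illustrative assumptions, not the paper's d-gate design:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def early_exit_forward(x, stages, exit_heads, threshold=0.9):
    """Run a staged network, exiting as soon as an intermediate
    classifier ("gate") is confident enough.

    stages:     list of feature-transform callables
    exit_heads: list of classifier callables, one per stage
    Returns (class probabilities, index of the stage that produced them).
    """
    h = x
    for i, (stage, head) in enumerate(zip(stages, exit_heads)):
        h = stage(h)
        p = softmax(head(h))
        if p.max() >= threshold or i == len(stages) - 1:
            return p, i  # confident enough: skip the remaining stages

# Toy 3-stage network with random weights, just to exercise the gate.
rng = np.random.default_rng(1)
stages = [lambda h, W=rng.normal(size=(4, 4)): np.tanh(W @ h) for _ in range(3)]
heads = [lambda h, V=rng.normal(size=(2, 4)): V @ h for _ in range(3)]
p, exit_at = early_exit_forward(rng.normal(size=4), stages, heads)
```

Easy inputs tend to exit at an early gate, which is where the average-cost savings come from; hard inputs fall through to the final classifier.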
no code implementations • NIPS Workshop CDNNRIA 2018 • Mohammad Saeed Shafiee, Mohammad Javad Shafiee, Alexander Wong
The current trade-off between depth and computational cost makes it difficult to adopt deep neural networks for many industrial applications, especially when computing power is limited.
no code implementations • 17 Sep 2018 • Alexander Wong, Mohammad Javad Shafiee, Brendan Chwyl, Francis Li
In this study, we introduce the idea of generative synthesis, which is premised on the intricate interplay between a generator-inquisitor pair working in tandem to garner insights and learn to generate highly efficient deep neural networks that best satisfy operational requirements.
no code implementations • 8 Jun 2018 • Amir Nazemi, Mohammad Javad Shafiee, Zohreh Azimifar, Alexander Wong
Here, we formulate the vehicle make and model recognition as a fine-grained classification problem and propose a new configurable on-road vehicle make and model recognition framework.
1 code implementation • 28 Mar 2018 • Alexander Wong, Mohammad Javad Shafiee, Michael St. Jules
The resulting MicronNet possesses a model size of just ~1MB and ~510,000 parameters (~27x fewer parameters than state-of-the-art) while still achieving a human-performance-level top-1 accuracy of 98.9% on the German traffic sign recognition benchmark.
Ranked #4 on Traffic Sign Recognition on GTSRB
1 code implementation • 19 Feb 2018 • Alexander Wong, Mohammad Javad Shafiee, Francis Li, Brendan Chwyl
The resulting Tiny SSD possesses a model size of 2.3MB (~26X smaller than Tiny YOLO) while still achieving an mAP of 61.3% on VOC 2007 (~4.2% higher than Tiny YOLO).
no code implementations • 16 Jan 2018 • Mohammad Javad Shafiee, Brendan Chwyl, Francis Li, Rongyan Chen, Michelle Karg, Christian Scharfenberger, Alexander Wong
The computational complexity of leveraging deep neural networks for extracting deep feature representations is a significant barrier to its widespread adoption, particularly for use in embedded devices.
no code implementations • 20 Nov 2017 • Mohammad Javad Shafiee, Francis Li, Brendan Chwyl, Alexander Wong
While deep neural networks have been shown in recent years to outperform other machine learning methods in a wide range of applications, one of the biggest challenges in enabling their widespread deployment on edge devices, such as mobile and other consumer devices, is their high computational and memory requirements.
no code implementations • 24 Sep 2017 • Mohammad Javad Shafiee, Alexander Wong
While skin cancer is the most diagnosed form of cancer in men and women, with more cases diagnosed each year than all other cancers combined, sufficiently early diagnosis results in a very good prognosis, making early detection crucial.
1 code implementation • 18 Sep 2017 • Mohammad Javad Shafiee, Brendan Chwyl, Francis Li, Alexander Wong
Object detection is considered one of the most challenging problems in the field of computer vision, as it involves the combination of object classification and object localization within a scene.
no code implementations • 7 Sep 2017 • Audrey Chung, Mohammad Javad Shafiee, Paul Fieguth, Alexander Wong
Evolutionary deep intelligence was recently proposed as a method for achieving highly efficient deep neural network architectures over successive generations.
no code implementations • 1 Jul 2017 • Mohammad Javad Shafiee, Francis Li, Alexander Wong
A key contributing factor to the incredible success of deep neural networks has been the rise of massively parallel computing devices, allowing researchers to greatly increase the size and depth of deep neural networks and leading to significant improvements in modeling accuracy.
no code implementations • 10 May 2017 • Mohammad Javad Shafiee, Audrey G. Chung, Farzad Khalvati, Masoom A. Haider, Alexander Wong
We evaluated the evolved deep radiomic sequencer (EDRS) discovered via the proposed evolutionary deep radiomic sequencer discovery framework against state-of-the-art radiomics-driven and discovery radiomics methods using clinical lung CT data with pathologically-proven diagnostic data from the LIDC-IDRI dataset.
no code implementations • 7 Apr 2017 • Mohammad Javad Shafiee, Elnaz Barshan, Alexander Wong
In this study, we take a deeper look at the notion of synaptic cluster-driven evolution of deep neural networks which guides the evolution process towards the formation of a highly sparse set of synaptic clusters in offspring networks.
no code implementations • 6 Sep 2016 • Mohammad Javad Shafiee, Alexander Wong
There has been significant recent interest towards achieving highly efficient deep neural network architectures.
1 code implementation • 14 Jun 2016 • Mohammad Javad Shafiee, Akshaya Mishra, Alexander Wong
Taking inspiration from biological evolution, we explore the idea of "Can deep neural networks evolve naturally over successive generations into highly efficient deep neural networks?"
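The generational evolution alluded to here can be sketched generically: one plausible survival rule is that each synapse is inherited with probability proportional to its magnitude, so strong synapses persist while the network grows sparser over generations. This rule, the `target_density` parameter, and all names below are illustrative assumptions, not the paper's exact synthesis operator:

```python
import numpy as np

def evolve_offspring(parent_w, target_density=0.5, rng=None):
    """Synthesize an offspring layer from a parent layer.

    Each parent synapse survives an independent Bernoulli trial whose
    probability is proportional to the synapse's magnitude, capped at 1,
    so stronger synapses are preferentially inherited.
    """
    if rng is None:
        rng = np.random.default_rng()
    mag = np.abs(parent_w)
    probs = np.clip(target_density * mag / mag.mean(), 0.0, 1.0)
    mask = rng.random(parent_w.shape) < probs
    return parent_w * mask  # pruned synapses are zeroed out

rng = np.random.default_rng(0)
parent = rng.normal(size=(32, 32))
child = evolve_offspring(parent, target_density=0.5, rng=rng)
density = (child != 0).mean()  # roughly target_density of the parent's synapses
```

Repeating this step over successive generations (with retraining in between) yields progressively sparser descendants, which is the "natural evolution" framing the question above poses.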
no code implementations • 1 Feb 2016 • Parthipan Siva, Mohammad Javad Shafiee, Mike Jamieson, Alexander Wong
The problem of automated crowd segmentation and counting has garnered significant interest in the field of video surveillance.
no code implementations • 25 Dec 2015 • Edward Li, Farzad Khalvati, Mohammad Javad Shafiee, Masoom A. Haider, Alexander Wong
Reducing MRI acquisition time can reduce patient discomfort and as a result reduces motion artifacts from the acquisition process.
no code implementations • 18 Dec 2015 • Mohammad Javad Shafiee, Parthipan Siva, Paul Fieguth, Alexander Wong
Transfer learning is a recent field of machine learning research that aims to resolve the challenge of dealing with insufficient training data in the domain of interest.
no code implementations • 15 Dec 2015 • Ameneh Boroomand, Mohammad Javad Shafiee, Farzad Khalvati, Masoom A. Haider, Alexander Wong
Retrospective bias correction approaches are a more efficient alternative to prospective methods, as they correct for both the scanner- and anatomy-related bias fields in MR imaging.
no code implementations • 11 Dec 2015 • Mohammad Javad Shafiee, Parthipan Siva, Paul Fieguth, Alexander Wong
Experimental results show that features learned using deep convolutional StochasticNets, with fewer neural connections than conventional deep convolutional neural networks, can achieve classification accuracy better than or comparable to conventional deep neural networks: a relative test error decrease of ~4.5% for classification on the STL-10 dataset and ~1% for classification on the SVHN dataset.
no code implementations • 11 Nov 2015 • Mohammad Javad Shafiee, Audrey G. Chung, Devinder Kumar, Farzad Khalvati, Masoom Haider, Alexander Wong
In this study, we introduce a novel discovery radiomics framework where we directly discover custom radiomic features from the wealth of available medical imaging data.
no code implementations • 1 Sep 2015 • Audrey G. Chung, Mohammad Javad Shafiee, Devinder Kumar, Farzad Khalvati, Masoom A. Haider, Alexander Wong
In this study, we propose a novel \textit{discovery radiomics} framework for generating custom radiomic sequences tailored for prostate cancer detection.
no code implementations • 1 Sep 2015 • Devinder Kumar, Mohammad Javad Shafiee, Audrey G. Chung, Farzad Khalvati, Masoom A. Haider, Alexander Wong
In this study, we take the idea of radiomics one step further by introducing the concept of discovery radiomics for lung cancer prediction using CT imaging data.
no code implementations • 22 Aug 2015 • Mohammad Javad Shafiee, Parthipan Siva, Alexander Wong
A pivotal study on the brain tissue of rats found that synaptic formation for specific functional connectivity in neocortical neural microcircuits can be surprisingly well modeled and predicted as a random formation.
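The random-formation premise can be made concrete with a minimal sketch: treat each candidate synapse in a layer as an independent Bernoulli trial, yielding a sparse random connectivity mask. The connection probability and weight scaling below are illustrative assumptions, not values from the study:

```python
import numpy as np

def stochastic_layer(n_in, n_out, p_connect=0.3, rng=None):
    """Form a sparse layer by realizing each possible synapse as an
    independent Bernoulli trial, mirroring random synaptic formation.
    Absent synapses are fixed at zero via a binary mask."""
    if rng is None:
        rng = np.random.default_rng()
    mask = rng.random((n_out, n_in)) < p_connect
    # Scale weights by the expected fan-in of the sparse layer.
    weights = rng.normal(scale=1.0 / np.sqrt(n_in * p_connect),
                         size=(n_out, n_in))
    return weights * mask, mask

W, mask = stochastic_layer(64, 32, p_connect=0.3,
                           rng=np.random.default_rng(0))
density = mask.mean()  # concentrates near p_connect
```

Only the surviving (masked-in) synapses would be trained; the mask itself stays fixed, so the realized connectivity is a single random draw rather than a learned structure.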
no code implementations • 30 Jun 2015 • Mohammad Javad Shafiee, Alexander Wong, Paul Fieguth
However, computational tractability becomes a significant issue when incorporating such long-range nodal interactions, particularly when a large number of long-range nodal interactions (e.g., fully-connected random fields) are modeled.
no code implementations • 20 Dec 2014 • Alexander Wong, Mohammad Javad Shafiee, Parthipan Siva, Xiao Yu Wang
In this study, we investigate the feasibility of unifying fully-connected and deep-structured models in a computationally tractable manner for the purpose of structured inference.