no code implementations • 15 Feb 2024 • Eugenio Ressa, Alberto Marchisio, Maurizio Martina, Guido Masera, Muhammad Shafique
Towards this, we design a hardware architecture, TinyCL, to perform CL on resource-constrained autonomous systems.
no code implementations • 10 Aug 2023 • Farzad Nikfam, Raffaele Casaburi, Alberto Marchisio, Maurizio Martina, Muhammad Shafique
Machine learning (ML) is widely used today, especially through deep neural networks (DNNs); however, the increasing computational load and resource requirements have led to cloud-based solutions.
no code implementations • 8 Apr 2023 • Alberto Marchisio, Antonio De Marco, Alessio Colucci, Maurizio Martina, Muhammad Shafique
Overall, CapsNets achieve better robustness against adversarial examples and affine transformations, compared to a traditional CNN with a similar number of parameters.
1 code implementation • 8 Apr 2023 • Alberto Marchisio, Davide Dura, Maurizio Capra, Maurizio Martina, Guido Masera, Muhammad Shafique
In particular, fixed-point quantization is desirable because it allows the computations to be carried out with lightweight blocks of the underlying hardware, such as adders and multipliers.
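As a rough illustration of that idea (not code from the paper), the sketch below quantizes values into a signed fixed-point format and evaluates a small dot product using only integer multiplies, adds, and shifts; the function names and the Q8.8-style format are illustrative assumptions.

```python
import numpy as np

def to_fixed_point(x, frac_bits=8, total_bits=16):
    """Quantize a float array to signed fixed-point with `frac_bits`
    fractional bits, stored as two's-complement integers of `total_bits`."""
    scale = 1 << frac_bits                       # 2^frac_bits
    q_min = -(1 << (total_bits - 1))             # most negative representable code
    q_max = (1 << (total_bits - 1)) - 1          # most positive representable code
    return np.clip(np.round(x * scale), q_min, q_max).astype(np.int32)

def fixed_mul(a_q, b_q, frac_bits=8):
    """Multiply two fixed-point values: integer multiply, then rescale by shifting."""
    return (a_q.astype(np.int64) * b_q) >> frac_bits

def to_float(q, frac_bits=8):
    return q / float(1 << frac_bits)

# Example: a tiny dot product done entirely with adder/multiplier-style arithmetic.
w = to_fixed_point(np.array([0.5, -1.25, 0.75]))
x = to_fixed_point(np.array([1.0, 0.5, -2.0]))
acc = fixed_mul(w, x).sum()
print(to_float(acc))                             # ~ -1.625
```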
1 code implementation • 13 Oct 2022 • Farzad Nikfam, Alberto Marchisio, Maurizio Martina, Muhammad Shafique
The experiments show results comparable to the related works, and in several cases the adversarial training of DNNs using our AccelAT framework runs up to 2 times faster than existing techniques.
1 code implementation • 11 Oct 2022 • Alberto Marchisio, Vojtech Mrazek, Andrea Massa, Beatrice Bussolino, Maurizio Martina, Muhammad Shafique
Neural Architecture Search (NAS) algorithms aim at finding efficient Deep Neural Network (DNN) architectures for a given application under given system constraints.
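For illustration only, the following is a minimal constraint-aware random-search loop in the spirit of NAS; the toy architecture encoding, the parameter-count constraint, and the placeholder accuracy proxy are assumptions for this sketch and not the search strategy used in the paper.

```python
import random

def sample_architecture(rng):
    """Randomly sample a toy DNN description: a list of hidden-layer widths."""
    depth = rng.randint(2, 6)
    return [rng.choice([16, 32, 64, 128]) for _ in range(depth)]

def count_params(arch, in_dim=784, out_dim=10):
    """Parameter count of a fully connected network with the given widths."""
    dims = [in_dim] + arch + [out_dim]
    return sum(d_in * d_out + d_out for d_in, d_out in zip(dims, dims[1:]))

def evaluate_accuracy(arch):
    """Placeholder proxy score; a real NAS would train or estimate accuracy."""
    return sum(arch) / (100.0 * len(arch))   # stand-in metric, not meaningful

def random_search_nas(max_params, trials=100, seed=0):
    rng = random.Random(seed)
    best, best_score = None, float("-inf")
    for _ in range(trials):
        arch = sample_architecture(rng)
        if count_params(arch) > max_params:   # reject candidates violating the system constraint
            continue
        score = evaluate_accuracy(arch)
        if score > best_score:
            best, best_score = arch, score
    return best

print(random_search_nas(max_params=100_000))
```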
no code implementations • 3 Aug 2022 • Alberto Viale, Alberto Marchisio, Maurizio Martina, Guido Masera, Muhammad Shafique
Autonomous Driving (AD)-related features are key elements of the next generation of mobile robots and autonomous vehicles, which focus on increasingly intelligent, autonomous, and interconnected systems.
no code implementations • 31 Jul 2022 • Muhammad Abdullah Hanif, Giuseppe Maria Sarda, Alberto Marchisio, Guido Masera, Maurizio Martina, Muhammad Shafique
The high computational complexity of these networks, which translates to increased energy consumption, is the foremost obstacle towards deploying large DNNs in resource-constrained systems.
no code implementations • 21 Jun 2022 • Alberto Marchisio, Beatrice Bussolino, Edoardo Salvati, Maurizio Martina, Guido Masera, Muhammad Shafique
In our experiments, we evaluate the trade-offs between the area, power consumption, and critical-path delay of the designs implemented with the ASIC design flow and the accuracy of the quantized CapsNets relative to the exact functions.
no code implementations • 27 May 2022 • Alberto Marchisio, Giovanni Caramia, Maurizio Martina, Muhammad Shafique
Recently, Deep Neural Networks (DNNs) have achieved remarkable performance in many applications, while several studies have exposed their vulnerability to malicious attacks.
1 code implementation • 1 Sep 2021 • Alberto Marchisio, Giacomo Pira, Maurizio Martina, Guido Masera, Muhammad Shafique
Spiking Neural Networks (SNNs) aim at providing energy-efficient learning capabilities when implemented on neuromorphic chips with event-based Dynamic Vision Sensors (DVS).
1 code implementation • 1 Jul 2021 • Alberto Viale, Alberto Marchisio, Maurizio Martina, Guido Masera, Muhammad Shafique
Our best experiment achieves 86% accuracy in the offline implementation, which drops to 83% when ported onto the Loihi chip.
1 code implementation • 1 Jul 2021 • Alberto Marchisio, Giacomo Pira, Maurizio Martina, Guido Masera, Muhammad Shafique
Spiking Neural Networks (SNNs), despite being energy-efficient when implemented on neuromorphic hardware and coupled with event-based Dynamic Vision Sensors (DVS), are vulnerable to security threats, such as adversarial attacks, i.e., small perturbations added to the input for inducing a misclassification.
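As a generic illustration of such a perturbation (not the DVS- and SNN-specific attacks studied in the paper), the sketch below applies the standard Fast Gradient Sign Method to an image classifier; the `fgsm_perturb` helper and the epsilon value are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def fgsm_perturb(model, x, y, epsilon=0.03):
    """Fast Gradient Sign Method: add a small, bounded perturbation to the
    input in the direction that increases the classification loss."""
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    x_adv = x + epsilon * x.grad.sign()       # small step per input dimension
    return x_adv.clamp(0.0, 1.0).detach()     # keep pixels in a valid range

# Usage (illustrative): any classifier mapping images to logits works.
# model = MyClassifier(); x, y = batch
# x_adv = fgsm_perturb(model, x, y)
# misclassified = (model(x_adv).argmax(1) != y)
```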
no code implementations • 21 Dec 2020 • Maurizio Capra, Beatrice Bussolino, Alberto Marchisio, Guido Masera, Maurizio Martina, Muhammad Shafique
Currently, Machine Learning (ML) is becoming ubiquitous in everyday life.
1 code implementation • 19 Aug 2020 • Alberto Marchisio, Andrea Massa, Vojtech Mrazek, Beatrice Bussolino, Maurizio Martina, Muhammad Shafique
Deep Neural Networks (DNNs) have made significant strides toward reaching the accuracy required for deployment in a wide variety of Machine Learning (ML) applications.
no code implementations • 16 May 2020 • Valerio Venceslai, Alberto Marchisio, Ihsen Alouani, Maurizio Martina, Muhammad Shafique
Due to their proven efficiency, machine-learning systems are deployed in a wide range of complex real-life problems.
1 code implementation • 16 May 2020 • Riccardo Massa, Alberto Marchisio, Maurizio Martina, Muhammad Shafique
Towards the conversion from a DNN to an SNN, we perform a comprehensive analysis of such a process, specifically designed for the Intel Loihi, and show our methodology for designing an SNN that achieves nearly the same accuracy as its corresponding DNN.
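As a rough, framework-agnostic illustration of rate-based DNN-to-SNN conversion (not the Loihi-specific methodology of the paper), the sketch below simulates one integrate-and-fire layer whose output firing rates approximate the ReLU activations of the corresponding DNN layer; all names, weights, and constants are assumptions.

```python
import numpy as np

def if_layer_rates(weights, in_rates, timesteps=100, v_thresh=1.0):
    """Simulate one integrate-and-fire layer driven by Bernoulli input spikes
    at the given rates, and return output firing rates; in rate-based
    DNN-to-SNN conversion, these rates approximate the ReLU activations."""
    rng = np.random.default_rng(0)
    v = np.zeros(weights.shape[0])                  # membrane potentials
    spikes_out = np.zeros(weights.shape[0])
    for _ in range(timesteps):
        in_spikes = (rng.random(weights.shape[1]) < in_rates).astype(float)
        v += weights @ in_spikes                    # integrate weighted input spikes
        fired = v >= v_thresh
        spikes_out += fired
        v[fired] -= v_thresh                        # soft reset preserves residual charge
    return spikes_out / timesteps                   # output firing rates

w = np.array([[0.8, 0.2], [0.1, 0.9]])
print(if_layer_rates(w, in_rates=np.array([0.5, 0.3])))
```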
no code implementations • 15 Apr 2020 • Alberto Marchisio, Beatrice Bussolino, Alessio Colucci, Maurizio Martina, Guido Masera, Muhammad Shafique
Capsule Networks (CapsNets), recently proposed by the Google Brain team, have superior learning capabilities in machine learning tasks, like image classification, compared to the traditional CNNs.
1 code implementation • 24 May 2019 • Alberto Marchisio, Beatrice Bussolino, Alessio Colucci, Muhammad Abdullah Hanif, Maurizio Martina, Guido Masera, Muhammad Shafique
The goal is to reduce the hardware requirements of CapsNets by removing unused/redundant connections and capsules, while maintaining high accuracy by testing different learning-rate policies and batch sizes.
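To give a flavor of the pruning step in isolation, here is a generic magnitude-based pruning sketch; it operates on individual weights rather than whole capsules, and the function name, sparsity level, and mask-based formulation are illustrative assumptions rather than the paper's method.

```python
import torch

def magnitude_prune(weight, sparsity=0.5):
    """Zero out the smallest-magnitude fraction of weights and return the
    pruned tensor together with the binary mask that was applied."""
    k = int(weight.numel() * sparsity)
    if k == 0:
        return weight, torch.ones_like(weight)
    threshold = weight.abs().flatten().kthvalue(k).values
    mask = (weight.abs() > threshold).float()
    return weight * mask, mask

w = torch.randn(4, 4)
pruned, mask = magnitude_prune(w, sparsity=0.75)
print(mask.mean())   # fraction of weights kept (~0.25)
```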
no code implementations • 4 Feb 2019 • Alberto Marchisio, Giorgio Nanfa, Faiq Khalid, Muhammad Abdullah Hanif, Maurizio Martina, Muhammad Shafique
We perform an in-depth evaluation for a Spiking Deep Belief Network (SDBN) and a DNN having the same number of layers and neurons (to obtain a fair comparison), in order to study the efficiency of our methodology and to understand the differences between SNNs and DNNs w.r.t.
no code implementations • 28 Jan 2019 • Alberto Marchisio, Giorgio Nanfa, Faiq Khalid, Muhammad Abdullah Hanif, Maurizio Martina, Muhammad Shafique
Capsule Networks preserve the hierarchical spatial relationships between objects, and thereby bear the potential to surpass the performance of traditional Convolutional Neural Networks (CNNs) in tasks like image classification.
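As a small, self-contained illustration of the capsule idea, the sketch below implements the squashing nonlinearity from Sabour et al. (2017), in which a capsule's vector length encodes the probability that an entity exists while its orientation encodes the pose; the tensor shapes and the `squash` helper are illustrative assumptions.

```python
import torch

def squash(s, dim=-1, eps=1e-8):
    """Capsule squashing nonlinearity: shrinks short vectors toward zero and
    long vectors toward unit length, so the norm can be read as an existence
    probability while the orientation keeps the pose information."""
    sq_norm = (s ** 2).sum(dim=dim, keepdim=True)
    scale = sq_norm / (1.0 + sq_norm)
    return scale * s / torch.sqrt(sq_norm + eps)

# Example: a batch of 3 capsules with 8-dimensional pose vectors.
caps = torch.randn(3, 8)
v = squash(caps)
print(v.norm(dim=-1))   # norms lie in (0, 1)
```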