Search Results for author: Abbas Rahimi

Found 28 papers, 12 papers with code

12 mJ per Class On-Device Online Few-Shot Class-Incremental Learning

1 code implementation • 12 Mar 2024 • Yoga Esa Wibowo, Cristian Cioflan, Thorir Mar Ingolfsson, Michael Hersche, Leo Zhao, Abbas Rahimi, Luca Benini

In this work, we introduce Online Few-Shot Class-Incremental Learning (O-FSCIL), based on a lightweight model consisting of a pretrained and metalearned feature extractor and an expandable explicit memory storing the class prototypes.

Few-Shot Class-Incremental Learning • Incremental Learning
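
The expandable explicit memory of class prototypes can be illustrated with a minimal sketch (assuming a frozen, pretrained feature extractor; the class name `PrototypeMemory` and all shapes are illustrative, not the paper's code):

```python
import numpy as np

class PrototypeMemory:
    """Explicit memory holding one prototype vector per class (illustrative sketch)."""

    def __init__(self, dim):
        self.dim = dim
        self.prototypes = {}  # class_id -> normalized mean embedding

    def add_class(self, class_id, support_embeddings):
        # A new class only requires averaging its few support embeddings,
        # so the memory grows by one fixed-width vector per class.
        proto = np.mean(support_embeddings, axis=0)
        self.prototypes[class_id] = proto / np.linalg.norm(proto)

    def classify(self, embedding):
        # Nearest-prototype rule: highest cosine similarity wins.
        embedding = embedding / np.linalg.norm(embedding)
        return max(self.prototypes, key=lambda c: self.prototypes[c] @ embedding)

# Usage with a hypothetical frozen feature extractor `embed`:
#   memory.add_class(new_class, np.stack([embed(x) for x in support_set]))
#   predicted = memory.classify(embed(query))
```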

Zero-shot Classification using Hyperdimensional Computing

no code implementations • 30 Jan 2024 • Samuele Ruffino, Geethan Karunaratne, Michael Hersche, Luca Benini, Abu Sebastian, Abbas Rahimi

Classification based on Zero-shot Learning (ZSL) is the ability of a model to classify inputs into novel classes on which the model has not previously seen any training examples.

Attribute • Attribute Extraction • +2
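
As a rough illustration of attribute-based zero-shot classification in general (a generic sketch, not this paper's hyperdimensional method): an input is mapped to an attribute vector and assigned to the unseen class whose attribute signature is most similar.

```python
import numpy as np

def zero_shot_classify(attr_prediction, class_signatures):
    """Assign the input to the unseen class whose attribute signature best
    matches the predicted attribute vector (cosine similarity)."""
    def norm(v):
        return v / np.linalg.norm(v)
    scores = {c: norm(sig) @ norm(attr_prediction) for c, sig in class_signatures.items()}
    return max(scores, key=scores.get)

# class_signatures describe classes never seen during training, e.g.
#   {"zebra": np.array([1.0, 1.0, 0.0]), "whale": np.array([0.0, 0.0, 1.0])}
# attr_prediction comes from a model trained only on seen classes.
```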

Probabilistic Abduction for Visual Abstract Reasoning via Learning Rules in Vector-symbolic Architectures

1 code implementation • 29 Jan 2024 • Michael Hersche, Francesco Di Stefano, Thomas Hofmann, Abu Sebastian, Abbas Rahimi

Abstract reasoning is a cornerstone of human intelligence, and replicating it with artificial intelligence (AI) presents an ongoing challenge.

Attribute

TCNCA: Temporal Convolution Network with Chunked Attention for Scalable Sequence Processing

no code implementations • 9 Dec 2023 • Aleksandar Terzic, Michael Hersche, Geethan Karunaratne, Luca Benini, Abu Sebastian, Abbas Rahimi

We build upon their approach by replacing the linear recurrence with a special temporal convolutional network which permits a larger receptive field with shallower networks and reduces the computational complexity to $O(L)$.

Language Modelling
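
The receptive-field argument can be sketched with a small stack of dilated causal 1-D convolutions (a generic TCN-style sketch in PyTorch; layer sizes are illustrative and not the TCNCA configuration): each layer costs O(L) in the sequence length, while doubling dilations grow the receptive field exponentially with depth.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalConvStack(nn.Module):
    """Stack of dilated causal 1-D convolutions; per-layer cost is O(L)."""

    def __init__(self, channels=64, kernel_size=3, num_layers=4):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.Conv1d(channels, channels, kernel_size, dilation=2 ** i)
            for i in range(num_layers)
        )
        # Receptive field: 1 + (k - 1) * sum(2^i) -- exponential in depth.
        self.receptive_field = 1 + (kernel_size - 1) * sum(2 ** i for i in range(num_layers))

    def forward(self, x):  # x: (batch, channels, length)
        for conv in self.layers:
            pad = (conv.kernel_size[0] - 1) * conv.dilation[0]
            x = F.relu(conv(F.pad(x, (pad, 0))))  # left-pad only -> causal
        return x
```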

MIMONets: Multiple-Input-Multiple-Output Neural Networks Exploiting Computation in Superposition

1 code implementation • NeurIPS 2023 • Nicolas Menet, Michael Hersche, Geethan Karunaratne, Luca Benini, Abu Sebastian, Abbas Rahimi

MIMONets augment various deep neural network architectures with variable binding mechanisms to represent an arbitrary number of inputs in a compositional data structure via fixed-width distributed representations.
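
The variable-binding idea can be sketched independently of any network (a minimal holographic-style sketch with random bipolar keys; it is not the MIMONets implementation): several inputs are bound to keys, bundled into one fixed-width vector, and later approximately recovered by unbinding.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, num_inputs = 4096, 3

inputs = rng.standard_normal((num_inputs, dim))
keys = rng.choice([-1.0, 1.0], size=(num_inputs, dim))  # random bipolar binding keys

# Bind each input with its key (element-wise) and superpose into one fixed-width vector.
superposition = np.sum(keys * inputs, axis=0)

# Unbinding with key i recovers input i plus crosstalk noise from the other inputs;
# the similarity stays well above chance (~1/sqrt(dim)) in high dimensions.
for i in range(num_inputs):
    estimate = keys[i] * superposition
    cos = estimate @ inputs[i] / (np.linalg.norm(estimate) * np.linalg.norm(inputs[i]))
    print(f"input {i}: cosine similarity of recovered estimate = {cos:.2f}")
```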

Model-Driven Engineering for Artificial Intelligence -- A Systematic Literature Review

1 code implementation • 10 Jul 2023 • Simon Raedler, Luca Berardinelli, Karolin Winter, Abbas Rahimi, Stefanie Rinderle-Ma

Objective: This study aims to investigate the existing body of knowledge in the field of Model-Driven Engineering (MDE) in support of AI (MDE4AI), to further sharpen future research and define the current state of the art.

Factorizers for Distributed Sparse Block Codes

no code implementations • 24 Mar 2023 • Michael Hersche, Aleksandar Terzic, Geethan Karunaratne, Jovin Langenegger, Angéline Pouget, Giovanni Cherubini, Luca Benini, Abu Sebastian, Abbas Rahimi

We provide a methodology to flexibly integrate our factorizer in the classification layer of CNNs with a novel loss function.

Attribute

In-memory factorization of holographic perceptual representations

1 code implementation • 9 Nov 2022 • Jovin Langenegger, Geethan Karunaratne, Michael Hersche, Luca Benini, Abu Sebastian, Abbas Rahimi

Disentanglement of constituent factors of a sensory signal is central to perception and cognition and hence is a critical task for future artificial intelligence systems.

Disentanglement

In-memory Realization of In-situ Few-shot Continual Learning with a Dynamically Evolving Explicit Memory

no code implementations • 14 Jul 2022 • Geethan Karunaratne, Michael Hersche, Jovin Langenegger, Giovanni Cherubini, Manuel Le Gallo-Bourdeau, Urs Egger, Kevin Brew, Sam Choi, Injo Ok, Mary Claire Silvestre, Ning Li, Nicole Saulnier, Victor Chan, Ishtiaq Ahsan, Vijay Narayanan, Luca Benini, Abu Sebastian, Abbas Rahimi

We demonstrate for the first time how the EM unit can physically superpose multiple training examples, expand to accommodate unseen classes, and perform similarity search during inference, using operations on an IMC core based on phase-change memory (PCM).

Continual Learning

Constrained Few-shot Class-incremental Learning

2 code implementations • CVPR 2022 • Michael Hersche, Geethan Karunaratne, Giovanni Cherubini, Luca Benini, Abu Sebastian, Abbas Rahimi

Moreover, such learning must respect certain memory and computational constraints: (i) training samples are limited to only a few per class, (ii) the computational cost of learning a novel class remains constant, and (iii) the memory footprint of the model grows at most linearly with the number of classes observed.

continual few-shot learning • Few-Shot Class-Incremental Learning • +1

Generalized Key-Value Memory to Flexibly Adjust Redundancy in Memory-Augmented Networks

no code implementations • 11 Mar 2022 • Denis Kleyko, Geethan Karunaratne, Jan M. Rabaey, Abu Sebastian, Abbas Rahimi

Memory-augmented neural networks enhance a neural network with an external key-value memory whose complexity is typically dominated by the number of support vectors in the key memory.
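
A bare-bones key-value memory read (a generic memory-augmented-network sketch, not the paper's generalized variant) makes the complexity point explicit: each query is compared against every stored key, so the read cost scales with the number of support vectors in the key memory.

```python
import numpy as np

def kv_memory_read(query, keys, values, beta=10.0):
    """Soft read: similarity against every stored key, then a weighted sum of values.
    keys: (n_support, d_key), values: (n_support, d_value) -> cost grows with n_support."""
    scores = keys @ query                              # one similarity per support vector
    weights = np.exp(beta * (scores - scores.max()))   # stable softmax weighting
    weights /= weights.sum()
    return weights @ values

# Example: 5 support vectors with 16-dim keys and one-hot class values.
rng = np.random.default_rng(1)
keys = rng.standard_normal((5, 16))
values = np.eye(5)
print(kv_memory_read(keys[2], keys, values))  # puts most weight on value 2
```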

A Neuro-vector-symbolic Architecture for Solving Raven's Progressive Matrices

1 code implementation • 9 Mar 2022 • Michael Hersche, Mustafa Zeqiri, Luca Benini, Abu Sebastian, Abbas Rahimi

Compared to state-of-the-art deep neural network and neuro-symbolic approaches, end-to-end training of NVSA achieves a new record of 87.7% average accuracy on the RAVEN dataset and 88.1% on I-RAVEN.

Logical Reasoning

A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part II: Applications, Cognitive Models, and Challenges

no code implementations • 12 Nov 2021 • Denis Kleyko, Dmitri A. Rachkovskij, Evgeny Osipov, Abbas Rahimi

This is Part II of the two-part comprehensive survey devoted to a computing framework most commonly known under the names Hyperdimensional Computing and Vector Symbolic Architectures (HDC/VSA).

A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part I: Models and Data Transformations

no code implementations • 11 Nov 2021 • Denis Kleyko, Dmitri A. Rachkovskij, Evgeny Osipov, Abbas Rahimi

Both names refer to a family of computational models that use high-dimensional distributed representations and rely on the algebraic properties of their key operations to incorporate the advantages of structured symbolic representations and vector distributed representations.

Electrical Engineering
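
The algebraic properties referred to here can be shown in a few lines (a generic bipolar VSA sketch, one of several models covered by the survey): bundling (element-wise majority) keeps the result similar to each constituent, while binding (element-wise multiplication) produces a vector dissimilar to both operands yet exactly invertible.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10_000
a, b, c = rng.choice([-1, 1], size=(3, d))  # random bipolar hypervectors

sim = lambda x, y: (x @ y) / d  # normalized similarity in [-1, 1]

bundle = np.sign(a + b + c)   # bundling: stays similar to each constituent
bound = a * b                 # binding: dissimilar to both operands...
unbound = bound * b           # ...but exactly invertible, since b * b = 1

print(sim(bundle, a))   # ~0.5, far above the ~0.01 level of unrelated vectors
print(sim(bound, a))    # ~0.0
print(sim(unbound, a))  # 1.0
```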

Binarization Methods for Motor-Imagery Brain-Computer Interface Classification

no code implementations • 14 Oct 2020 • Michael Hersche, Luca Benini, Abbas Rahimi

Our first method, based on sparse bipolar random projection, projects a large number of real-valued Riemannian covariance features to a binary space, where a linear SVM classifier can be learned with binary weights too.

Binarization • Classification • +2
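
The first step can be sketched as follows (a generic sparse bipolar random-projection sketch; the dimensions and sparsity level are illustrative, not the paper's settings): real-valued features are projected with a sparse {−1, 0, +1} matrix and thresholded to obtain binary features for a binary-weight classifier.

```python
import numpy as np

rng = np.random.default_rng(0)
n_features, n_binary, sparsity = 512, 8192, 0.9  # illustrative sizes

# Sparse bipolar projection matrix: mostly zeros, remaining entries are +/-1.
proj = rng.choice([0.0, 1.0, -1.0],
                  size=(n_binary, n_features),
                  p=[sparsity, (1 - sparsity) / 2, (1 - sparsity) / 2])

def binarize(features):
    """Project real-valued features and threshold at zero -> {0, 1} vector."""
    return (proj @ features > 0).astype(np.uint8)

x = rng.standard_normal(n_features)  # stand-in for the real-valued covariance features
print(binarize(x)[:16])
```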

Robust High-dimensional Memory-augmented Neural Networks

no code implementations • 5 Oct 2020 • Geethan Karunaratne, Manuel Schmuck, Manuel Le Gallo, Giovanni Cherubini, Luca Benini, Abu Sebastian, Abbas Rahimi

Traditional neural networks require enormous amounts of data to build their complex mappings during a slow training procedure that hinders their abilities for relearning and adapting to new data.

Few-Shot Image Classification • Vocal Bursts Intensity Prediction

In-memory hyperdimensional computing

no code implementations • 4 Jun 2019 • Geethan Karunaratne, Manuel Le Gallo, Giovanni Cherubini, Luca Benini, Abbas Rahimi, Abu Sebastian

Hyperdimensional computing (HDC) is an emerging computational framework that takes inspiration from attributes of neuronal circuits such as hyperdimensionality, fully distributed holographic representation, and (pseudo)randomness.

Attribute • Classification • +4

Hyperdimensional Computing Nanosystem

no code implementations • 23 Nov 2018 • Abbas Rahimi, Tony F. Wu, Haitong Li, Jan M. Rabaey, H.-S. Philip Wong, Max M. Shulaker, Subhasish Mitra

By exploiting the unique properties of the underlying nanotechnologies, we show that HD computing, when implemented with monolithic 3D integration, can be up to 420X more energy-efficient while using 25X less area compared to traditional silicon CMOS implementations.

One-shot Learning for iEEG Seizure Detection Using End-to-end Binary Operations: Local Binary Patterns with Hyperdimensional Computing

no code implementations • 6 Sep 2018 • Alessio Burrello, Kaspar Schindler, Luca Benini, Abbas Rahimi

This paper presents an efficient binarized algorithm for both learning and classification of human epileptic seizures from intracranial electroencephalography (iEEG).

One-Shot Learning • Seizure Detection • +3
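
The local-binary-pattern front end can be illustrated on a 1-D signal (a generic 1-D LBP sketch; the paper's exact neighbourhood and code length may differ): each sample is compared against its neighbours and the comparison bits form a small integer code, so the whole feature extraction stays binary.

```python
import numpy as np

def lbp_1d(signal, radius=3):
    """1-D local binary pattern: compare each centre sample with `radius`
    neighbours on each side and pack the comparison bits into an integer code."""
    codes = []
    for t in range(radius, len(signal) - radius):
        neighbours = np.concatenate([signal[t - radius:t], signal[t + 1:t + radius + 1]])
        bits = (neighbours > signal[t]).astype(int)
        codes.append(int("".join(map(str, bits)), 2))
    return np.array(codes)

# Toy iEEG-like signal; in practice the LBP codes are fed to an HD classifier.
signal = np.sin(np.linspace(0, 8 * np.pi, 256)) \
    + 0.1 * np.random.default_rng(0).standard_normal(256)
print(lbp_1d(signal)[:10])
```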

Hardware Optimizations of Dense Binary Hyperdimensional Computing: Rematerialization of Hypervectors, Binarized Bundling, and Combinational Associative Memory

1 code implementation • 20 Jul 2018 • Manuel Schmuck, Luca Benini, Abbas Rahimi

In this paper, we propose hardware techniques for optimizations of HD computing, in a synthesizable VHDL library, to enable co-located implementation of both learning and classification tasks on only a small portion of Xilinx(R) UltraScale(TM) FPGAs: (1) We propose simple logical operations to rematerialize the hypervectors on the fly rather than loading them from memory.
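
The rematerialization idea can be sketched as regenerating item hypervectors on demand from a single stored seed with a cheap deterministic update, instead of reading them all from memory (the elementary cellular-automaton-style XOR update below is used purely as an illustration; whether it matches the paper's exact logic is an assumption):

```python
import numpy as np

def rematerialize(seed, index):
    """Regenerate the hypervector for `index` from one stored seed vector by
    repeatedly applying a cheap local update (XOR of circularly shifted copies),
    so only the seed needs to be kept in memory."""
    hv = seed.copy()
    for _ in range(index):
        hv = np.roll(hv, 1) ^ np.roll(hv, -1)  # rule-90-style neighbour XOR
    return hv

rng = np.random.default_rng(0)
seed = rng.integers(0, 2, size=10_000, dtype=np.uint8)  # the only stored hypervector

hv3_a = rematerialize(seed, 3)  # recomputed on the fly...
hv3_b = rematerialize(seed, 3)  # ...deterministically, whenever it is needed
assert np.array_equal(hv3_a, hv3_b)
```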

Fast and Accurate Multiclass Inference for MI-BCIs Using Large Multiscale Temporal and Spectral Features

2 code implementations • 18 Jun 2018 • Michael Hersche, Tino Rellstab, Pasquale Davide Schiavone, Lukas Cavigelli, Luca Benini, Abbas Rahimi

Accurate, fast, and reliable multiclass classification of electroencephalography (EEG) signals is a challenging task towards the development of motor imagery brain-computer interface (MI-BCI) systems.

Classification • EEG • +1

Autoscaling Bloom Filter: Controlling Trade-off Between True and False Positives

1 code implementation • 10 May 2017 • Denis Kleyko, Abbas Rahimi, Ross W. Gayler, Evgeny Osipov

A Bloom filter is a simple data structure supporting membership queries on a set.

Data Structures and Algorithms
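
For reference, a plain (non-autoscaling) Bloom filter with its one-sided error takes only a few lines (a standard textbook sketch, not the autoscaling variant proposed here): membership queries never miss an inserted element, but may report false positives.

```python
import hashlib

class BloomFilter:
    """Plain Bloom filter: k hash positions per element in an m-bit array."""

    def __init__(self, m=1024, k=4):
        self.m, self.k = m, k
        self.bits = [False] * m

    def _positions(self, item):
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.m

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = True

    def __contains__(self, item):
        # True positives are guaranteed; a True for an un-inserted item is a false positive.
        return all(self.bits[pos] for pos in self._positions(item))

bf = BloomFilter()
bf.add("paper-1234")
print("paper-1234" in bf)  # True (never a false negative)
print("paper-9999" in bf)  # almost surely False at this load
```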
