Search Results for author: Saeid Haghighatshoar

Found 10 papers, 2 papers with code

Exact Gradient Computation for Spiking Neural Networks Through Forward Propagation

1 code implementation • 18 Oct 2022 • Jane H. Lee, Saeid Haghighatshoar, Amin Karbasi

… their weights, and (2) we propose a novel training algorithm, called \emph{forward propagation} (FP), that computes exact gradients for SNNs.
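The snippet above concerns differentiating through spiking dynamics. As a point of reference only, here is a generic leaky integrate-and-fire (LIF) forward pass in Python; it is NOT the paper's FP algorithm, just a minimal sketch of the state (membrane potential, spike times, reset) that an exact-gradient method must differentiate through. All names and parameter values are illustrative assumptions.

```python
import numpy as np

def lif_forward(weights, inputs, threshold=1.0, leak=0.9):
    """Simulate one leaky integrate-and-fire neuron; return spike times.

    Generic LIF dynamics (illustrative only, not the authors' method):
    leaky integration of weighted input, threshold crossing, hard reset.
    """
    v = 0.0
    spike_times = []
    for t, x in enumerate(inputs):
        v = leak * v + float(np.dot(weights, x))  # leaky integration
        if v >= threshold:                        # spike when threshold crossed
            spike_times.append(t)
            v = 0.0                               # hard reset after spike
    return spike_times

rng = np.random.default_rng(0)
w = np.array([0.6, 0.4])
xs = rng.random((20, 2))
print(lif_forward(w, xs))  # spike times depend (discontinuously) on w
```

The spike times change discontinuously as `weights` vary, which is exactly why naive backpropagation struggles and why exact spike-time gradients are nontrivial.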

EXODUS: Stable and Efficient Training of Spiking Neural Networks

1 code implementation • 20 May 2022 • Felix Christian Bauer, Gregor Lenz, Saeid Haghighatshoar, Sadique Sheik

In this paper, (i) we modify SLAYER and design an algorithm called EXODUS that accounts for the neuron reset mechanism and applies the Implicit Function Theorem (IFT) to calculate the correct gradients (equivalent to those computed by BPTT); (ii) we eliminate the need for ad-hoc scaling of gradients, thus reducing the training complexity tremendously; (iii) we demonstrate, via computer simulations, that EXODUS is numerically stable and achieves comparable or better performance than SLAYER, especially on tasks in which SNNs rely on temporal features.
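The IFT idea mentioned above can be illustrated on a toy fixed-point problem (this is not EXODUS itself, only a sketch of the same principle): for a scalar fixed point z* = tanh(w·z* + x), the IFT gives dz*/dw in closed form, without differentiating through the unrolled iteration. The function and values below are assumptions for illustration.

```python
import numpy as np

def fixed_point(w, x, iters=100):
    """Solve z = tanh(w*z + x) by simple iteration (contractive for |w| < 1)."""
    z = 0.0
    for _ in range(iters):
        z = np.tanh(w * z + x)
    return z

def ift_grad(w, x):
    """dz*/dw via the Implicit Function Theorem: df/dw / (1 - df/dz)."""
    z = fixed_point(w, x)
    s = 1.0 - z ** 2                  # derivative of tanh(u) at u = w*z + x
    return (z * s) / (1.0 - w * s)

w, x = 0.5, 0.3
eps = 1e-6
fd = (fixed_point(w + eps, x) - fixed_point(w - eps, x)) / (2 * eps)
print(ift_grad(w, x), fd)             # the two gradients agree closely
```

The IFT gradient matches a finite-difference check without ever backpropagating through the 100 iterations, analogous to how EXODUS obtains BPTT-equivalent gradients without ad-hoc unrolled terms.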

Machine Learning for Geometrically-Consistent Angular Spread Function Estimation in Massive MIMO

no code implementations • 30 Oct 2019 • Yi Song, Mahdi Barzegar Khalilsarai, Saeid Haghighatshoar, Giuseppe Caire

The modern literature on massive MIMO has recognized that knowledge of the covariance matrix of user channel vectors is very useful for various applications such as hybrid digital-analog beamforming, pilot decontamination, etc.


Multiple Measurement Vectors Problem: A Decoupling Property and its Applications

no code implementations • 31 Oct 2018 • Saeid Haghighatshoar, Giuseppe Caire

Although there is a vast literature on the analysis of MMV, it is not yet fully known how the number of signal samples and their statistical correlations affect the performance of the joint estimation in MMV.

Multi-Band Covariance Interpolation with Applications in Massive MIMO

no code implementations • 11 Jan 2018 • Saeid Haghighatshoar, Mahdi Barzegar Khalilsarai, Giuseppe Caire

In this paper, we show that although this effect is generally negligible for a small number of antennas $M$, it results in a considerable distortion of the covariance matrix, and especially of its dominant signal subspace, in the massive MIMO regime where $M \to \infty$. This can incur a serious degradation of performance, especially in frequency division duplexing (FDD) massive MIMO systems, where the uplink (UL) and downlink (DL) communication occur over different frequency bands.

Signal Recovery from Unlabeled Samples

no code implementations • 30 Jan 2017 • Saeid Haghighatshoar, Giuseppe Caire

In this paper, we study the recovery of a signal from a set of noisy linear projections (measurements) when such projections are unlabeled, that is, when the correspondence between the measurements and the set of projection vectors (i.e., the rows of the measurement matrix) is not known a priori.
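The unlabeled-sensing setup described above can be made concrete with a toy noiseless example. The sketch below is NOT the paper's recovery method; it simply brute-forces all row permutations of a tiny system and keeps the least-squares fit with the smallest residual, to show that the shuffled correspondence can in principle be undone. Dimensions and values are illustrative assumptions.

```python
import itertools
import numpy as np

# Toy unlabeled sensing instance: y is a shuffled version of A @ x.
rng = np.random.default_rng(1)
m, n = 5, 2
A = rng.standard_normal((m, n))
x_true = np.array([1.5, -0.7])
perm_true = rng.permutation(m)
y = (A @ x_true)[perm_true]           # measurements with unknown labels

# Brute-force search over permutations (feasible only for tiny m).
best = None
for perm in itertools.permutations(range(m)):
    A_perm = A[list(perm)]
    x_hat, *_ = np.linalg.lstsq(A_perm, y, rcond=None)
    err = float(np.linalg.norm(A_perm @ x_hat - y))
    if best is None or err < best[0]:
        best = (err, np.asarray(x_hat))

print(best[1])  # matches x_true in the noiseless, generic case
```

With generic Gaussian rows, only the true permutation yields a zero residual, so the exhaustive search recovers `x_true`; the paper's interest is in when and how this is possible at scale and under noise, where brute force is hopeless.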

Channel Vector Subspace Estimation from Low-Dimensional Projections

no code implementations • 24 Sep 2015 • Saeid Haghighatshoar, Giuseppe Caire

Massive MIMO is a variant of multiuser MIMO where the number of base-station antennas $M$ is very large (typically 100), and generally much larger than the number of spatially multiplexed data streams (typically 10).

Multi Terminal Probabilistic Compressed Sensing

no code implementations • 11 Jan 2014 • Saeid Haghighatshoar

It is observed that by spatially coupling the measurement matrices, the rate-distortion curve of the MAMP algorithm undergoes a phase transition, where the measurement-rate region corresponding to a low-distortion (approximately zero distortion) regime is fully characterized by the joint and conditional Rényi information dimension (RID) of the multi-terminal source.

A Fast Hadamard Transform for Signals with Sub-linear Sparsity in the Transform Domain

no code implementations • 7 Oct 2013 • Robin Scheibler, Saeid Haghighatshoar, Martin Vetterli

A new iterative low-complexity algorithm is presented for computing the Walsh-Hadamard transform (WHT) of an $N$-dimensional signal with a $K$-sparse WHT, where $N$ is a power of two and $K = O(N^\alpha)$ scales sub-linearly in $N$ for some $0 < \alpha < 1$.
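For context, the sparsity structure the paper exploits can be demonstrated with the standard dense fast WHT. The butterfly implementation below runs in $O(N \log N)$ time; it is NOT the paper's sub-linear algorithm, only a reference transform used here to build and recover a $K$-sparse WHT spectrum. Index choices are illustrative assumptions.

```python
import numpy as np

def fwht(a):
    """Standard in-place fast Walsh-Hadamard transform, O(N log N).

    Dense reference implementation; the paper's contribution is a
    *sub-linear* algorithm exploiting K = O(N^alpha) spectral sparsity,
    which this butterfly does not attempt.
    """
    a = np.array(a, dtype=float)
    n = len(a)
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                a[j], a[j + h] = a[j] + a[j + h], a[j] - a[j + h]
        h *= 2
    return a

# Build a length-N signal whose WHT has only K = 3 nonzero coefficients.
N = 64
spec = np.zeros(N)
spec[[3, 17, 42]] = [5.0, -2.0, 1.0]
signal = fwht(spec) / N       # the unnormalized WHT is self-inverse up to N

recovered = fwht(signal)
print(np.nonzero(np.abs(recovered) > 1e-9)[0])  # -> [ 3 17 42]
```

The dense transform touches all $N$ samples; a sub-linear algorithm must locate those three coefficients while reading only $o(N)$ samples of `signal`, which is the problem the paper solves.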
