Search Results for author: Sina Alemohammad

Found 8 papers, 5 papers with code

An Adaptive Tangent Feature Perspective of Neural Networks

1 code implementation • 29 Aug 2023 • Daniel LeJeune, Sina Alemohammad

In order to better understand feature learning in neural networks, we propose a framework for understanding linear models in tangent feature space where the features are allowed to be transformed during training.
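
For context, the tangent features of a model are the parameter gradients of its outputs, and a linear model in those features is the standard NTK linearization. Below is a minimal numpy sketch of fixed tangent features for a one-hidden-layer ReLU network; the paper's actual contribution, letting the features transform during training, is not reproduced here, and all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny one-hidden-layer ReLU network: f(x) = v . relu(W @ x)
d, m = 3, 16                      # input dim, hidden width
W = rng.normal(size=(m, d)) / np.sqrt(d)
v = rng.normal(size=m) / np.sqrt(m)

def tangent_features(x):
    """phi(x) = grad_theta f(x; theta), flattened over (W, v)."""
    h = W @ x
    a = np.maximum(h, 0.0)        # relu(h)
    da = (h > 0).astype(float)    # relu'(h)
    grad_W = np.outer(v * da, x)  # df/dW
    grad_v = a                    # df/dv
    return np.concatenate([grad_W.ravel(), grad_v])

# Linear model in (fixed) tangent feature space, fit by least squares.
X = rng.normal(size=(20, d))
y = np.sin(X @ np.ones(d))                        # toy targets
Phi = np.stack([tangent_features(x) for x in X])  # (n, p) feature matrix
theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print("train MSE:", np.mean((Phi @ theta - y) ** 2))
```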

Self-Consuming Generative Models Go MAD

no code implementations • 4 Jul 2023 • Sina Alemohammad, Josue Casco-Rodriguez, Lorenzo Luzi, Ahmed Imtiaz Humayun, Hossein Babaei, Daniel LeJeune, Ali Siahkoohi, Richard G. Baraniuk

Seismic advances in generative AI algorithms for imagery, text, and other data types have led to the temptation to use synthetic data to train next-generation models.
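
A toy illustration (not the paper's experiments) of a fully synthetic self-consuming loop: repeatedly fit a Gaussian to samples drawn from the previous generation's fit. Without fresh real data, the estimated parameters drift and the variance tends to decay over generations, a simple analogue of the degradation the paper studies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generation 0: "real" data from N(0, 1).
data = rng.normal(loc=0.0, scale=1.0, size=100)

for gen in range(15):
    # Fit a Gaussian to the current training set (maximum likelihood).
    mu, sigma = data.mean(), data.std()
    print(f"gen {gen}: mu={mu:+.3f}, sigma={sigma:.3f}")
    # Train the next generation purely on synthetic samples from this fit.
    data = rng.normal(loc=mu, scale=sigma, size=100)
```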

NeuroView-RNN: It's About Time

no code implementations • 23 Feb 2022 • CJ Barberan, Sina Alemohammad, Naiming Liu, Randall Balestriero, Richard G. Baraniuk

A key interpretability issue with RNNs is that it is not clear how each hidden state per time step contributes to the decision-making process in a quantitative manner.

Decision Making • Time Series • +1
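
A minimal numpy sketch of the general idea in the abstract: attach a linear readout to the hidden state at every time step so each step's additive vote toward the final logits can be inspected. The architecture and shared readout here are illustrative guesses, not the exact NeuroView-RNN construction.

```python
import numpy as np

rng = np.random.default_rng(0)

d, h, C, T = 4, 8, 3, 6   # input dim, hidden dim, classes, time steps
Wx = rng.normal(size=(h, d)) * 0.3
Wh = rng.normal(size=(h, h)) * 0.3
Wo = rng.normal(size=(C, h)) * 0.3   # shared per-step linear readout

x = rng.normal(size=(T, d))          # one input sequence
hidden = np.zeros(h)
per_step = []
for t in range(T):
    hidden = np.tanh(Wx @ x[t] + Wh @ hidden)
    per_step.append(Wo @ hidden)     # this time step's vote (logits)

per_step = np.stack(per_step)        # (T, C)
logits = per_step.mean(axis=0)       # decision = average of per-step votes
winner = logits.argmax()
print("prediction:", winner)
for t, v in enumerate(per_step):
    print(f"t={t}: contribution to winning class = {v[winner]:+.3f}")
```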

Covariate Balancing Methods for Randomized Controlled Trials Are Not Adversarially Robust

no code implementations • 25 Oct 2021 • Hossein Babaei, Sina Alemohammad, Richard Baraniuk

Covariate balancing methods increase the similarity between the distributions of the two groups' covariates.

Adversarial Attack
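
For context, one standard covariate balancing method is rerandomization: draw many candidate treatment assignments and keep the one whose groups have the most similar covariate means, e.g., smallest Mahalanobis distance. A minimal sketch with illustrative names; the paper's point is that such deterministic balance criteria can be exploited by an adversary, which this sketch does not demonstrate.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d = 40, 3
X = rng.normal(size=(n, d))          # subjects' covariates

def imbalance(assign):
    """Mahalanobis distance between treatment/control covariate means."""
    diff = X[assign == 1].mean(axis=0) - X[assign == 0].mean(axis=0)
    cov = np.cov(X, rowvar=False)
    return diff @ np.linalg.solve(cov, diff)

# Rerandomization: keep the most balanced of many random 50/50 splits.
best, best_score = None, np.inf
for _ in range(1000):
    assign = rng.permutation(np.repeat([0, 1], n // 2))
    score = imbalance(assign)
    if score < best_score:
        best, best_score = assign, score

print("best Mahalanobis imbalance:", round(best_score, 4))
```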

NFT-K: Non-Fungible Tangent Kernels

1 code implementation • 11 Oct 2021 • Sina Alemohammad, Hossein Babaei, CJ Barberan, Naiming Liu, Lorenzo Luzi, Blake Mason, Richard G. Baraniuk

To further contribute interpretability with respect to classification and the layers, we develop a new network as a combination of multiple neural tangent kernels, one modeling each layer of the deep neural network individually, in contrast to past work that attempts to represent the entire network via a single neural tangent kernel.
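
A minimal numpy sketch of the per-layer idea: for a two-layer network, the empirical tangent kernel splits into one Gram matrix per layer's parameters, K_l(x, x') = grad_{theta_l} f(x) . grad_{theta_l} f(x'), which can then be combined. The plain weighted sum below is an illustrative stand-in, not the paper's actual combination or training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

d, m, n = 3, 16, 10
W1 = rng.normal(size=(m, d)) / np.sqrt(d)   # layer 1 weights
w2 = rng.normal(size=m) / np.sqrt(m)        # layer 2 weights
X = rng.normal(size=(n, d))

def layer_grads(x):
    """Per-layer parameter gradients of f(x) = w2 . relu(W1 @ x)."""
    h = W1 @ x
    a = np.maximum(h, 0.0)
    da = (h > 0).astype(float)
    g1 = np.outer(w2 * da, x).ravel()  # grad wrt layer-1 weights
    g2 = a                             # grad wrt layer-2 weights
    return g1, g2

G1 = np.stack([layer_grads(x)[0] for x in X])
G2 = np.stack([layer_grads(x)[1] for x in X])
K1, K2 = G1 @ G1.T, G2 @ G2.T          # one tangent kernel per layer

alpha = np.array([0.5, 0.5])           # illustrative mixing weights
K = alpha[0] * K1 + alpha[1] * K2      # combined multi-kernel
print("combined kernel shape:", K.shape)
```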

Enhanced Recurrent Neural Tangent Kernels for Non-Time-Series Data

2 code implementations • 9 Dec 2020 • Sina Alemohammad, Randall Balestriero, Zichao Wang, Richard Baraniuk

Kernels derived from deep neural networks (DNNs) in the infinite-width regime provide not only high performance in a range of machine learning tasks but also new theoretical insights into DNN training dynamics and generalization.

Time Series • Time Series Analysis
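
For flavor, a minimal sketch of regression with an infinite-width kernel: the well-known closed-form NTK of a one-hidden-layer ReLU network (not the recurrent RNTK developed in the paper), plugged into kernel ridge regression on toy data.

```python
import numpy as np

def relu_ntk(X, Z):
    """Closed-form NTK of a one-hidden-layer ReLU network (not the RNTK)."""
    nx = np.linalg.norm(X, axis=1)[:, None]
    nz = np.linalg.norm(Z, axis=1)[None, :]
    cos = np.clip((X @ Z.T) / (nx * nz), -1.0, 1.0)
    theta = np.arccos(cos)
    k0 = (np.pi - theta) / (2 * np.pi)                       # E[relu'(u) relu'(v)]
    k1 = (np.sin(theta) + (np.pi - theta) * cos) / (2 * np.pi)  # E[relu(u) relu(v)] scale
    return (X @ Z.T) * k0 + nx * nz * k1

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))
y = np.sin(X[:, 0])                      # toy non-time-series targets
Xt = rng.normal(size=(5, 5))

# Kernel ridge regression with the NTK as the kernel.
K = relu_ntk(X, X)
alpha = np.linalg.solve(K + 1e-3 * np.eye(len(X)), y)
pred = relu_ntk(Xt, X) @ alpha
print("predictions:", np.round(pred, 3))
```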
