Search Results for author: Mohammadreza Soltani

Found 26 papers, 8 papers with code

Toward Data-Driven STAP Radar

no code implementations • 26 Jan 2022 • Shyam Venkatasubramanian, Chayut Wongkamthong, Mohammadreza Soltani, Bosung Kang, Sandeep Gogineni, Ali Pezeshki, Muralidhar Rangaswamy, Vahid Tarokh

In this regard, we will generate a large, representative adaptive radar signal processing database for training and testing, analogous in spirit to the COCO dataset for natural images.

Object Detection +1

On The Energy Statistics of Feature Maps in Pruning of Neural Networks with Skip-Connections

no code implementations • 26 Jan 2022 • Mohammadreza Soltani, Suya Wu, Yuerong Li, Jie Ding, Vahid Tarokh

We propose a new structured pruning framework for compressing Deep Neural Networks (DNNs) with skip connections, based on measuring the statistical dependency of hidden layers and predicted outputs.
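The dependency measure behind this line of work is an energy statistic. The toy below is a minimal sketch of the general idea, not the authors' exact criterion: it scores two hypothetical channels by the energy distance between the network's output distribution before and after zeroing each channel (the two-channel linear "network" and its weights are invented for illustration).

```python
import numpy as np

def energy_distance(x, y):
    """Empirical energy distance between two samples (rows = observations).

    E(X, Y) = 2 E||X - Y|| - E||X - X'|| - E||Y - Y'||; it is zero when the
    two samples coincide and grows as the distributions separate.
    """
    def mean_pdist(a, b):
        # mean pairwise Euclidean distance between rows of a and rows of b
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
        return d.mean()
    return 2 * mean_pdist(x, y) - mean_pdist(x, x) - mean_pdist(y, y)

rng = np.random.default_rng(0)
features = rng.normal(size=(200, 2))   # 200 samples, 2 channels
w = np.array([2.0, 0.01])              # channel 2 barely affects the output
outputs = features @ w

scores = []
for ch in range(2):
    pruned = features.copy()
    pruned[:, ch] = 0.0                # zero out one channel
    pruned_out = pruned @ w
    # energy distance between original and pruned output distributions:
    # a small score marks the channel as a pruning candidate
    scores.append(energy_distance(outputs[:, None], pruned_out[:, None]))
```

Under this score, the heavily weighted channel shifts the output distribution far more when removed, so it is kept while the near-inert channel is pruned.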

Benchmarking Data-driven Surrogate Simulators for Artificial Electromagnetic Materials

1 code implementation • NeurIPS 2021 • Yang Deng*, Juncheng Dong*, Simiao Ren*, Omar Khatib, Mohammadreza Soltani, Vahid Tarokh, Willie Padilla, Jordan Malof

Recently, it has been shown that deep learning can be an alternative solution to infer the relationship between an AEM geometry and its properties using a (relatively) small pool of CEMS data.

Benchmarking · Neural Network simulation

Task Affinity with Maximum Bipartite Matching in Few-Shot Learning

1 code implementation • ICLR 2022 • Cat P. Le, Juncheng Dong, Mohammadreza Soltani, Vahid Tarokh

We propose an asymmetric affinity score that represents the complexity of using the knowledge of one task to learn another.

Few-Shot Learning

A Methodology for Exploring Deep Convolutional Features in Relation to Hand-Crafted Features with an Application to Music Audio Modeling

1 code implementation • 31 May 2021 • Anna K. Yanchenko, Mohammadreza Soltani, Robert J. Ravier, Sayan Mukherjee, Vahid Tarokh

In this work, we instead take the perspective of relating deep features to well-studied, hand-crafted features that are meaningful for the application of interest.

Feature Importance

Fisher Task Distance and Its Application in Neural Architecture Search

1 code implementation • 23 Mar 2021 • Cat P. Le, Mohammadreza Soltani, Juncheng Dong, Vahid Tarokh

Next, we construct an online neural architecture search framework using the Fisher task distance, in which we have access to the past learned tasks.

Neural Architecture Search · Transfer Learning
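To give the flavor of a Fisher-based task distance (the sketch below is a simplified stand-in, not the paper's definition): estimate the diagonal empirical Fisher information of one shared probe model on each task's data, then compare the diagonals with a Fréchet-style distance. The probe weights, labeling rules, and data here are all invented toy choices.

```python
import numpy as np

def diag_fisher_logistic(X, y, w):
    """Diagonal empirical Fisher information of a logistic model with
    weights w, estimated on one task's data (X, y)."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    grads = (p - y)[:, None] * X        # per-sample gradient of the NLL
    return (grads ** 2).mean(axis=0)    # average squared gradient per weight

def fisher_task_distance(F_a, F_b):
    """Frechet-style distance between two diagonal Fisher matrices."""
    return 0.5 * np.sqrt(((np.sqrt(F_a) - np.sqrt(F_b)) ** 2).sum())

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
# Probe weights roughly aligned with task A's rule (hard-coded for brevity;
# in practice they would come from a trained probe network).
w = np.array([2.0, -2.0, 0.0])

# Tasks A and B share almost the same labeling rule; task C uses another.
y_a = (X @ np.array([1.0, -1.0, 0.0]) > 0).astype(float)
y_b = (X @ np.array([1.0, -1.0, 0.1]) > 0).astype(float)
y_c = (X @ np.array([0.0, 0.0, 1.0]) > 0).astype(float)

F_a = diag_fisher_logistic(X, y_a, w)
F_b = diag_fisher_logistic(X, y_b, w)
F_c = diag_fisher_logistic(X, y_c, w)

d_ab = fisher_task_distance(F_a, F_b)   # similar tasks: small distance
d_ac = fisher_task_distance(F_a, F_c)   # dissimilar tasks: larger distance
```

In a search framework, such distances let a new task borrow architectures from the closest previously learned tasks.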

Improved Automated Machine Learning from Transfer Learning

1 code implementation • 27 Feb 2021 • Cat P. Le, Mohammadreza Soltani, Robert Ravier, Vahid Tarokh

In this paper, we propose a neural architecture search framework based on a similarity measure between some baseline tasks and a target task.

BIG-bench Machine Learning · Neural Architecture Search +1

Model-Free Energy Distance for Pruning DNNs

1 code implementation • 1 Jan 2021 • Mohammadreza Soltani, Suya Wu, Yuerong Li, Jie Ding, Vahid Tarokh

We measure a new model-free notion of information between the feature maps and the output of the network.

Task-Aware Neural Architecture Search

1 code implementation • 27 Oct 2020 • Cat P. Le, Mohammadreza Soltani, Robert Ravier, Vahid Tarokh

The design of handcrafted neural networks requires a lot of time and resources.

Neural Architecture Search

Projected Latent Markov Chain Monte Carlo: Conditional Sampling of Normalizing Flows

no code implementations • ICLR 2021 • Chris Cannella, Mohammadreza Soltani, Vahid Tarokh

We introduce Projected Latent Markov Chain Monte Carlo (PL-MCMC), a technique for sampling from the high-dimensional conditional distributions learned by a normalizing flow.

GeoStat Representations of Time Series for Fast Classification

no code implementations • 13 Jul 2020 • Robert J. Ravier, Mohammadreza Soltani, Miguel Simões, Denis Garagic, Vahid Tarokh

GeoStat representations are based on a generalization of recent methods for trajectory classification: they summarize a time series with comprehensive statistics of (possibly windowed) distributions of easy-to-compute differential-geometric quantities, and require no dynamic time warping.

Classification · Dynamic Time Warping +4
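A minimal sketch in the spirit of that description, not the paper's exact recipe: summarize windowed distributions of discrete velocity and curvature (first and second differences) by a few quantiles, yielding a fixed-length feature vector with no alignment step. The window count and quantile levels are arbitrary illustrative choices.

```python
import numpy as np

def geostat_features(series, n_windows=4, qs=(0.1, 0.5, 0.9)):
    """Fixed-length feature vector for a 1-D time series: per-window
    quantile summaries of simple differential-geometric quantities
    (discrete velocity and curvature), with no dynamic time warping."""
    velocity = np.diff(series)          # first difference ~ tangent speed
    curvature = np.diff(series, n=2)    # second difference ~ curvature
    feats = []
    for signal in (velocity, curvature):
        for window in np.array_split(signal, n_windows):
            feats.extend(np.quantile(window, qs))
    return np.asarray(feats)

rng = np.random.default_rng(2)
t = np.linspace(0, 2 * np.pi, 200)
smooth = np.sin(t)                                  # slowly varying series
rough = np.sin(t) + 0.5 * rng.normal(size=t.size)   # noisy series

f_smooth = geostat_features(smooth)
f_rough = geostat_features(rough)
```

Series of any length map to vectors of the same dimension (here 2 quantities x 4 windows x 3 quantiles = 24), so any off-the-shelf classifier can consume them directly.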

Hyperparameter Optimization in Neural Networks via Structured Sparse Recovery

no code implementations • 7 Jul 2020 • Minsu Cho, Mohammadreza Soltani, Chinmay Hegde

In this paper, we study two important problems in the automated design of neural networks -- Hyper-parameter Optimization (HPO), and Neural Architecture Search (NAS) -- through the lens of sparse recovery methods.

Hyperparameter Optimization · Neural Architecture Search

Perception-Distortion Trade-off with Restricted Boltzmann Machines

no code implementations • 21 Oct 2019 • Chris Cannella, Jie Ding, Mohammadreza Soltani, Vahid Tarokh

In this work, we introduce a new procedure for applying Restricted Boltzmann Machines (RBMs) to missing data inference tasks, based on linearization of the effective energy function governing the distribution of observations.

Unsupervised Demixing of Structured Signals from Their Superposition Using GANs

no code implementations • ICLR Workshop DeepGenStruct 2019 • Mohammadreza Soltani, Swayambhoo Jain, Abhinav Sambasivan

In this paper, we consider the observation setting in which the samples from a target distribution are given by the superposition of two structured components, and leverage GANs for learning of the structure of the components.

Fast Low-Rank Matrix Estimation without the Condition Number

no code implementations • 8 Dec 2017 • Mohammadreza Soltani, Chinmay Hegde

In this paper, we provide a novel algorithmic framework that achieves the best of both worlds: asymptotically as fast as factorization methods, while requiring no dependency on the condition number.

Demixing Structured Superposition Signals from Periodic and Aperiodic Nonlinear Observations

no code implementations • 8 Aug 2017 • Mohammadreza Soltani, Chinmay Hegde

We consider the demixing problem of two (or more) structured high-dimensional vectors from a limited number of nonlinear observations where this nonlinearity is due to either a periodic or an aperiodic function.

Fast Algorithms for Learning Latent Variables in Graphical Models

no code implementations • 27 Jun 2017 • Mohammadreza Soltani, Chinmay Hegde

Existing methods for this problem assume that the precision matrix of the observed variables is the superposition of a sparse and a low-rank component.

Improved Algorithms for Matrix Recovery from Rank-One Projections

no code implementations • 21 May 2017 • Mohammadreza Soltani, Chinmay Hegde

We consider the problem of estimation of a low-rank matrix from a limited number of noisy rank-one projections.

Iterative Thresholding for Demixing Structured Superpositions in High Dimensions

no code implementations • 23 Jan 2017 • Mohammadreza Soltani, Chinmay Hegde

Specifically, we show that for certain types of structured superposition models, our method provably recovers the components given merely $n = \mathcal{O}(s)$ samples where $s$ denotes the number of nonzero entries in the underlying components.

Vocal Bursts Intensity Prediction
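The paper's setting recovers sparse components from few, possibly nonlinear, samples; the sketch below illustrates only the core alternating hard-thresholding idea in the fully observed linear case, with an invented toy instance (identity basis for one component, a random orthonormal basis for the other).

```python
import numpy as np

rng = np.random.default_rng(3)
n, s = 128, 3

# Incoherent second basis: a random orthonormal matrix (QR of a Gaussian).
Psi, _ = np.linalg.qr(rng.normal(size=(n, n)))

def sparse_signal(rng, n, s):
    x = np.zeros(n)
    x[rng.choice(n, size=s, replace=False)] = rng.choice([-1.0, 1.0], size=s)
    return x

w_true = sparse_signal(rng, n, s)      # sparse in the identity basis
z_true = sparse_signal(rng, n, s)      # sparse in the Psi basis
y = w_true + Psi @ z_true              # observed superposition

def hard_threshold(x, s):
    out = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-s:]  # keep the s largest-magnitude entries
    out[keep] = x[keep]
    return out

# Alternating hard-thresholded updates: each component is re-estimated
# from the residual left by the other, then projected onto the s-sparse set.
w_est = np.zeros(n)
z_est = np.zeros(n)
for _ in range(50):
    w_est = hard_threshold(y - Psi @ z_est, s)
    z_est = hard_threshold(Psi.T @ (y - w_est), s)
```

Because the two bases are incoherent, each component's energy spreads thinly in the other's basis, so the thresholding step locks onto the correct supports and the iteration contracts to the true pair.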

Stable Recovery Of Sparse Vectors From Random Sinusoidal Feature Maps

no code implementations • 23 Jan 2017 • Mohammadreza Soltani, Chinmay Hegde

Random sinusoidal features are a popular approach for speeding up kernel-based inference in large datasets.

Dimensionality Reduction
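For context on the feature maps this paper works with, here is the standard random-Fourier-feature construction (Rahimi and Rechts style) that the snippet alludes to, sketched with arbitrary toy data and bandwidth: inner products of random sinusoidal features approximate a Gaussian kernel, trading exact kernel evaluation for a finite-dimensional map.

```python
import numpy as np

def random_fourier_features(X, n_feats, gamma, rng):
    """Map rows of X to random sinusoidal features whose inner products
    approximate the Gaussian kernel exp(-gamma * ||x - y||^2)."""
    d = X.shape[1]
    # frequencies drawn from the kernel's Fourier transform, N(0, 2*gamma*I)
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, n_feats))
    b = rng.uniform(0, 2 * np.pi, size=n_feats)   # random phases
    return np.sqrt(2.0 / n_feats) * np.cos(X @ W + b)

rng = np.random.default_rng(4)
X = rng.normal(size=(50, 5))
Z = random_fourier_features(X, n_feats=5000, gamma=0.5, rng=rng)

# Compare the exact kernel matrix with its low-dimensional approximation.
K_exact = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
K_approx = Z @ Z.T
max_err = np.abs(K_exact - K_approx).max()
```

The approximation error shrinks roughly as one over the square root of the number of features; the recovery question studied in the paper is the reverse direction, inferring a sparse input from such sinusoidal measurements.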

Fast Algorithms for Demixing Sparse Signals from Nonlinear Observations

no code implementations • 3 Aug 2016 • Mohammadreza Soltani, Chinmay Hegde

We study the problem of demixing a pair of sparse signals from noisy, nonlinear observations of their superposition.

Astronomy
