no code implementations • 31 May 2023 • Subhroshekhar Ghosh, Aaron Y. R. Low, Yong Sheng Soh, Zhuohang Feng, Brendan K. Y. Tan
We apply our paradigm to investigate the dictionary learning problem for the groups SO(2) and SO(3).
1 code implementation • ICLR 2022 • Zhaoqiang Liu, Jiulong Liu, Subhroshekhar Ghosh, Jun Han, Jonathan Scarlett
We perform experiments on various image datasets for spiked matrix and phase retrieval models, and illustrate the performance gains of our method over the classic power method and over the truncated power method devised for sparse principal component analysis.
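As context for the baseline mentioned above, here is a minimal sketch of the classic power method applied to a spiked matrix model; the matrix sizes, signal strength, and normalization are illustrative choices, not taken from the paper.

```python
import numpy as np

def power_method(M, num_iters=200, seed=0):
    """Classic power iteration for the leading eigenvector of a symmetric matrix."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(M.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(num_iters):
        v = M @ v                  # multiply by the matrix
        v /= np.linalg.norm(v)     # renormalize to unit length
    return v

# Spiked matrix model: rank-one signal plus symmetric Gaussian noise.
n, lam = 200, 5.0
rng = np.random.default_rng(1)
x = rng.standard_normal(n)
x /= np.linalg.norm(x)                         # planted unit-norm spike
W = rng.standard_normal((n, n))
W = (W + W.T) / np.sqrt(2 * n)                 # symmetric noise, O(1) spectral norm
M = lam * np.outer(x, x) + W

v = power_method(M)
print(abs(v @ x))  # overlap with the planted spike (close to 1 here)
```

With a signal strength well above the noise level, the iterate aligns closely with the planted direction; sparse variants such as the truncated power method additionally threshold the iterate at each step.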
no code implementations • 20 Jan 2022 • Subhroshekhar Ghosh, Soumendu Sundar Mukherjee
Group or cluster structure on explanatory variables in machine learning problems is a very general phenomenon, which has attracted broad interest from practitioners and theoreticians alike.
no code implementations • 29 Sep 2021 • Aaron Yi Rui Low, Subhroshekhar Ghosh, Yong Sheng Soh
Thus, a naturally significant class of functions consists of those that are intrinsic to the problem, in the sense of being independent of such base change or relabelling; in other words, invariant under the conjugation action of a group.
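A tiny numerical illustration of this invariance: the trace (and more generally the spectrum) of a matrix is unchanged under a change of basis M → P M P⁻¹, so it is "intrinsic" in the sense above. The dimensions here are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
P = rng.standard_normal((4, 4))            # generically invertible
M_conj = P @ M @ np.linalg.inv(P)          # conjugation action: change of basis

# The trace is invariant under conjugation, up to floating-point error.
print(np.trace(M), np.trace(M_conj))
```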
no code implementations • 8 Aug 2021 • Zhaoqiang Liu, Subhroshekhar Ghosh, Jun Han, Jonathan Scarlett
In 1-bit compressive sensing, each measurement is quantized to a single bit, namely the sign of a linear function of an unknown vector, and the goal is to accurately recover the vector.
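The measurement model described above can be sketched in a few lines; the dimensions, the Gaussian sensing matrix, and the simple linear estimator below are illustrative assumptions, not the paper's recovery algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 50, 2000                    # signal dimension, number of 1-bit measurements
x = rng.standard_normal(n)
x /= np.linalg.norm(x)             # sign measurements lose the scale of x
A = rng.standard_normal((m, n))
y = np.sign(A @ x)                 # each measurement is a single bit

# A simple (non-optimal) linear estimator: sign-weighted average of the
# sensing vectors, then normalized to the unit sphere.
x_hat = A.T @ y / m
x_hat /= np.linalg.norm(x_hat)
print(x_hat @ x)  # correlation with the true direction
```

For Gaussian sensing vectors this averaging recovers the direction of x up to an error that shrinks as the number of measurements grows.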
1 code implementation • NeurIPS 2021 • Zhaoqiang Liu, Subhroshekhar Ghosh, Jonathan Scarlett
We also adapt this result to sparse phase retrieval, and show that $O(s \log n)$ samples are sufficient for a similar guarantee when the underlying signal is $s$-sparse and $n$-dimensional, matching an information-theoretic lower bound.
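The sparse phase retrieval setup referenced above — magnitude-only measurements of an s-sparse, n-dimensional signal, with sample size on the order of s log n — can be sketched as follows; the constant in front of s log n and the Gaussian measurement ensemble are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, s = 1000, 5
m = int(10 * s * np.log(n))                  # O(s log n) measurements

# Build an s-sparse unit-norm signal.
x = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x[support] = rng.standard_normal(s)
x /= np.linalg.norm(x)

A = rng.standard_normal((m, n))
y = np.abs(A @ x)                            # phaseless (magnitude-only) measurements
print(m, y.shape)
```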
no code implementations • 6 May 2021 • Subhroshekhar Ghosh, Meixia Lin, Dongfang Sun
In this work, we investigate spectrogram analysis via an examination of the stochastic geometric properties of their level sets.
no code implementations • 12 Nov 2020 • Sanjay Chaudhuri, Subhroshekhar Ghosh, David J. Nott, Kim Cuc Pham
The expected log-likelihood is then estimated by an empirical likelihood where the only inputs required are a choice of summary statistic, its observed value, and the ability to simulate the chosen summary statistic for any parameter value under the model.
no code implementations • ICML 2020 • Subhroshekhar Ghosh, Krishnakumar Balasubramanian, Xiaochuan Yang
We propose a novel stochastic network model, called Fractal Gaussian Network (FGN), that embodies well-defined and analytically tractable fractal structures.
no code implementations • 8 Jul 2020 • Rémi Bardenet, Subhroshekhar Ghosh
Our approach is scalable and applies to very general DPPs, beyond traditional symmetric kernels.
no code implementations • 17 Feb 2020 • Subhroshekhar Ghosh, Kumarjit Saha
Stochastic networks based on random point sets as nodes have attracted considerable interest in many applications, particularly in communication networks, including wireless sensor networks and peer-to-peer networks.