Search Results for author: Soumyabrata Pal

Found 24 papers, 1 paper with code

Fuzzy Clustering with Similarity Queries

1 code implementation NeurIPS 2021 Wasim Huleihel, Arya Mazumdar, Soumyabrata Pal

In particular, we provide algorithms for fuzzy clustering in this setting that ask $O(\mathsf{poly}(k)\log n)$ similarity queries and run in polynomial time, where $n$ is the number of items.

Clustering
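The query-efficient flavor of this result can be illustrated with a toy sketch (not the paper's algorithm): an assumed pairwise similarity oracle is probed against a small random anchor set, and items are grouped by their response patterns, so the query count scales with the number of anchors rather than with all $O(n^2)$ pairs. All names and parameter choices below are illustrative assumptions.

```python
import random

def similarity_oracle(u, v, memberships):
    # Toy oracle: answers 1 when the two items' membership sets
    # overlap; hard sets stand in for soft (fuzzy) memberships here.
    return int(bool(memberships[u] & memberships[v]))

def cluster_by_anchors(n, oracle, num_anchors):
    """Group items by their similarity pattern against a small random
    anchor set: n * num_anchors queries instead of all O(n^2) pairs."""
    anchors = random.sample(range(n), num_anchors)
    groups = {}
    for item in range(n):
        pattern = tuple(oracle(item, a) for a in anchors)
        groups.setdefault(pattern, []).append(item)
    return list(groups.values())

random.seed(0)
# 12 items in 3 overlapping groups; every 4th item joins a second group.
truth = {i: {i % 3} | ({(i + 1) % 3} if i % 4 == 0 else set()) for i in range(12)}
print(cluster_by_anchors(12, lambda u, v: similarity_oracle(u, v, truth), 9))
```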

Connectivity in Random Annulus Graphs and the Geometric Block Model

no code implementations 12 Apr 2018 Sainyam Galhotra, Arya Mazumdar, Soumyabrata Pal, Barna Saha

Our next contribution is in using the connectivity of random annulus graphs to provide necessary and sufficient conditions for efficient recovery of communities for {\em the geometric block model} (GBM).

Community Detection Stochastic Block Model

The Geometric Block Model

no code implementations 16 Sep 2017 Sainyam Galhotra, Arya Mazumdar, Soumyabrata Pal, Barna Saha

To capture the inherent geometric features of many community detection problems, we propose to use a new random graph model of communities that we call a Geometric Block Model.

Community Detection Stochastic Block Model
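To make the model concrete, here is a minimal sampler, assuming the common formulation in which each vertex receives a community label and a uniformly random latent point on the unit circle, and an edge appears when the angular distance falls below a community-dependent threshold ($r_{in}$ within a community, $r_{out}$ across); all parameter values are illustrative.

```python
import numpy as np

def sample_gbm(n, r_in, r_out, seed=0):
    """Sample a two-community Geometric Block Model on n vertices:
    each vertex gets a label in {0, 1} and a uniform angle on the unit
    circle; u, v are joined when their angular distance is at most
    r_in (same community) or r_out (different communities)."""
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, 2, size=n)
    theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
    edges = []
    for u in range(n):
        for v in range(u + 1, n):
            gap = abs(theta[u] - theta[v])
            dist = min(gap, 2.0 * np.pi - gap)   # circular distance
            if dist <= (r_in if labels[u] == labels[v] else r_out):
                edges.append((u, v))
    return labels, theta, edges

labels, theta, edges = sample_gbm(n=200, r_in=0.3, r_out=0.1)
print(len(edges), "edges among 200 vertices")
```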

High Dimensional Discrete Integration over the Hypergrid

no code implementations 29 Jun 2018 Raj Kumar Maity, Arya Mazumdar, Soumyabrata Pal

Recently, Ermon et al. (2013) pioneered a way to practically compute approximations to large-scale counting or discrete integration problems by using random hashes.

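The flavor of the hash-based approach can be sketched as follows, with brute force standing in for the optimization oracle that Ermon et al. invoke: random XOR (parity) constraints each halve the feasible set in expectation, so the largest number of constraints that solutions typically survive estimates the log of the count. This toy version is an assumption-laden illustration, not the paper's algorithm.

```python
import itertools, random

def log2_count_via_parity_hashes(predicate, n, trials=20, seed=0):
    """Estimate log2 |{x in {0,1}^n : predicate(x)}| by adding random
    XOR constraints; brute force replaces the optimization oracle."""
    random.seed(seed)
    sols = [x for x in itertools.product((0, 1), repeat=n) if predicate(x)]

    def survives(i):
        # Draw i random parity constraints <a, x> = b (mod 2); does
        # any solution satisfy all of them?
        cons = [([random.randrange(2) for _ in range(n)], random.randrange(2))
                for _ in range(i)]
        return any(all(sum(a * z for a, z in zip(av, x)) % 2 == b
                       for av, b in cons) for x in sols)

    est = 0
    for i in range(1, n + 1):
        # Each constraint halves the solution set in expectation, so
        # survival stops being typical once i exceeds log2 of the count.
        if sum(survives(i) for _ in range(trials)) >= trials / 2:
            est = i
    return est

# Toy check: vectors with at most two ones -- 37 of them, log2(37) ~ 5.2.
print(log2_count_via_parity_hashes(lambda x: sum(x) <= 2, n=8))
```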

Semisupervised Clustering, AND-Queries and Locally Encodable Source Coding

no code implementations NeurIPS 2017 Arya Mazumdar, Soumyabrata Pal

In this paper, we show that a recently popular model of semisupervised clustering is equivalent to locally encodable source coding.

Clustering Data Compression
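A minimal sketch of the AND-query setting for two clusters (binary labels), assuming noiseless answers and at least two items with label 1: a single answered 1 certifies a label-1 anchor, after which one query per item reads off its label. The oracle and instance below are toy assumptions.

```python
import itertools

def recover_labels_with_and_queries(n, and_query):
    """Recover binary labels from noiseless AND queries, assuming at
    least two items carry label 1 (a lone 1 is undetectable by ANDs)."""
    anchor = None
    for u, v in itertools.combinations(range(n), 2):
        if and_query(u, v):      # an answer of 1 certifies both labels are 1
            anchor = u
            break
    if anchor is None:
        return [0] * n           # no certified 1 found
    # Against a known label-1 anchor, AND(anchor, u) equals u's label.
    return [1 if u == anchor else and_query(anchor, u) for u in range(n)]

truth = [0, 1, 1, 0, 1, 0, 0, 1]
oracle = lambda u, v: truth[u] & truth[v]
assert recover_labels_with_and_queries(len(truth), oracle) == truth
print("labels recovered:", truth)
```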

Semisupervised Clustering by Queries and Locally Encodable Source Coding

no code implementations 31 Mar 2019 Arya Mazumdar, Soumyabrata Pal

In this paper, we show that a recently popular model of semi-supervised clustering is equivalent to locally encodable source coding.

Clustering Data Compression

Same-Cluster Querying for Overlapping Clusters

no code implementations NeurIPS 2019 Wasim Huleihel, Arya Mazumdar, Muriel Médard, Soumyabrata Pal

In this paper, we look at the more practical scenario of overlapping clusters, and provide upper bounds (with algorithms) on the sufficient number of queries.

Sample Complexity of Learning Mixtures of Sparse Linear Regressions

no code implementations 30 Oct 2019 Akshay Krishnamurthy, Arya Mazumdar, Andrew McGregor, Soumyabrata Pal

In the problem of learning mixtures of linear regressions, the goal is to learn a collection of signal vectors from a sequence of (possibly noisy) linear measurements, where each measurement is evaluated on an unknown signal drawn uniformly from this collection.

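A small data-model sketch may help fix the setting: each measurement pairs a fresh Gaussian query vector with a signal drawn uniformly from a hidden collection of sparse vectors, plus optional noise. Everything below (dimensions, sparsity, noise level) is an illustrative assumption.

```python
import numpy as np

def mixture_measurements(signals, m, noise_std=0.1, seed=0):
    """Each of m measurements pairs a Gaussian query vector with a
    signal drawn uniformly at random from the hidden collection."""
    rng = np.random.default_rng(seed)
    L, n = signals.shape
    X = rng.standard_normal((m, n))        # query vectors
    which = rng.integers(0, L, size=m)     # hidden index per measurement
    y = np.einsum("ij,ij->i", X, signals[which])
    return X, y + noise_std * rng.standard_normal(m)

# Hidden collection: two 3-sparse signals in dimension 20.
rng = np.random.default_rng(1)
signals = np.zeros((2, 20))
for row in signals:
    row[rng.choice(20, size=3, replace=False)] = rng.standard_normal(3)
X, y = mixture_measurements(signals, m=500)
print(X.shape, y.shape)
```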

Sample Complexity of Learning Mixture of Sparse Linear Regressions

no code implementations NeurIPS 2019 Akshay Krishnamurthy, Arya Mazumdar, Andrew McGregor, Soumyabrata Pal

Our techniques are quite different from those in the previous work: for the noiseless case, we rely on a property of sparse polynomials, and for the noisy case, we provide new connections to learning Gaussian mixtures and use ideas from the theory of error-correcting codes.


Algebraic and Analytic Approaches for Parameter Learning in Mixture Models

no code implementations 19 Jan 2020 Akshay Krishnamurthy, Arya Mazumdar, Andrew McGregor, Soumyabrata Pal

Our second approach uses algebraic and combinatorial tools and applies to binomial mixtures with shared trial parameter $N$ and differing success parameters, as well as to mixtures of geometric distributions.

Recovery of Sparse Signals from a Mixture of Linear Samples

no code implementations ICML 2020 Arya Mazumdar, Soumyabrata Pal

Mixture of linear regressions is a popular learning theoretic model that is used widely to represent heterogeneous data.

Experimental Design

Recovery of sparse linear classifiers from mixture of responses

no code implementations NeurIPS 2020 Venkata Gandikota, Arya Mazumdar, Soumyabrata Pal

We study the hitherto unexplored problem of upper bounds on the query complexity of recovering all the hyperplanes, especially in the case where the hyperplanes are sparse.

Quantization

Learning User Preferences in Non-Stationary Environments

no code implementations 29 Jan 2021 Wasim Huleihel, Soumyabrata Pal, Ofer Shayevitz

One of the most surprising observations in our experiments is that our algorithm outperforms static algorithms even when preferences do not change over time.

Collaborative Filtering Recommendation Systems

Support Recovery of Sparse Signals from a Mixture of Linear Measurements

no code implementations NeurIPS 2021 Venkata Gandikota, Arya Mazumdar, Soumyabrata Pal

In this work, we study the number of measurements sufficient for recovering the supports of all the component vectors in a mixture in both these models.

Support Recovery in Universal One-bit Compressed Sensing

no code implementations 19 Jul 2021 Arya Mazumdar, Soumyabrata Pal

With universality, it is known that $\tilde{\Theta}(k^2)$ 1bCS measurements are necessary and sufficient for support recovery (where $k$ denotes the sparsity).

Quantization
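For intuition (and distinct from the universal schemes the paper studies), here is the classical correlation-based support estimator for 1bCS with a Gaussian measurement matrix: for Gaussian rows, $E[\mathrm{sign}(\langle a, x\rangle)\, a]$ is proportional to $x/\|x\|$, so the largest entries of $|A^\top y|$ point to the support. The instance parameters are assumptions for the demo.

```python
import numpy as np

def one_bit_support_estimate(A, y, k):
    """Correlation estimator for 1-bit CS with Gaussian A: the top-k
    coordinates of |A^T y| / m estimate the support of x."""
    scores = np.abs(A.T @ y) / len(y)
    return set(np.argsort(scores)[-k:].tolist())

rng = np.random.default_rng(0)
n, k, m = 100, 5, 2000
support = rng.choice(n, size=k, replace=False)
x = np.zeros(n)
# Nonzeros bounded away from zero so the support is identifiable.
x[support] = rng.uniform(1.0, 2.0, size=k) * rng.choice([-1, 1], size=k)
A = rng.standard_normal((m, n))
y = np.sign(A @ x)                         # one-bit measurements
print(one_bit_support_estimate(A, y, k) == set(support.tolist()))
```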

Lower Bounds on the Total Variation Distance Between Mixtures of Two Gaussians

no code implementations 2 Sep 2021 Sami Davies, Arya Mazumdar, Soumyabrata Pal, Cyrus Rashtchian

Mixtures of high dimensional Gaussian distributions have been studied extensively in statistics and learning theory.

Learning Theory
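Total variation between two such mixtures has no closed form, but in one dimension it is easy to evaluate numerically; the following sketch integrates $|p - q|/2$ on a grid for two-component mixtures (all parameters below are illustrative).

```python
import numpy as np

def gauss_pdf(x, mu, s):
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

def mixture_pdf(x, params):
    """params: iterable of (weight, mean, std) triples."""
    return sum(w * gauss_pdf(x, mu, s) for w, mu, s in params)

def tv_distance(p, q, lo=-20.0, hi=20.0, grid=200001):
    """TV(p, q) = (1/2) * integral of |p - q|, via a Riemann sum."""
    x = np.linspace(lo, hi, grid)
    dx = x[1] - x[0]
    return 0.5 * float(np.sum(np.abs(mixture_pdf(x, p) - mixture_pdf(x, q))) * dx)

p = [(0.5, -1.0, 1.0), (0.5, 1.0, 1.0)]
q = [(0.5, -1.1, 1.0), (0.5, 1.1, 1.0)]   # components shifted by 0.1
print(round(tv_distance(p, q), 4))
```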

Random Subgraph Detection Using Queries

no code implementations 2 Oct 2021 Wasim Huleihel, Arya Mazumdar, Soumyabrata Pal

Specifically, we show that any (possibly randomized) algorithm must make $\mathsf{Q} = \Omega(\frac{n^2}{k^2\chi^4(p||q)}\log^2n)$ adaptive queries (on expectation) to the adjacency matrix of the graph to detect the planted subgraph with probability more than $1/2$, where $\chi^2(p||q)$ is the Chi-Square distance.

Support Recovery in Mixture Models with Sparse Parameters

no code implementations 24 Feb 2022 Arya Mazumdar, Soumyabrata Pal

Sparsity of parameter vectors is a natural constraint in a variety of settings, and support recovery is a major step towards parameter estimation.

On Learning Mixture of Linear Regressions in the Non-Realizable Setting

no code implementations 26 May 2022 Avishek Ghosh, Arya Mazumdar, Soumyabrata Pal, Rajat Sen

In this paper, we show that a version of the popular alternating minimization (AM) algorithm finds the best-fit lines in a dataset even when a realizable model is not assumed, under some regularity conditions on the dataset and the initial points, thereby providing a solution for the ERM.
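A minimal sketch of the AM iteration described here, assuming scalar responses and random initialization (the paper's guarantees require regularity conditions and suitable initial points, which this toy demo does not enforce):

```python
import numpy as np

def alternating_minimization(X, y, k, iters=50, seed=0):
    """Find k best-fit lines: assign each point to its current closest
    line, then refit each line by least squares on its assigned points."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.standard_normal((k, d))          # random init; the paper's
                                             # guarantees need good initial points
    assign = np.zeros(n, dtype=int)
    for _ in range(iters):
        resid = (X @ W.T - y[:, None]) ** 2  # (n, k) squared residuals
        assign = resid.argmin(axis=1)
        for j in range(k):
            idx = assign == j
            if idx.any():
                W[j] = np.linalg.lstsq(X[idx], y[idx], rcond=None)[0]
    return W, assign

# Toy data: two noisy lines in dimension 2, mixed uniformly.
rng = np.random.default_rng(1)
X = rng.standard_normal((400, 2))
true_W = np.array([[2.0, -1.0], [-1.0, 3.0]])
labels = rng.integers(0, 2, size=400)
y = np.einsum("ij,ij->i", X, true_W[labels]) + 0.1 * rng.standard_normal(400)
W, _ = alternating_minimization(X, y, k=2)
print(np.round(W, 2))                        # rows approximate the two lines
```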

Community Recovery in the Geometric Block Model

no code implementations 22 Jun 2022 Sainyam Galhotra, Arya Mazumdar, Soumyabrata Pal, Barna Saha

We show that a simple triangle-counting algorithm to detect communities in the geometric block model is near-optimal.

Community Detection Stochastic Block Model
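The idea behind the algorithm can be sketched as follows: in a GBM, an intra-community edge lies in many more triangles than an inter-community one, so thresholding the common-neighbor count of each edge and taking connected components of the retained edges recovers the communities. The threshold below is left as a tunable assumption (the paper derives it from the model's radii).

```python
from collections import defaultdict

def communities_by_triangle_count(n, edges, threshold):
    """Keep an edge only when its endpoints share at least `threshold`
    common neighbors (i.e., the edge lies in many triangles), then
    return connected components of the retained edges."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    kept = [(u, v) for u, v in edges if len(adj[u] & adj[v]) >= threshold]

    parent = list(range(n))                  # union-find over kept edges
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for u, v in kept:
        parent[find(u)] = find(v)

    groups = defaultdict(list)
    for v in range(n):
        groups[find(v)].append(v)
    return list(groups.values())
```

Fed the edge list of a GBM sample (for instance from the sampler sketched under "The Geometric Block Model" above), intra-community edges survive the threshold far more often than inter-community ones; choosing the threshold from $r_{in}$, $r_{out}$, and $n$ is where the actual analysis lives.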

Online Low Rank Matrix Completion

no code implementations 8 Sep 2022 Prateek Jain, Soumyabrata Pal

In each round, the algorithm recommends one item per user, for which it gets a (noisy) reward sampled from a low-rank user-item preference matrix.

Clustering Collaborative Filtering +1
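As a toy rendering of this online setting (not the paper's method), an explore-then-commit sketch: recommend random items for a while, estimate the preference matrix by a truncated SVD of the observed averages (zero-filling unobserved entries, a crude stand-in for proper matrix completion), then commit each user to the estimated best item. All parameters are assumptions.

```python
import numpy as np

def explore_then_commit(P, rank, explore_rounds, noise_std=0.5, seed=0):
    """Explore with random recommendations, estimate the low-rank
    preference matrix from observed averages, then commit per user."""
    rng = np.random.default_rng(seed)
    M, N = P.shape
    sums, counts = np.zeros((M, N)), np.zeros((M, N))
    for _ in range(explore_rounds):
        items = rng.integers(0, N, size=M)          # one item per user
        rewards = P[np.arange(M), items] + noise_std * rng.standard_normal(M)
        sums[np.arange(M), items] += rewards
        counts[np.arange(M), items] += 1
    est = np.divide(sums, counts, out=np.zeros_like(sums), where=counts > 0)
    U, s, Vt = np.linalg.svd(est, full_matrices=False)
    est_lr = (U[:, :rank] * s[:rank]) @ Vt[:rank]   # rank-r projection
    return est_lr.argmax(axis=1)                    # committed item per user

rng = np.random.default_rng(3)
P = rng.standard_normal((30, 1)) @ rng.standard_normal((1, 40))  # rank-1 truth
picks = explore_then_commit(P, rank=1, explore_rounds=200)
print((picks == P.argmax(axis=1)).mean())  # fraction of users matched
```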

Sample-Efficient Personalization: Modeling User Parameters as Low Rank Plus Sparse Components

no code implementations 7 Oct 2022 Soumyabrata Pal, Prateek Varshney, Prateek Jain, Abhradeep Guha Thakurta, Gagan Madan, Gaurav Aggarwal, Pradeep Shenoy, Gaurav Srivastava

We then study the framework in the linear setting, where the problem reduces to that of estimating the sum of a rank-$r$ and a $k$-column sparse matrix using a small number of linear measurements.

Meta-Learning Recommendation Systems

Improved Support Recovery in Universal One-bit Compressed Sensing

no code implementations 29 Oct 2022 Namiko Matsumoto, Arya Mazumdar, Soumyabrata Pal

A {\em universal} measurement matrix for 1bCS refers to one set of measurements that work for all sparse signals.

Optimal Algorithms for Latent Bandits with Cluster Structure

no code implementations 17 Jan 2023 Soumyabrata Pal, Arun Sai Suggala, Karthikeyan Shanmugam, Prateek Jain

Instead, we propose LATTICE (Latent bAndiTs via maTrIx ComplEtion) which allows exploitation of the latent cluster structure to provide the minimax optimal regret of $\widetilde{O}(\sqrt{(\mathsf{M}+\mathsf{N})\mathsf{T}})$, when the number of clusters is $\widetilde{O}(1)$.

Matrix Completion Recommendation Systems
