no code implementations • NeurIPS 2020 • Aditya Gangrade, Bobak Nazer, Venkatesh Saligrama
We present novel information-theoretic limits on detecting sparse changes in Ising models, a problem that arises in many applications where network changes can occur due to some external stimuli.
no code implementations • NeurIPS 2019 • Aditya Gangrade, Praveen Venkatesh, Bobak Nazer, Venkatesh Saligrama
Overall, for large changes, $s \gg \sqrt{n}$, we need only $\mathrm{SNR}= O(1)$ whereas a na\"ive test based on community recovery with $O(s)$ errors requires $\mathrm{SNR}= \Theta(\log n)$.
no code implementations • 28 Oct 2017 • Aditya Gangrade, Bobak Nazer, Venkatesh Saligrama
We study the trade-off between sample size and the reliability of change detection, measured as a minimax risk, for Ising models and Gaussian Markov random fields whose network structure has $p$ nodes and degree at most $d$, and we obtain information-theoretic lower bounds for reliable change detection over these model classes.
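The abstract above concerns lower bounds rather than a specific algorithm, but the problem setup can be illustrated with a toy sketch. The snippet below (an illustration, not the authors' procedure; the chain-structured precision matrices, sample sizes, and the crude max-norm covariance-difference statistic are all hypothetical choices) draws samples from two Gaussian Markov random fields that differ in a single edge and compares a test statistic against its same-model baseline:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_gmrf(precision, n_samples, rng):
    """Draw zero-mean samples from a Gaussian MRF with the given precision matrix."""
    cov = np.linalg.inv(precision)
    return rng.multivariate_normal(np.zeros(precision.shape[0]), cov, size=n_samples)

p = 10  # number of nodes (hypothetical small example)
# Baseline precision: identity plus a weak chain, so each node has degree at most 2.
theta0 = np.eye(p)
for i in range(p - 1):
    theta0[i, i + 1] = theta0[i + 1, i] = 0.2

# Changed model: the weight of one edge is flipped in sign.
theta1 = theta0.copy()
theta1[0, 1] = theta1[1, 0] = -0.2

x = sample_gmrf(theta0, 2000, rng)  # samples before the change
y = sample_gmrf(theta1, 2000, rng)  # samples after the change

# Crude test statistic: entrywise max-norm of the sample-covariance difference.
stat = np.max(np.abs(np.cov(x.T) - np.cov(y.T)))
# Same-model baseline: split the pre-change samples and compare the halves.
null = np.max(np.abs(np.cov(x[:1000].T) - np.cov(x[1000:].T)))
print(stat, null)
```

With this sign flip the correlation between nodes 0 and 1 reverses, so the cross-model statistic clearly exceeds the same-model baseline; the information-theoretic question studied in the paper is how small the change and the sample sizes can be before any such test must fail.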