no code implementations • 12 Apr 2021 • H. Narayanan, Hariharan Narayanan
This version of the theorem states that "stationarity" (the zero-derivative condition) of power transfer occurs when the multiport is terminated by its adjoint, provided the resulting network has a solution.
no code implementations • 28 Jan 2021 • Hariharan Narayanan, Rikhav Shah, Nikhil Srivastava
We prove upper bounds on the graph diameters of polytopes in two settings.
Combinatorics • Discrete Mathematics • Functional Analysis • Optimization and Control • Probability
no code implementations • 13 Apr 2020 • Somnath Chakraborty, Hariharan Narayanan
Suppose that we are given independent, identically distributed samples $x_l$ from a mixture $\mu$ of at most $k$ spherical Gaussian distributions $\mu_i$ in $\mathbb{R}^d$ with variance $1$, such that the minimum $\ell_2$ distance between two distinct centers $y_i$ and $y_j$ is greater than $\sqrt{d} \Delta$ for some $c \leq \Delta$, where $c \in (0, 1)$ is a small positive universal constant.
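As a hedged illustration of why this separation scale matters (this is not the paper's algorithm; all names and parameters below are made up for the sketch): when centers are separated by much more than $\sqrt{d}$, even naive nearest-center assignment recovers the components, since each sample lies within roughly $\sqrt{d}$ of its own center.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, n = 4, 3, 300   # dimension, number of components, samples per component

# Centers with pairwise distance far above sqrt(d) (illustrative values).
centers = np.array([
    [0.0] * d,
    [20.0] + [0.0] * (d - 1),
    [0.0, 20.0] + [0.0] * (d - 2),
])

# Unit-variance spherical Gaussian samples around each center.
samples = np.vstack([c + rng.standard_normal((n, d)) for c in centers])
labels = np.repeat(np.arange(k), n)

# Assign each sample to its nearest center (Euclidean distance).
dists = np.linalg.norm(samples[:, None, :] - centers[None, :, :], axis=2)
assigned = dists.argmin(axis=1)
accuracy = (assigned == labels).mean()
```

With this much separation a sample would have to deviate by about ten standard deviations toward another center to be misassigned, so the recovered labels match exactly.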
no code implementations • 29 Mar 2018 • Adam Gustafson, Matthew Hirn, Kitty Mohammed, Hariharan Narayanan, Jason Xu
Recently, the following smooth function approximation problem was proposed: given a finite set $E \subset \mathbb{R}^d$ and a function $f: E \rightarrow \mathbb{R}$, interpolate the given information with a function $\widehat{f} \in \dot{C}^{1, 1}(\mathbb{R}^d)$ (the class of first-order differentiable functions with Lipschitz gradients) such that $\widehat{f}(a) = f(a)$ for all $a \in E$, and the value of $\mathrm{Lip}(\nabla \widehat{f})$ is minimal.
no code implementations • 6 Mar 2018 • Adam Gustafson, Hariharan Narayanan
We present an affine-invariant random walk for drawing uniform random samples from a convex body $\mathcal{K} \subset \mathbb{R}^n$ that uses maximum volume inscribed ellipsoids, known as John's ellipsoids, for the proposal distribution.
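The paper's walk requires computing John ellipsoids, which is nontrivial in general; as a much simpler (and not affine-invariant) cousin under the same rejection-sampling pattern, here is a minimal ball-walk sketch for uniform sampling from a convex body, with the cube standing in for a general membership oracle. Everything below is an illustrative assumption, not the paper's method.

```python
import numpy as np

def ball_walk(inside, x0, radius=0.1, steps=2000, seed=0):
    """Ball walk for (approximately) uniform sampling from a convex body:
    propose x + radius*u with u uniform in the unit ball, and reject
    proposals that leave the body (the walk stays in place instead)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    out = []
    for _ in range(steps):
        u = rng.standard_normal(x.shape)
        # Normalize and rescale to get a point uniform in the unit ball.
        u *= rng.random() ** (1.0 / x.size) / np.linalg.norm(u)
        y = x + radius * u
        if inside(y):
            x = y
        out.append(x.copy())
    return np.array(out)

# Example body: the cube [-1, 1]^2, given only by a membership oracle.
cube = lambda p: np.all(np.abs(p) <= 1.0)
chain = ball_walk(cube, x0=[0.0, 0.0])
```

Replacing the fixed ball by a position-dependent inscribed ellipsoid is what buys affine invariance in the paper's construction.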
no code implementations • 11 Sep 2017 • Kitty Mohammed, Hariharan Narayanan
Ideally, the estimate $\mathcal{M}_\mathrm{put}$ of $\mathcal{M}$ should be an actual manifold of a certain smoothness; furthermore, $\mathcal{M}_\mathrm{put}$ should be arbitrarily close to $\mathcal{M}$ in Hausdorff distance given a large enough sample.
no code implementations • 28 Jan 2015 • Alexandre Belloni, Tengyuan Liang, Hariharan Narayanan, Alexander Rakhlin
We consider the problem of optimizing an approximately convex function over a bounded convex set in $\mathbb{R}^n$ using only function evaluations.
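To make the access model concrete (function evaluations only, no gradients), here is a naive zeroth-order random search on a convex function plus a small bounded perturbation. This is only a sketch of the problem setting, not the authors' algorithm, and the test function is an assumption chosen for illustration.

```python
import numpy as np

def zeroth_order_search(f, x0, step=0.2, iters=3000, seed=1):
    """Naive zeroth-order search: propose a random perturbation and keep
    it whenever the function value improves. Only evaluations of f are
    used, matching the oracle model in the abstract."""
    rng = np.random.default_rng(seed)
    x, fx = np.asarray(x0, dtype=float), f(x0)
    for _ in range(iters):
        y = x + step * rng.standard_normal(x.shape)
        # Stay inside the bounded set (here: the unit ball).
        if np.linalg.norm(y) <= 1.0 and f(y) < fx:
            x, fx = y, f(y)
    return x, fx

# Approximately convex: ||x||^2 plus a small bounded oscillation.
f = lambda x: float(np.dot(x, x) + 0.01 * np.sin(40.0 * np.sum(x)))
x_best, f_best = zeroth_order_search(f, np.array([0.8, -0.5]))
```

The perturbation creates shallow local minima, which is exactly the regime where naive descent can stall and the paper's guarantees become interesting.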
no code implementations • 11 Feb 2014 • Tengyuan Liang, Hariharan Narayanan, Alexander Rakhlin
The method is based on a random walk (the \emph{Ball Walk}) on the epigraph of the function.
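A minimal sketch of the epigraph idea, with assumed toy choices throughout (a 1-D quadratic, a truncation height, a fixed step size): run a ball walk on the region $\{(x, t) : f(x) \leq t\}$, and observe that samples with small $t$ localize near minimizers of $f$.

```python
import numpy as np

f = lambda x: (x - 0.3) ** 2          # function to minimize on [-1, 1]

def epigraph_walk(steps=5000, radius=0.2, t_max=2.0, seed=0):
    """Ball walk on the truncated epigraph {(x, t): f(x) <= t <= t_max}.
    Uniform samples with small t concentrate near minimizers of f."""
    rng = np.random.default_rng(seed)
    x, t = 0.0, 1.0                   # start strictly inside the epigraph
    pts = []
    for _ in range(steps):
        dx, dt = radius * rng.standard_normal(2)
        xn, tn = x + dx, t + dt
        if -1.0 <= xn <= 1.0 and f(xn) <= tn <= t_max:
            x, t = xn, tn
        pts.append((x, t))
    return np.array(pts)

pts = epigraph_walk()
```

Every accepted point lies on or above the graph of $f$ by construction, which is the invariant the walk exploits.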
no code implementations • 23 Sep 2013 • Hariharan Narayanan, Alexander Rakhlin
Within the context of exponential families, the proposed method produces samples from a posterior distribution which is updated as data arrive in a streaming fashion.
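The paper proposes a sampling method for this setting; as a simpler, purely illustrative instance of streaming posterior updating in an exponential family, here is the conjugate Beta-Bernoulli case, where each arriving observation updates the sufficient statistics in constant time and the posterior can then be sampled directly.

```python
import numpy as np

# Streaming posterior for a Bernoulli parameter with a Beta prior:
# the exponential-family structure makes each update a constant-time
# adjustment of the sufficient statistics (alpha, beta).
alpha, beta = 1.0, 1.0                 # Beta(1, 1) = uniform prior

stream = [1, 0, 1, 1, 0, 1, 1, 1]      # data arriving one at a time
for x in stream:
    alpha += x
    beta += 1 - x

posterior_mean = alpha / (alpha + beta)
rng = np.random.default_rng(0)
draws = rng.beta(alpha, beta, size=1000)   # samples from the current posterior
```

Outside conjugate cases the posterior has no closed form, which is where MCMC-style approaches like the paper's become necessary.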
no code implementations • NeurIPS 2010 • Hariharan Narayanan, Sanjoy Mitter
Given upper bounds on the dimension, volume, and curvature, we show that Empirical Risk Minimization can produce a nearly optimal manifold using a number of random samples that is \emph{independent} of the ambient dimension of the space in which data lie.
no code implementations • NeurIPS 2010 • Hariharan Narayanan, Alexander Rakhlin
We propose a computationally efficient random walk on a convex body which rapidly mixes to a time-varying Gibbs distribution.
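A hedged one-dimensional sketch of the target family (not the paper's walk or its mixing analysis): a Metropolis random walk whose target at step $t$ is the Gibbs distribution proportional to $\exp(-\eta_t f(x))$. As $\eta_t$ grows, the samples concentrate near the minimizer, which is the simulated-annealing use of a time-varying Gibbs target. The potential and schedule below are assumptions for illustration.

```python
import numpy as np

f = lambda x: (x - 0.5) ** 2           # potential on the interval [-2, 2]

def gibbs_walk(etas, radius=0.5, seed=0):
    """Metropolis walk whose target at step t is proportional to
    exp(-etas[t] * f(x)); increasing etas anneals toward the minimizer."""
    rng = np.random.default_rng(seed)
    x = 0.0
    path = []
    for eta in etas:
        y = x + radius * (2 * rng.random() - 1)
        if -2.0 <= y <= 2.0:
            # Metropolis acceptance for the current Gibbs target.
            if rng.random() < np.exp(-eta * (f(y) - f(x))):
                x = y
        path.append(x)
    return np.array(path)

path = gibbs_walk(etas=np.linspace(0.1, 50.0, 4000))
```

By the end of the schedule the stationary distribution has standard deviation about $1/\sqrt{2\eta}$, so the chain ends up close to the minimizer at $x = 0.5$.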