
The neighborhood lattice for encoding partial correlations in a Hilbert space

Neighborhood regression has been a successful approach in graphical and structural equation modeling, with applications to learning undirected and directed graphical models. We extend these ideas by defining and studying an algebraic structure called the neighborhood lattice, based on a generalized notion of neighborhood regression. We show that this algebraic structure has the potential to provide an economical encoding of all conditional independence statements in a Gaussian distribution (or conditional uncorrelatedness in general), even in cases where no graphical model exists that could "perfectly" encode all such statements. We study the computational complexity of computing these structures and show that, under a sparsity assumption, they can be computed in polynomial time, even in the absence of the assumption of perfectness with respect to a graph. On the other hand, assuming perfectness, we show how these neighborhood lattices may be "graphically" computed using the separation properties of the so-called partial correlation graph. We also draw connections with directed acyclic graphical models and Bayesian networks. We derive these results using an abstract generalization of partial uncorrelatedness, called partial orthogonality, which allows us to use algebraic properties of projection operators on Hilbert spaces to significantly simplify and extend existing ideas and arguments. Consequently, our results apply to a wide range of random objects and data structures, such as random vectors, data matrices, and functions.
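
For intuition, the following is a minimal sketch (not code from the paper; the function names and the chain example are illustrative assumptions) of the projection view of partial uncorrelatedness the abstract alludes to: two coordinates are partially uncorrelated given a conditioning set exactly when the residuals of their orthogonal projections onto the span of that set are uncorrelated.

    # Sketch of partial uncorrelatedness via projection residuals.
    # Assumes centered (zero-mean) data; names are illustrative only.
    import numpy as np

    def residual(y, Z):
        # Residual of the least-squares projection of y onto the column span of Z.
        if Z.shape[1] == 0:
            return y - y.mean()  # empty conditioning set: just center y
        coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
        return y - Z @ coef

    def partial_correlation(X, i, j, S):
        # Sample analogue of partial orthogonality: correlate the residuals
        # of columns i and j after projecting out the columns indexed by S.
        Z = X[:, list(S)]
        r_i, r_j = residual(X[:, i], Z), residual(X[:, j], Z)
        return (r_i @ r_j) / (np.linalg.norm(r_i) * np.linalg.norm(r_j))

    rng = np.random.default_rng(0)
    n = 100_000
    x0 = rng.normal(size=n)
    x1 = x0 + rng.normal(size=n)  # chain: x0 -> x1 -> x2
    x2 = x1 + rng.normal(size=n)
    X = np.column_stack([x0, x1, x2])

    print(partial_correlation(X, 0, 2, []))   # clearly nonzero marginal correlation
    print(partial_correlation(X, 0, 2, [1]))  # approximately 0: x0 independent of x2 given x1

In this toy chain, the marginal correlation of x0 and x2 is about 0.58 in the population, while conditioning on x1 drives the partial correlation to roughly zero, reflecting the conditional independence of x0 and x2 given x1 in the Gaussian case.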
