Search Results for author: Sang-Yun Oh

Found 9 papers, 3 papers with code

Learning Gaussian Graphical Models with Latent Confounders

no code implementations14 May 2021 Ke Wang, Alexander Franks, Sang-Yun Oh

In this paper, we compare and contrast two strategies for inference in graphical models with latent confounders: Gaussian graphical models with latent variables (LVGGM) and PCA-based removal of confounding (PCA+GGM).
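
A minimal sketch of the PCA+GGM strategy compared above, assuming the usual recipe of projecting out a few leading principal components as proxies for latent confounders before fitting a graphical lasso; the function and parameter names are illustrative, not the paper's code.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.covariance import GraphicalLasso

def pca_ggm(X, n_confounders=2, alpha=0.05):
    """Project out leading principal components (latent-confounder proxies),
    then fit a sparse Gaussian graphical model on the residuals."""
    X = X - X.mean(axis=0)
    pca = PCA(n_components=n_confounders)
    scores = pca.fit_transform(X)             # estimated confounder directions
    X_resid = X - scores @ pca.components_    # remove their contribution
    model = GraphicalLasso(alpha=alpha).fit(X_resid)
    return model.precision_                   # zeros encode conditional independence
```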

Endogenous Representation of Asset Returns

no code implementations25 Oct 2020 Zhipu Zhou, Alexander Shkolnik, Sang-Yun Oh

Our results point to the possibility that most of the risk in equity markets may be explained by a sparse network of interacting assets (or their issuing firms).

Partial Separability and Functional Graphical Models for Multivariate Gaussian Processes

1 code implementation7 Oct 2019 Javier Zapata, Sang-Yun Oh, Alexander Petersen

Next, the partial separability structure is shown to be particularly useful for providing a well-defined functional Gaussian graphical model that can be identified with a sequence of finite-dimensional graphical models, each of identical fixed dimension.

Gaussian Processes
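
A rough sketch of the construction described above, under the assumption that partial separability lets one fit an ordinary graphical lasso to the basis coefficients of the p functional variables at each basis level and then combine the resulting edge sets; the input layout and combination rule here are assumptions for illustration, not the paper's algorithm.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

def functional_graph(scores, alpha=0.1):
    """scores: array of shape (L, n, p) -- for each of L basis levels,
    the n x p matrix of basis coefficients of the p functional variables.
    Fit one graphical lasso per level and take the union of edges."""
    L, n, p = scores.shape
    edges = np.zeros((p, p), dtype=bool)
    for level in range(L):
        prec = GraphicalLasso(alpha=alpha).fit(scores[level]).precision_
        edges |= np.abs(prec) > 1e-8          # nonzero entries give edges at this level
    np.fill_diagonal(edges, False)
    return edges                              # adjacency of the functional graphical model
```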

RRNet: Repetition-Reduction Network for Energy Efficient Decoder of Depth Estimation

no code implementations23 Jul 2019 Sang-Yun Oh, Hye-Jin S. Kim, Jongeun Lee, Junmo Kim

We introduce the Repetition-Reduction Network (RRNet) for resource-constrained depth estimation, offering significantly improved efficiency in terms of computation, memory, and energy consumption.

Depth Estimation
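
The abstract emphasizes decoder efficiency; as a generic illustration of how decoder cost is often reduced (depthwise-separable convolutions in place of full convolutions), not the paper's repetition-reduction module:

```python
import torch.nn as nn

class SeparableUpBlock(nn.Module):
    """Illustrative lightweight decoder block: upsample, then depthwise +
    pointwise convolution. Not the RRNet architecture."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.up = nn.Upsample(scale_factor=2, mode="nearest")
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size=3,
                                   padding=1, groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.pointwise(self.depthwise(self.up(x))))
```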

Distributionally Robust Formulation and Model Selection for the Graphical Lasso

1 code implementation22 May 2019 Pedro Cisneros-Velarde, Sang-Yun Oh, Alexander Petersen

As a consequence of this formulation, the radius of the Wasserstein ambiguity set is directly related to the regularization parameter in the estimation problem.

Model Selection
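
A schematic of the correspondence the abstract refers to, in assumed notation and up to constants: the distributionally robust problem over a Wasserstein ball of radius ρ around the empirical distribution corresponds to a graphical lasso whose l1 penalty level grows with the radius.

```latex
% Assumed notation: \hat{P}_n empirical distribution, S sample covariance,
% W a Wasserstein distance, \rho the ambiguity radius.
\min_{\Theta \succ 0}\ \sup_{Q:\, W(Q,\hat{P}_n)\le\rho}
   \mathbb{E}_{Q}\!\left[X^{\top}\Theta X\right] - \log\det\Theta
\quad\longleftrightarrow\quad
\min_{\Theta \succ 0}\ \operatorname{tr}(S\Theta) - \log\det\Theta
   + \lambda(\rho)\,\lVert\Theta\rVert_{1}
```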

Communication-Avoiding Optimization Methods for Distributed Massive-Scale Sparse Inverse Covariance Estimation

1 code implementation30 Oct 2017 Penporn Koanantakool, Alnur Ali, Ariful Azad, Aydin Buluc, Dmitriy Morozov, Leonid Oliker, Katherine Yelick, Sang-Yun Oh

Across a variety of scientific disciplines, sparse inverse covariance estimation is a popular tool for capturing the underlying dependency relationships in multivariate data.

Clustering
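
The paper scales this estimator to distributed, massive-scale settings; for reference, a single-machine illustration of the underlying task (sparse inverse covariance estimation, here via scikit-learn's graphical lasso rather than the paper's distributed solver):

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 20))      # placeholder multivariate data
est = GraphicalLassoCV().fit(X)         # cross-validated choice of sparsity level
precision = est.precision_              # zeros encode conditional independence
print(np.count_nonzero(np.abs(precision) > 1e-8))
```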

Optimization Methods for Sparse Pseudo-Likelihood Graphical Model Selection

no code implementations NeurIPS 2014 Sang-Yun Oh, Onkar Dalal, Kshitij Khare, Bala Rajaratnam

In direct contrast to the parallel work in the Gaussian setting, however, this new convex pseudo-likelihood framework has not leveraged the extensive array of methods that have been proposed in the machine learning literature for convex optimization.

BIG-bench Machine Learning, Model Selection
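
The convex-optimization machinery alluded to above is typically a first-order proximal method; a generic proximal-gradient (ISTA-style) sketch for an l1-penalized smooth convex objective, with a toy least-squares term standing in for the pseudo-likelihood:

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def proximal_gradient(grad_smooth, x0, step, lam, n_iter=1000):
    """Minimize f(x) + lam * ||x||_1 for smooth convex f.
    'step' should be at most 1/L, L the Lipschitz constant of grad_smooth."""
    x = x0.copy()
    for _ in range(n_iter):
        x = soft_threshold(x - step * grad_smooth(x), step * lam)
    return x

# Toy usage with f(x) = 0.5 * ||A x - b||^2
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 10))
b = A @ np.ones(10) + 0.1 * rng.standard_normal(50)
x_hat = proximal_gradient(lambda x: A.T @ (A @ x - b), np.zeros(10),
                          step=1.0 / np.linalg.norm(A, 2) ** 2, lam=0.1)
```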

A convex pseudo-likelihood framework for high dimensional partial correlation estimation with convergence guarantees

no code implementations20 Jul 2013 Kshitij Khare, Sang-Yun Oh, Bala Rajaratnam

As none of the popular methods proposed for solving pseudo-likelihood based objective functions have provable convergence guarantees, it is not clear if corresponding estimators exist or are even computable, or if they actually yield correct partial correlation graphs.

Model Selection, Regression
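
For orientation, the kind of jointly convex pseudo-likelihood criterion studied in this line of work (a CONCORD-style objective; the notation and scaling constants are assumed here, not copied from the paper):

```latex
% X_1, ..., X_p are the columns of the n x p data matrix;
% \Omega = (\omega_{ij}) is the symmetric p x p parameter with positive diagonal.
Q_{\mathrm{con}}(\Omega)
  \;=\; -\, n \sum_{i=1}^{p} \log \omega_{ii}
  \;+\; \tfrac{1}{2} \sum_{i=1}^{p}
        \Big\lVert \omega_{ii} X_i + \sum_{j \ne i} \omega_{ij} X_j \Big\rVert_2^2
  \;+\; \lambda \sum_{1 \le i < j \le p} \lvert \omega_{ij} \rvert
```

The partial correlation estimator is the minimizer of this criterion, and the convergence guarantees discussed above concern algorithms for that minimization.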
