Latent Gaussian copula models provide a powerful framework for multi-view data integration, since they can seamlessly express dependencies between mixed variable types (binary, continuous, zero-inflated) through latent Gaussian correlations.
We consider the two-group classification problem and propose a kernel classifier based on the optimal scoring framework.
A distinct advantage of JACA is that it can be applied to multi-view data with a block-missing structure, that is, to cases where a subset of views or class labels is missing for some subjects.
We consider the problem of high-dimensional classification between two groups with unequal covariance matrices.
The abundance of high-dimensional data in the modern sciences has generated tremendous interest in penalized estimators such as the lasso, scaled lasso, square-root lasso, elastic net, and many others.
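To make the lasso concrete, here is a minimal numpy sketch (an illustration, not any paper's implementation) of the standard cyclic coordinate descent algorithm for the $\ell_1$-penalized least-squares objective; the problem sizes, seed, and penalty level are arbitrary choices for the example.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the proximal map of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=500):
    """Lasso via cyclic coordinate descent:
    minimize (1/(2n)) ||y - X b||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature
    r = y.astype(float).copy()         # residual y - X b (b starts at 0)
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]                      # add coordinate j back
            rho = X[:, j] @ r / n                    # univariate least-squares fit
            b[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * b[j]                      # update the residual
    return b

# Illustrative data: sparse ground truth with 3 of 20 coefficients nonzero.
rng = np.random.default_rng(0)
n, p = 200, 20
X = rng.standard_normal((n, p))
b_true = np.zeros(p)
b_true[:3] = 2.0
y = X @ b_true + 0.5 * rng.standard_normal(n)
b_hat = lasso_cd(X, y, lam=0.1)
```

The soft-thresholding step is what produces exact zeros, which is the feature that distinguishes these estimators from ridge-type shrinkage.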
This article considers the problem of multi-group classification in the setting where the number of variables $p$ is larger than the number of observations $n$.
We investigate the difference between using an $\ell_1$ penalty versus an $\ell_1$ constraint in generalized eigenvalue problems, such as principal component analysis and discriminant analysis.
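The penalty-versus-constraint distinction can be illustrated on the leading sparse principal component. The following numpy sketch (a heuristic illustration, not the paper's method) contrasts a power iteration with an $\ell_1$ *penalty*, which soft-thresholds at a fixed level, against one with an $\ell_1$ *constraint*, which binary-searches the threshold so the normalized iterate meets an explicit $\ell_1$ budget (in the spirit of Witten et al.'s penalized matrix decomposition); the spiked covariance and tuning values are assumptions made for the example.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def sparse_pc_penalized(S, lam, n_iter=100):
    """Leading sparse PC via power iteration with an l1 *penalty*:
    each step soft-thresholds S @ v at the fixed level lam."""
    v = np.ones(S.shape[0]) / np.sqrt(S.shape[0])
    for _ in range(n_iter):
        w = soft_threshold(S @ v, lam)
        v = w / np.linalg.norm(w)
    return v

def l1_constrained_normalize(w, c):
    """Soft-threshold w so that after l2 normalization ||v||_1 <= c,
    with the threshold found by binary search (requires c >= 1)."""
    v = w / np.linalg.norm(w)
    if np.linalg.norm(v, 1) <= c:
        return v
    lo, hi = 0.0, np.abs(w).max()
    for _ in range(60):
        mid = (lo + hi) / 2
        v = soft_threshold(w, mid)
        if np.linalg.norm(v, 1) > c * np.linalg.norm(v):
            lo = mid   # still violates the l1 budget: threshold harder
        else:
            hi = mid   # budget met: try a smaller threshold
    v = soft_threshold(w, hi)
    return v / np.linalg.norm(v)

def sparse_pc_constrained(S, c, n_iter=100):
    """Leading sparse PC via power iteration with an l1 *constraint*."""
    v = np.ones(S.shape[0]) / np.sqrt(S.shape[0])
    for _ in range(n_iter):
        v = l1_constrained_normalize(S @ v, c)
    return v

# Illustrative spiked covariance: true leading eigenvector supported on coords 0-2.
p = 10
u = np.zeros(p)
u[:3] = 1 / np.sqrt(3)
S = np.eye(p) + 3 * np.outer(u, u)
v_pen = sparse_pc_penalized(S, lam=0.5)
v_con = sparse_pc_constrained(S, c=np.sqrt(3))
```

Both variants recover the sparse eigenvector here, but the knobs differ: the penalty level `lam` controls sparsity only implicitly, while the constraint `c` fixes the $\ell_1$ budget of the solution explicitly.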
Secondly, we propose an extension of this form to the $p\gg N$ setting and achieve feature selection by using a group penalty.
We apply a lasso-type penalty to the discriminant vector to ensure sparsity of the solution and use a shrinkage-type estimator for the covariance matrix.
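One common recipe along these lines can be sketched as follows (a minimal numpy illustration under assumed conventions, not the authors' implementation): shrink the pooled sample covariance toward the identity with a fixed weight, then solve an $\ell_1$-penalized quadratic in the discriminant direction by coordinate descent; the shrinkage weight, penalty level, and simulated data are all assumptions made for the example.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def sparse_discriminant(X1, X2, lam=0.25, alpha=0.2, n_iter=300):
    """Sparse discriminant direction:
    minimize (1/2) b' Sigma b - d' b + lam ||b||_1,
    where d is the mean-difference vector and Sigma is the pooled
    sample covariance shrunk toward the identity with weight alpha."""
    d = X1.mean(axis=0) - X2.mean(axis=0)
    n1, n2 = X1.shape[0], X2.shape[0]
    Xc = np.vstack([X1 - X1.mean(axis=0), X2 - X2.mean(axis=0)])
    S = Xc.T @ Xc / (n1 + n2 - 2)                         # pooled covariance
    Sigma = (1 - alpha) * S + alpha * np.eye(S.shape[0])  # shrinkage estimator
    b = np.zeros(S.shape[0])
    for _ in range(n_iter):
        for j in range(len(b)):
            # partial residual for coordinate j, then soft-threshold
            partial = d[j] - Sigma[j] @ b + Sigma[j, j] * b[j]
            b[j] = soft_threshold(partial, lam) / Sigma[j, j]
    return b

# Illustrative data: the two group means differ only in the first 2 coordinates.
rng = np.random.default_rng(1)
p, n = 10, 200
mu = np.zeros(p)
mu[:2] = 1.0
X1 = rng.standard_normal((n, p)) + mu / 2
X2 = rng.standard_normal((n, p)) - mu / 2
beta = sparse_discriminant(X1, X2)
```

A new observation $x$ would then be assigned to group 1 when $\hat\beta^\top\bigl(x - (\hat\mu_1 + \hat\mu_2)/2\bigr) > 0$; the shrinkage step keeps `Sigma` well conditioned even when $p$ approaches or exceeds $n$.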