no code implementations • 21 Jan 2013 • Irina Gaynanova, James G. Booth, Martin T. Wells
We apply a lasso-type penalty to the discriminant vector to ensure sparsity of the solution and use a shrinkage-type estimator for the covariance matrix.
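The two ingredients named above can be sketched in a few lines. This is an illustrative toy version, not the paper's estimator: `sparse_lda_direction` is a hypothetical helper, the shrinkage target (identity) and the plain soft-threshold are simplifying assumptions.

```python
import numpy as np

def sparse_lda_direction(X1, X2, shrinkage=0.2, lam=0.1):
    """Toy sparse discriminant direction (illustration only).

    Shrinks the pooled covariance toward the identity, then
    soft-thresholds the discriminant vector to induce sparsity.
    """
    mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)
    n1, n2 = len(X1), len(X2)
    # pooled sample covariance
    S = ((n1 - 1) * np.cov(X1, rowvar=False) +
         (n2 - 1) * np.cov(X2, rowvar=False)) / (n1 + n2 - 2)
    # shrinkage-type estimator: convex combination with the identity
    S_shrunk = (1 - shrinkage) * S + shrinkage * np.eye(S.shape[0])
    beta = np.linalg.solve(S_shrunk, mu1 - mu2)
    # lasso-type soft-thresholding zeroes out small coefficients
    return np.sign(beta) * np.maximum(np.abs(beta) - lam, 0.0)
```

With well-separated groups, only features that actually discriminate survive the thresholding, which is the point of the lasso-type penalty.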
no code implementations • 24 Mar 2014 • Irina Gaynanova, James G. Booth, Martin T. Wells
Secondly, we propose an extension of this form to the $p\gg N$ setting and achieve feature selection by using a group penalty.
no code implementations • 22 Oct 2014 • Irina Gaynanova, James Booth, Martin T. Wells
We investigate the difference between using an $\ell_1$ penalty versus an $\ell_1$ constraint in generalized eigenvalue problems, such as principal component analysis and discriminant analysis.
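To make the penalized form concrete, here is a minimal sketch of the $\ell_1$-penalized variant for sparse PCA via soft-thresholded power iteration; `sparse_pc_penalized` and the choice of iteration scheme are assumptions for illustration, not the paper's algorithm. The constrained variant would instead fix a budget $t$ on $\|v\|_1$ and project onto that ball each step.

```python
import numpy as np

def sparse_pc_penalized(A, lam, n_iter=200):
    """Leading sparse direction of a PSD matrix A via
    soft-thresholded power iteration (l1-penalized sketch).
    Larger lam gives a sparser vector."""
    v = np.ones(A.shape[0]) / np.sqrt(A.shape[0])
    for _ in range(n_iter):
        w = A @ v
        # soft-threshold: the l1 penalty's proximal step
        w = np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)
        nrm = np.linalg.norm(w)
        if nrm == 0:          # penalty killed every coordinate
            return w
        v = w / nrm
    return v
```

On a diagonal covariance the iteration quickly concentrates on the dominant coordinate, zeroing the rest.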
no code implementations • 23 Nov 2014 • Irina Gaynanova, Mladen Kolar
This article considers the problem of multi-group classification in the setting where the number of variables $p$ is larger than the number of observations $n$.
1 code implementation • 22 Apr 2016 • Jacob Bien, Irina Gaynanova, Johannes Lederer, Christian Müller
The TREX is a recently introduced method for performing sparse high-dimensional regression.
no code implementations • 1 Aug 2016 • Johannes Lederer, Lu Yu, Irina Gaynanova
The abundance of high-dimensional data in the modern sciences has generated tremendous interest in penalized estimators such as the lasso, scaled lasso, square-root lasso, elastic net, and many others.
no code implementations • 20 Jul 2017 • Irina Gaynanova, Gen Li
We call this model SLIDE for Structural Learning and Integrative DEcomposition of multi-view data.
1 code implementation • 13 Nov 2017 • Irina Gaynanova, Tianying Wang
We consider the problem of high-dimensional classification between two groups with unequal covariance matrices.
no code implementations • 20 Nov 2018 • Yunfeng Zhang, Irina Gaynanova
A distinct advantage of JACA is that it can be applied to multi-view data with a block-missing structure, that is, to cases where a subset of views or class labels is missing for some subjects.
1 code implementation • 12 Feb 2019 • Alexander F. Lapanowski, Irina Gaynanova
We consider the two-group classification problem and propose a kernel classifier based on the optimal scoring framework.
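In the optimal scoring framework, classification is recast as regression of class scores on the features; a kernel version replaces the linear fit with kernel ridge regression. The sketch below assumes that setup with an RBF kernel; `kernel_scoring_classifier` and the specific score encoding are hypothetical choices, not the paper's method.

```python
import numpy as np

def kernel_scoring_classifier(X, y, gamma=1.0, ridge=1e-2):
    """Two-group classifier in the optimal-scoring spirit (sketch):
    encode classes with centered scores, fit RBF kernel ridge
    regression, classify new points by the sign of the fit."""
    y = np.asarray(y, dtype=float)
    n1 = y.sum()
    n0 = len(y) - n1
    # centered class scores (up to scaling): sum to zero over the sample
    theta = np.where(y == 1, n0 / len(y), -n1 / len(y))
    sq = ((X**2).sum(1)[:, None] + (X**2).sum(1)[None, :] - 2 * X @ X.T)
    K = np.exp(-gamma * sq)                        # RBF kernel matrix
    alpha = np.linalg.solve(K + ridge * np.eye(len(y)), theta)

    def predict(Xnew):
        sq_new = ((Xnew**2).sum(1)[:, None] + (X**2).sum(1)[None, :]
                  - 2 * Xnew @ X.T)
        return (np.exp(-gamma * sq_new) @ alpha > 0).astype(int)

    return predict
```

The ridge term plays the role of the regularization needed to keep the kernel system well-conditioned.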
no code implementations • 8 May 2020 • Alexander F. Lapanowski, Irina Gaynanova
Large-sample data have become prevalent as data acquisition has become cheaper and easier.
2 code implementations • 24 Jun 2020 • Grace Yoon, Christian L. Müller, Irina Gaynanova
Latent Gaussian copula models provide a powerful means to perform multi-view data integration since these models can seamlessly express dependencies between mixed variable types (binary, continuous, zero-inflated) via latent Gaussian correlations.
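For two continuous variables, the latent Gaussian correlation can be recovered from Kendall's tau via the standard bridge $\rho = \sin(\pi\tau/2)$; binary and zero-inflated types require different bridge functions. A minimal sketch of the continuous case (`latent_correlation_continuous` is a hypothetical helper name):

```python
import numpy as np
from scipy.stats import kendalltau

def latent_correlation_continuous(x, y):
    """Bridge from Kendall's tau to the latent Gaussian correlation
    for two continuous variables: rho = sin(pi * tau / 2)."""
    tau, _ = kendalltau(x, y)
    return np.sin(np.pi * tau / 2)
```

Because Kendall's tau depends only on ranks, the estimate is invariant to monotone transformations of the margins, which is what lets the copula model handle mixed variable types on a common latent scale.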
1 code implementation • 7 May 2021 • Dongbang Yuan, Irina Gaynanova
We consider the problem of extracting joint and individual signals from multi-view data, that is, data collected from different sources on matched samples.
1 code implementation • 9 Sep 2022 • Renat Sergazinov, Mohammadreza Armandpour, Irina Gaynanova
Deep learning models achieve state-of-the-art results in predicting blood glucose trajectories, with a wide range of architectures being proposed.