no code implementations • 16 Dec 2022 • Antoine Ledent, Rodrigo Alves, Yunwen Lei, Yann Guermeur, Marius Kloft
We study inductive matrix completion (matrix completion with side information) under an i.i.d.
no code implementations • NeurIPS 2021 • Robert A. Vandermeulen, Antoine Ledent
In this paper we investigate the theoretical implications of incorporating a multi-view latent variable model, a type of low-rank model, into nonparametric density estimation.
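To make the model class concrete: in a multi-view latent variable model, the views of an observation are conditionally independent given a discrete latent state, so the joint density is a mixture of product densities (this low-rank structure is what the paper exploits). The sketch below is illustrative only and assumes per-view density functions supplied by the caller; it is not the estimator studied in the paper.

```python
def multiview_density(x, weights, component_densities):
    """Evaluate a multi-view latent variable density (illustrative sketch).

    p(x) = sum_h w_h * prod_v p_{h,v}(x_v): each latent state h makes the
    views x_v independent, so the joint density is a mixture of products.

    x                   -- list of view values [x_1, ..., x_V]
    weights             -- mixture weights [w_1, ..., w_H], summing to 1
    component_densities -- for each latent state h, a list of per-view
                           density functions [p_{h,1}, ..., p_{h,V}]
                           (hypothetical inputs chosen for this sketch)
    """
    total = 0.0
    for w, per_view in zip(weights, component_densities):
        prod = w
        for density, view_value in zip(per_view, x):
            prod *= density(view_value)
        total += prod
    return total
```

With uniform densities on [0, 1] for every view and state, the mixture density is identically 1, which gives a quick sanity check.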
no code implementations • NeurIPS 2021 • Antoine Ledent, Rodrigo Alves, Yunwen Lei, Marius Kloft
In this paper, we bridge the gap between the state-of-the-art theoretical results for matrix completion with the nuclear norm and their equivalents in \textit{inductive matrix completion}: (1) In the distribution-free setting, we prove bounds improving the previously best scaling of $O(rd^2)$ to $\widetilde{O}(d^{3/2}\sqrt{r})$, where $d$ is the dimension of the side information and $r$ is the rank.
1 code implementation • 21 Sep 2021 • Saurabh Varshneya, Antoine Ledent, Robert A. Vandermeulen, Yunwen Lei, Matthias Enders, Damian Borth, Marius Kloft
We propose a novel training methodology -- Concept Group Learning (CGL) -- that encourages training of interpretable CNN filters by partitioning filters in each layer into concept groups, each of which is trained to learn a single visual concept.
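The partitioning step described above can be sketched as follows. This is a minimal illustration of how the filters of one layer might be split into concept groups; the actual CGL training objective (how each group is encouraged to learn a single visual concept) is not reproduced here, and the function name is hypothetical.

```python
import numpy as np

def partition_filters(num_filters, num_groups):
    """Split the filter indices of one CNN layer into contiguous concept
    groups (illustrative sketch of the CGL grouping step only).

    np.array_split handles the case where num_filters is not divisible
    by num_groups, so every filter lands in exactly one group.
    """
    groups = np.array_split(np.arange(num_filters), num_groups)
    return [g.tolist() for g in groups]
```

For example, a layer with 8 filters and 4 concept groups yields the partition `[[0, 1], [2, 3], [4, 5], [6, 7]]`, and each group would then be trained toward a single concept.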
no code implementations • 31 May 2021 • Waleed Mustafa, Yunwen Lei, Antoine Ledent, Marius Kloft
Existing generalization analysis implies generalization bounds with at least a square-root dependency on the cardinality $d$ of the label set, which can be vacuous in practice.
no code implementations • 29 Apr 2021 • Liang Wu, Antoine Ledent, Yunwen Lei, Marius Kloft
In this paper, we initiate the generalization analysis of regularized vector-valued learning algorithms by presenting bounds with a mild dependency on the output dimension and a fast rate on the sample size.
no code implementations • NeurIPS 2020 • Yunwen Lei, Antoine Ledent, Marius Kloft
Pairwise learning refers to learning tasks with loss functions depending on a pair of training examples, which includes ranking and metric learning as specific examples.
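As a concrete instance of a loss depending on a pair of examples, a standard pairwise hinge loss (common in ranking and metric learning, shown here only to illustrate the setting, not as the paper's method) penalizes a negative example whenever it scores within a margin of a positive one:

```python
import numpy as np

def pairwise_hinge_loss(scores_pos, scores_neg, margin=1.0):
    """Average hinge loss over all (positive, negative) pairs.

    For each pair (i, j) the loss is max(0, margin - (s_pos[i] - s_neg[j])),
    i.e. the pair is penalized unless the positive example outscores the
    negative one by at least `margin`.
    """
    # Broadcast to the full matrix of pairwise score differences.
    diffs = scores_pos[:, None] - scores_neg[None, :]
    return np.maximum(0.0, margin - diffs).mean()
```

Because the loss couples two training examples, the summands are not independent, which is precisely what complicates the generalization analysis of pairwise learning.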
no code implementations • 3 Apr 2020 • Antoine Ledent, Rodrigo Alves, Marius Kloft
We propose orthogonal inductive matrix completion (OMIC), an interpretable approach to matrix completion based on a sum of multiple orthonormal side information terms, together with nuclear-norm regularization.
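The prediction rule of such a model can be sketched as a sum of side-information terms. In this hypothetical sketch, each term combines fixed orthonormal row/column side-information bases with a learned core matrix; the nuclear-norm regularization applied to the cores during training is not shown.

```python
import numpy as np

def omic_predict(side_U, cores, side_V):
    """Predict a matrix as a sum of side-information terms (sketch).

    Each term is U_k @ C_k @ V_k.T, where U_k and V_k are fixed
    orthonormal side-information bases for the rows and columns, and
    C_k is a learned core matrix (in training, the cores would carry
    nuclear-norm regularization; only the forward prediction is shown).
    """
    return sum(U @ C @ V.T for U, C, V in zip(side_U, cores, side_V))
```

With identity bases and a single term, the prediction reduces to the core itself, which recovers ordinary (non-inductive) matrix completion as a special case.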
no code implementations • 29 May 2019 • Antoine Ledent, Waleed Mustafa, Yunwen Lei, Marius Kloft
This holds even when formulating the bounds in terms of the $L^2$-norm of the weight matrices, where previous bounds exhibit at least a square-root dependence on the number of classes.