no code implementations • 17 Jan 2024 • Wanting Xu, Lan Hu, Manolis C. Tsakiris, Laurent Kneip
Over the past decade, Gröbner basis theory and automatic solver generation have led to a large number of solutions to geometric vision problems.
no code implementations • 14 Dec 2022 • Taulant Koka, Manolis C. Tsakiris, Michael Muma, Benjamín Béjar Haro
Assuming that we have a sensing matrix for the underlying signals, we show that the problem is equivalent to a structured unlabeled sensing problem, and establish sufficient conditions for unique recovery.
1 code implementation • CVPR 2022 • Liangzu Peng, Manolis C. Tsakiris, René Vidal
We first propose a solver, $\texttt{ARCS}$, that i) assumes noiseless point sets in general position, ii) requires only $2$ inliers, iii) uses $O(m\log m)$ time and $O(m)$ space, and iv) can successfully solve the problem even with, e.g., $m, n \approx 10^6$ in about $0.1$ seconds.
no code implementations • 6 Oct 2021 • Yunchen Yang, Xinyue Zhang, Tianjiao Ding, Daniel P. Robinson, Rene Vidal, Manolis C. Tsakiris
In this paper, we revisit the problem of local optimization in RANSAC.
1 code implementation • NeurIPS 2021 • Yunzhen Yao, Liangzu Peng, Manolis C. Tsakiris
Allowing for missing entries on top of permutations in UPCA leads to the problem of unlabeled matrix completion, for which we derive theory and algorithms of similar flavor.
no code implementations • 9 Jun 2020 • Liangzu Peng, Manolis C. Tsakiris
In this paper, we provide tighter and simpler conditions that guarantee the unique recovery for the single-subspace case, extend the result to the case of a subspace arrangement, and show that the unique recovery in a single subspace is locally stable under noise.
no code implementations • 26 Apr 2020 • Manolis C. Tsakiris
Despite the popularity of low-rank matrix completion, the majority of its theory has been developed under the assumption of random observation patterns, whereas very little is known about the practically relevant case of non-random patterns.
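A toy illustration of why the observation pattern matters (illustrative only, not the paper's theory): for a rank-1 matrix, observing one full row and one full column already determines every entry, so unique completability can hinge on the deterministic structure of the pattern rather than on randomness.

```python
import numpy as np

# Rank-1 matrix M = a b^T; we observe only a "cross" pattern.
a = np.array([1.0, 2.0, -1.0])
b = np.array([3.0, 0.5, 4.0, -2.0])
M = np.outer(a, b)

# Pattern: first row and first column observed. For rank 1 this
# pattern determines M uniquely, since
#   M[i, j] = M[i, 0] * M[0, j] / M[0, 0]   (assuming M[0, 0] != 0).
M_hat = np.outer(M[:, 0], M[0, :]) / M[0, 0]
assert np.allclose(M_hat, M)
```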
1 code implementation • 17 Mar 2020 • Liangzu Peng, Manolis C. Tsakiris
Linear regression without correspondences concerns the recovery of a signal in the linear regression setting, where the correspondences between the observations and the linear functionals are unknown.
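As a toy illustration of this setup (not the paper's algorithm): the observations are $y = \Pi A x$ for an unknown permutation $\Pi$, and for very small problem sizes one can recover $x$ by brute force over all permutations, which makes the combinatorial nature of the problem concrete.

```python
import numpy as np
from itertools import permutations

rng = np.random.default_rng(0)
m, n = 6, 2                       # m observations, n unknowns
A = rng.standard_normal((m, n))   # known sensing matrix
x_true = rng.standard_normal(n)   # signal to recover
perm = rng.permutation(m)         # unknown correspondence
y = (A @ x_true)[perm]            # shuffled, noiseless observations

# Brute force over all m! orderings (feasible only for tiny m;
# the literature's methods avoid this combinatorial search).
best_err, best_x = np.inf, None
for p in permutations(range(m)):
    x_hat, *_ = np.linalg.lstsq(A, y[list(p)], rcond=None)
    err = np.linalg.norm(A @ x_hat - y[list(p)])
    if err < best_err:
        best_err, best_x = err, x_hat

# In the noiseless, generic case the true ordering attains zero
# residual and the signal is recovered exactly.
assert np.allclose(best_x, x_true, atol=1e-6)
```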
no code implementations • 12 Feb 2020 • Manolis C. Tsakiris
We make progress towards characterizing the algebraic matroid of the determinantal variety defined by the minors of fixed size of a matrix of variables.
no code implementations • 20 Jan 2020 • Qing Qu, Zhihui Zhu, Xiao Li, Manolis C. Tsakiris, John Wright, René Vidal
The problem of finding the sparsest vector (direction) in a low-dimensional subspace can be considered a homogeneous variant of the sparse recovery problem, which finds applications in robust subspace recovery, dictionary learning, sparse blind deconvolution, and many other problems in signal processing and machine learning.
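A minimal sketch of the planted-sparse-vector setting, using the classical $\ell_1$ linear-programming relaxation (one LP per coordinate constraint, in the spirit of earlier work on this problem) rather than any specific method from the paper; all parameters below are illustrative.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
n, d = 30, 4
s = np.zeros(n)
s[:3] = [1.0, -2.0, 0.5]                      # planted 3-sparse vector
G = rng.standard_normal((n, d - 1))           # generic dense directions
Q, _ = np.linalg.qr(np.column_stack([s, G]))  # orthonormal basis mixing s

best = None
for i in range(n):
    # Solve  min ||Q q||_1  s.t.  (Q q)[i] = 1  as an LP with
    # variables (q, t):  min sum(t)  s.t.  -t <= Q q <= t.
    c = np.concatenate([np.zeros(d), np.ones(n)])
    A_ub = np.block([[Q, -np.eye(n)], [-Q, -np.eye(n)]])
    b_ub = np.zeros(2 * n)
    A_eq = np.concatenate([Q[i], np.zeros(n)])[None, :]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(None, None)] * (d + n))
    if res.success:
        v = Q @ res.x[:d]
        if best is None or (np.sum(np.abs(v) > 1e-4 * np.abs(v).max())
                            < np.sum(np.abs(best) > 1e-4 * np.abs(best).max())):
            best = v

# The sparsest direction found should align with the planted vector.
cos = abs(best @ s) / (np.linalg.norm(best) * np.linalg.norm(s))
assert cos > 0.99
```

For sparsity levels well below $n/\sqrt{d}$, as here, the relaxation is known to recover the planted direction; beyond that regime it can fail, which motivates the nonconvex formulations studied in this line of work.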
no code implementations • 24 Dec 2018 • Zhihui Zhu, Yifan Wang, Daniel P. Robinson, Daniel Q. Naiman, Rene Vidal, Manolis C. Tsakiris
However, its geometric analysis is based on quantities that are difficult to interpret and are not amenable to statistical analysis.
no code implementations • 12 Oct 2018 • Manolis C. Tsakiris, Liangzu Peng, Aldo Conca, Laurent Kneip, Yuanming Shi, Hayoung Choi
This naturally leads to a polynomial system of $n$ equations in $n$ unknowns, which contains $\xi^*$ in its root locus.
no code implementations • ICML 2018 • Manolis C. Tsakiris, Rene Vidal
The main insight that stems from our analysis is that even though the projection induces additional missing entries, this is counterbalanced by the fact that the projected and zero-filled data are in effect incomplete points associated with the union of the corresponding projected subspaces, with respect to which the point being expressed is complete.
no code implementations • ICML 2017 • Manolis C. Tsakiris, Rene Vidal
A thorough experimental evaluation reveals that hyperplane learning schemes based on DPCP dramatically improve over state-of-the-art methods on synthetic data, while being competitive with the state of the art in 3D plane clustering on Kinect data.
no code implementations • 15 Oct 2015 • Manolis C. Tsakiris, Rene Vidal
We consider the problem of learning a linear subspace from data corrupted by outliers.
no code implementations • 15 Oct 2015 • Manolis C. Tsakiris, Rene Vidal
Algebraic Subspace Clustering (ASC) is a simple and elegant method based on polynomial fitting and differentiation for clustering noiseless data drawn from an arbitrary union of subspaces.
no code implementations • 22 Sep 2015 • Manolis C. Tsakiris, Rene Vidal
Using notions from algebraic geometry, we prove that the homogenization trick, which embeds points in a union of affine subspaces into points in a union of linear subspaces, preserves the general position of the points and the transversality of the union of subspaces in the embedded space, thus establishing the correctness of ASC for affine subspaces.
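The homogenization trick itself is easy to demonstrate (a minimal sketch, independent of the paper's proofs): appending a coordinate equal to $1$ maps points on affine lines in $\mathbb{R}^2$ to points lying on linear planes through the origin in $\mathbb{R}^3$.

```python
import numpy as np

rng = np.random.default_rng(2)
# Points on two affine lines in R^2:  y = 2x + 1  and  y = -x + 3.
t = rng.standard_normal(5)
P1 = np.column_stack([t, 2 * t + 1])
u = rng.standard_normal(5)
P2 = np.column_stack([u, -u + 3])

# Homogenization: append a coordinate equal to 1.
H1 = np.column_stack([P1, np.ones(5)])
H2 = np.column_stack([P2, np.ones(5)])

# Each embedded set lies in a 2D *linear* subspace of R^3:
# the normals (2, -1, 1) and (-1, -1, 3) annihilate them, since
# y = 2x + 1  <=>  2x - y + 1 = 0, and similarly for the second line.
assert np.allclose(H1 @ np.array([2.0, -1.0, 1.0]), 0)
assert np.allclose(H2 @ np.array([-1.0, -1.0, 3.0]), 0)
```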
no code implementations • 20 Jun 2015 • Manolis C. Tsakiris, Rene Vidal
In the abstract form of the problem, where no noise or other corruptions are present, the data are assumed to lie in general position inside the algebraic variety of a union of subspaces, and the objective is to decompose the variety into its constituent subspaces.