Search Results for author: Manolis C. Tsakiris

Found 18 papers, 3 papers with code

Online Stability Improvement of Groebner Basis Solvers using Deep Learning

no code implementations • 17 Jan 2024 • Wanting Xu, Lan Hu, Manolis C. Tsakiris, Laurent Kneip

Over the past decade, Gröbner basis theory and automatic solver generation have led to a large number of solutions to geometric vision problems.

Shuffled Multi-Channel Sparse Signal Recovery

no code implementations • 14 Dec 2022 • Taulant Koka, Manolis C. Tsakiris, Michael Muma, Benjamín Béjar Haro

Assuming that we have a sensing matrix for the underlying signals, we show that the problem is equivalent to a structured unlabeled sensing problem, and establish sufficient conditions for unique recovery.
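
A hedged sketch of one possible reading of the setup, with all notation assumed here rather than taken from the paper: the observations take the form $Y = \Pi A X$, where $A$ is the known sensing matrix, the columns of $X$ are the sparse channel signals, and $\Pi$ is an unknown permutation modelling the lost channel/sample assignments; the equivalence mentioned above recasts this as an unlabeled sensing problem in which $\Pi$ is constrained to a structured family, and the sufficient conditions concern unique recovery of $X$ from $Y$ and $A$.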

ARCS: Accurate Rotation and Correspondence Search

1 code implementation • CVPR 2022 • Liangzu Peng, Manolis C. Tsakiris, René Vidal

We first propose a solver, $\texttt{ARCS}$, that i) assumes noiseless point sets in general position, ii) requires only $2$ inliers, iii) uses $O(m\log m)$ time and $O(m)$ space, and iv) can successfully solve the problem even with, e.g., $m, n\approx 10^6$ in about $0.1$ seconds.
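
Read loosely, and with all notation assumed here: the task is to align two point sets $\{x_i\}_{i=1}^m$ and $\{y_j\}_{j=1}^n$ under an unknown rotation and unknown correspondences, i.e., to find a rotation $R$ and a partial map $\pi$ such that $y_{\pi(i)} \approx R x_i$ for as many inlier pairs as possible; the complexity bounds quoted above are stated in terms of the set sizes $m$ and $n$.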

Unlabeled Principal Component Analysis and Matrix Completion

1 code implementation • NeurIPS 2021 • Yunzhen Yao, Liangzu Peng, Manolis C. Tsakiris

Allowing for missing entries on top of permutations in UPCA leads to the problem of unlabeled matrix completion, for which we derive theory and algorithms of similar flavor.
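
A hedged sketch of the model, with notation assumed here: the observed matrix is obtained from a low-rank matrix $X = [x_1, \dots, x_N]$ by shuffling the entries of (some of) its columns with unknown permutations, $\tilde{X} = [\Pi_1 x_1, \dots, \Pi_N x_N]$; UPCA asks for the column space of $X$ given only $\tilde{X}$, and the unlabeled matrix completion variant mentioned above additionally allows entries of $\tilde{X}$ to be unobserved.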

Matrix Completion

Homomorphic Sensing of Subspace Arrangements

no code implementations • 9 Jun 2020 • Liangzu Peng, Manolis C. Tsakiris

In this paper, we provide tighter and simpler conditions that guarantee the unique recovery for the single-subspace case, extend the result to the case of a subspace arrangement, and show that the unique recovery in a single subspace is locally stable under noise.
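
A rough statement of the setting, with notation assumed here: one observes $y = T v$, where $v$ lies in a known subspace $\mathcal{V}$ (for instance the column space of a sensing matrix) and $T$ is an unknown member of a prescribed finite set of linear maps, permutations composed with coordinate selections being the motivating example; unique recovery asks when $y$ determines $v$, and the results above concern a single subspace $\mathcal{V}$, its extension to a subspace arrangement, and local stability under noise.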

Retrieval

Low-rank matrix completion theory via Plücker coordinates

no code implementations • 26 Apr 2020 • Manolis C. Tsakiris

Despite the popularity of low-rank matrix completion, the majority of its theory has been developed under the assumption of random observation patterns, whereas very little is known about the practically relevant case of non-random patterns.

Low-Rank Matrix Completion • Open-Ended Question Answering

Linear Regression without Correspondences via Concave Minimization

1 code implementation • 17 Mar 2020 • Liangzu Peng, Manolis C. Tsakiris

Linear regression without correspondences concerns the recovery of a signal in the linear regression setting, where the correspondences between the observations and the linear functionals are unknown.
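
The measurement model usually associated with this problem, with notation assumed here: $y = \Pi A x^*$, where $A$ is the known matrix of linear functionals, $x^*$ is the signal to be recovered, and $\Pi$ is an unknown permutation. For a fixed candidate $x$, the permutation that best matches $Ax$ to $y$ simply sorts the entries of $Ax$ into the order of $y$, so $\Pi$ can be eliminated and the remaining optimization is over $x$ alone; the concave minimization in the title refers to how this reduced problem is attacked.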

regression

Results on the algebraic matroid of the determinantal variety

no code implementations • 12 Feb 2020 • Manolis C. Tsakiris

We make progress towards characterizing the algebraic matroid of the determinantal variety defined by the minors of fixed size of a matrix of variables.

Matrix Completion

Finding the Sparsest Vectors in a Subspace: Theory, Algorithms, and Applications

no code implementations • 20 Jan 2020 • Qing Qu, Zhihui Zhu, Xiao Li, Manolis C. Tsakiris, John Wright, René Vidal

The problem of finding the sparsest vector (direction) in a low dimensional subspace can be considered as a homogeneous variant of the sparse recovery problem, which finds applications in robust subspace recovery, dictionary learning, sparse blind deconvolution, and many other problems in signal processing and machine learning.
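
A standard way to write the problem, with notation assumed here: given a subspace $\mathcal{S} \subset \mathbb{R}^n$, find $\min_{x \in \mathcal{S},\, x \neq 0} \|x\|_0$, typically relaxed to minimizing $\|x\|_1$ over $x \in \mathcal{S}$ subject to the nonconvex constraint $\|x\|_2 = 1$; the scale-invariant (homogeneous) constraint set is what distinguishes this from ordinary sparse recovery.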

Dictionary Learning • Representation Learning

Dual Principal Component Pursuit: Probability Analysis and Efficient Algorithms

no code implementations • 24 Dec 2018 • Zhihui Zhu, Yifan Wang, Daniel P. Robinson, Daniel Q. Naiman, Rene Vidal, Manolis C. Tsakiris

However, its geometric analysis is based on quantities that are difficult to interpret and are not amenable to statistical analysis.

An algebraic-geometric approach for linear regression without correspondences

no code implementations • 12 Oct 2018 • Manolis C. Tsakiris, Liangzu Peng, Aldo Conca, Laurent Kneip, Yuanming Shi, Hayoung Choi

This naturally leads to a polynomial system of $n$ equations in $n$ unknowns, which contains $\xi^*$ in its root locus.
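
A construction consistent with the snippet, with notation assumed here except for $\xi^*$: if the data obey $y = \Pi A \xi^*$ for an unknown permutation $\Pi$, then every power sum is permutation-invariant, so $\sum_i y_i^k = \sum_i (a_i^\top \xi)^k$ must hold at $\xi = \xi^*$ for every degree $k$; taking $k = 1, \dots, n$ yields $n$ polynomial equations in the $n$ unknowns $\xi \in \mathbb{R}^n$ whose root locus contains $\xi^*$.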

regression

Theoretical Analysis of Sparse Subspace Clustering with Missing Entries

no code implementations • ICML 2018 • Manolis C. Tsakiris, Rene Vidal

The main insight that stems from our analysis is that even though the projection induces additional missing entries, this is counterbalanced by the fact that the projected and zero-filled data are in effect incomplete points associated with the union of the corresponding projected subspaces, with respect to which the point being expressed is complete.

Clustering

Hyperplane Clustering Via Dual Principal Component Pursuit

no code implementations • ICML 2017 • Manolis C. Tsakiris, Rene Vidal

A thorough experimental evaluation reveals that hyperplane learning schemes based on DPCP dramatically improve over state-of-the-art methods on synthetic data, while remaining competitive with the state of the art for 3D plane clustering on Kinect data.

Clustering

Dual Principal Component Pursuit

no code implementations • 15 Oct 2015 • Manolis C. Tsakiris, Rene Vidal

We consider the problem of learning a linear subspace from data corrupted by outliers.
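
The optimization at the heart of DPCP, with notation assumed here: with the inlier-plus-outlier data stacked as columns of $\tilde{X}$, one looks for a unit vector orthogonal to the inlier subspace by solving $\min_{b} \|\tilde{X}^\top b\|_1$ subject to $\|b\|_2 = 1$, and the subspace is recovered as the orthogonal complement of the normal vectors obtained this way.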

Filtrated Spectral Algebraic Subspace Clustering

no code implementations • 15 Oct 2015 • Manolis C. Tsakiris, Rene Vidal

Algebraic Subspace Clustering (ASC) is a simple and elegant method based on polynomial fitting and differentiation for clustering noiseless data drawn from an arbitrary union of subspaces.
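
A compact sketch of the algebraic step, with notation assumed here: data drawn from a union of $n$ subspaces are fitted by homogeneous degree-$n$ polynomials that vanish on all points, $p(x) = c^\top \nu_n(x)$ with $\nu_n$ the degree-$n$ Veronese embedding; the gradient $\nabla p(x_j)$ at a data point $x_j$ is orthogonal to the subspace containing $x_j$, which is what "polynomial fitting and differentiation" refers to.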

Clustering

Algebraic Clustering of Affine Subspaces

no code implementations • 22 Sep 2015 • Manolis C. Tsakiris, Rene Vidal

Using notions from algebraic geometry, we prove that the homogenization trick, which embeds points in a union of affine subspaces into points in a union of linear subspaces, preserves the general position of the points and the transversality of the union of subspaces in the embedded space, thus establishing the correctness of ASC for affine subspaces.
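
Concretely, the homogenization trick referred to above is the embedding $x \mapsto (x, 1)$, with notation assumed here: a point on a $d$-dimensional affine subspace of $\mathbb{R}^D$ is mapped to a point on a $(d+1)$-dimensional linear subspace of $\mathbb{R}^{D+1}$, after which linear-subspace machinery such as ASC can be applied.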

Clustering • Position

Filtrated Algebraic Subspace Clustering

no code implementations • 20 Jun 2015 • Manolis C. Tsakiris, Rene Vidal

In the abstract form of the problem, where no noise or other corruptions are present, the data are assumed to lie in general position inside the algebraic variety of a union of subspaces, and the objective is to decompose the variety into its constituent subspaces.

Clustering
