2 code implementations • 10 Apr 2024 • Xianlu Li, Nicolas Nadisic, Shaoguang Huang, Aleksandra Pižurica
By unfolding iterative optimization methods into neural networks, algorithm unrolling offers enhanced interpretability and reliability compared to purely data-driven deep learning methods, and greater adaptability and generalization than model-based approaches.
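To make the unrolling idea concrete, here is a minimal LISTA-style sketch: each iteration of ISTA (a classic sparse-coding solver) becomes one "layer" with its own threshold parameter. This is a generic illustration of unrolling, not the specific network proposed in the paper; the per-layer thresholds `thetas` stand in for parameters that would be learned from data.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the l1 norm.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def unrolled_ista(A, y, thetas, n_layers=5):
    """Unrolled ISTA sketch: a fixed number of iterations, each treated
    as a network layer with its own threshold theta[k] (learnable in a
    real unrolled network; fixed here for illustration)."""
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for k in range(n_layers):
        grad = A.T @ (A @ x - y)       # gradient of 0.5 * ||Ax - y||^2
        x = soft_threshold(x - grad / L, thetas[k] / L)
    return x
```

In a trained unrolled network, the fixed matrices and thresholds above would be replaced by parameters optimized end-to-end, which is what gives the approach its blend of model-based structure and data-driven flexibility.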
1 code implementation • 18 Dec 2023 • Yuming Qiu, Aleksandra Pižurica, Qi Ming, Nicolas Nadisic
In addition, we built a dedicated dataset, SML2023, containing hundreds of scatter images with different markers and varying levels of overlap severity, on which we evaluated the proposed method and compared it to existing methods.
1 code implementation • 11 Oct 2021 • Nicolas Nadisic, Nicolas Gillis, Christophe Kervazo
More recently, Bhattacharyya and Kannan (ACM-SIAM Symposium on Discrete Algorithms, 2020) proposed an algorithm for learning a latent simplex (ALLS) that relies on the assumption that each vertex has more than one data point nearby.
1 code implementation • 22 Nov 2020 • Nicolas Nadisic, Jeremy E Cohen, Arnaud Vandaele, Nicolas Gillis
In this paper, as opposed to most previous works that enforce sparsity column- or row-wise, we first introduce a novel formulation for sparse MNNLS, with a matrix-wise sparsity constraint.
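To illustrate what a matrix-wise (as opposed to column- or row-wise) sparsity constraint means, here is a hedged sketch: given a budget `q` on the total number of nonzeros shared across the whole solution matrix, a naive heuristic solves each nonnegative least-squares subproblem, then keeps only the `q` largest entries matrix-wise. This greedy select-then-truncate scheme is an illustration of the constraint only, not the algorithm proposed in the paper; `nnls_pg` is a simple projected-gradient solver introduced here to keep the sketch self-contained.

```python
import numpy as np

def nnls_pg(A, b, n_iter=500):
    """Nonnegative least squares via projected gradient (basic solver)."""
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = np.maximum(x - A.T @ (A @ x - b) / L, 0.0)
    return x

def matrix_wise_sparse_mnnls(A, B, q):
    """Sketch of matrix-wise sparse MNNLS: approximately minimize
    ||AX - B||_F with X >= 0 and at most q nonzeros in X IN TOTAL,
    rather than a per-column or per-row limit. Greedy heuristic,
    not the paper's method."""
    X = np.column_stack([nnls_pg(A, B[:, j]) for j in range(B.shape[1])])
    # Keep the q largest entries of X, chosen matrix-wise; zero the rest.
    keep = np.argsort(X, axis=None)[::-1][:q]
    mask = np.zeros(X.size, dtype=bool)
    mask[keep] = True
    return np.where(mask.reshape(X.shape), X, 0.0)
```

The point of the matrix-wise formulation is that the sparsity budget can be distributed unevenly: columns that genuinely need more active atoms can take them from columns that need fewer, which a fixed per-column constraint cannot do.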
1 code implementation • 13 Jun 2020 • Nicolas Nadisic, Arnaud Vandaele, Jeremy E. Cohen, Nicolas Gillis
We propose a new variant of nonnegative matrix factorization (NMF), combining separability and sparsity assumptions.
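As background for the separability assumption, here is a standard sketch of the Successive Projection Algorithm (SPA), which under separability greedily identifies the columns of the data matrix that act as vertices. SPA is a well-known generic tool, shown here only to illustrate the separability side; the paper combines separability with additional sparsity assumptions, which this sketch does not implement.

```python
import numpy as np

def spa(M, r):
    """Successive Projection Algorithm: under separability, every column
    of M is a convex combination of r 'vertex' columns; SPA recovers
    their indices by repeatedly picking the column with largest residual
    norm and projecting it out."""
    R = M.astype(float).copy()
    indices = []
    for _ in range(r):
        j = int(np.argmax(np.sum(R ** 2, axis=0)))  # largest residual column
        indices.append(j)
        u = R[:, j] / np.linalg.norm(R[:, j])
        R = R - np.outer(u, u @ R)                  # project out direction u
    return indices
```

Given a separable nonnegative matrix, the indices returned by `spa` identify the columns from which the remaining columns can be reconstructed, which is the starting point that sparsity assumptions then refine.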