no code implementations • 26 Apr 2018 • Greg Ongie, Daniel Pimentel-Alarcón, Laura Balzano, Rebecca Willett, Robert D. Nowak
This approach will succeed in many cases where traditional low-rank matrix completion (LRMC) is guaranteed to fail, because the data are low-rank in the tensorized representation but not in the original representation.
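To illustrate the tensorized low-rank idea, here is a minimal sketch (my own toy example, not code from the paper): points on the unit circle are full-rank in the ambient coordinates, but lifting them to degree-2 monomial features makes the circle equation x² + y² − 1 = 0 show up as a linear dependence, so the lifted matrix is rank-deficient.

```python
import numpy as np

# Sample points on the unit circle, an algebraic variety in R^2.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 50)
X = np.vstack([np.cos(theta), np.sin(theta)])  # 2 x 50 data matrix

# Original representation: the matrix is full rank, so plain LRMC
# assumptions do not hold.
print(np.linalg.matrix_rank(X))  # rank 2

# Tensorized representation: all monomials of degree <= 2.
x, y = X
Phi = np.vstack([x**2, x * y, y**2, x, y, np.ones_like(x)])  # 6 x 50

# The relation x^2 + y^2 - 1 = 0 is a linear dependence among the
# rows, so the lifted matrix loses one rank.
print(np.linalg.matrix_rank(Phi))  # rank 5
```

The same pattern holds for general algebraic varieties: data that are high-rank in the original coordinates can become low-rank after a polynomial feature lift, which is what the tensorized representation exploits.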
no code implementations • 16 Feb 2022 • Heguang Lin, Mengze Li, Daniel Pimentel-Alarcón, Matthew Malloy
Prior work showed that minimum-volume confidence sets are the level sets of a discontinuous function defined by an exact p-value.
no code implementations • 22 May 2022 • Usman Mahmood, Daniel Pimentel-Alarcón
This paper introduces fusion subspace clustering, a novel method to learn low-dimensional structures that approximate large-scale yet highly incomplete data.
no code implementations • 27 Mar 2024 • Huanran Li, Daniel Pimentel-Alarcón
This study focuses on addressing the instability issues prevalent in contrastive learning, specifically examining the InfoNCE loss function and its derivatives.
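For context, a standard formulation of the InfoNCE loss (the generic contrastive loss the snippet refers to, not the authors' proposed modification) can be sketched for a single anchor with one positive and several negatives:

```python
import numpy as np

def info_nce(anchor, positive, negatives, tau=0.1):
    """Generic InfoNCE loss for one anchor, using cosine similarity.

    tau is the temperature; negatives is a (K, d) array.
    """
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Logit 0 is the positive pair; the remaining logits are negatives.
    logits = np.array([cos(anchor, positive)]
                      + [cos(anchor, n) for n in negatives]) / tau
    # Numerically stable negative log-softmax of the positive logit.
    m = logits.max()
    return -(logits[0] - m - np.log(np.exp(logits - m).sum()))

# Toy check: a perfectly aligned positive with orthogonal negatives
# yields a loss near zero.
a = np.array([1.0, 0.0])
negs = np.array([[0.0, 1.0], [0.0, -1.0]])
print(info_nce(a, a, negs, tau=0.1))
```

Note how the temperature τ divides all logits before the softmax: shrinking τ sharpens the distribution and amplifies gradient magnitudes, which is one commonly cited source of the training instability the study examines.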
no code implementations • 27 Mar 2024 • Huanran Li, Daniel Pimentel-Alarcón
This paper proposes a novel framework, TransFusion, designed to make the process of contrastive learning more analytical and explainable.