no code implementations • 10 Dec 2023 • Luke McDermott, Jason Weitz, Dmitri Demler, Daniel Cummings, Nhan Tran, Javier Duarte
We develop an automated pipeline to streamline neural architecture codesign for fast, real-time Bragg peak analysis in high-energy diffraction microscopy.
no code implementations • 28 Oct 2023 • Luke McDermott, Daniel Cummings
We find that distilled data, a synthetic summary of the real data, paired with Iterative Magnitude Pruning (IMP) reveals a new class of sparse networks that are more stable to SGD noise on the real data than either the dense model or the subnetworks IMP finds with real data.
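The IMP loop referenced above can be illustrated with a minimal numpy sketch. This is not the paper's pipeline: the retraining (and weight-rewinding) step between pruning rounds is only marked with a comment, and the per-round pruning fraction is an assumed hyperparameter.

```python
import numpy as np

rng = np.random.default_rng(0)

def prune_lowest(weights, mask, fraction):
    """Zero out the lowest-magnitude `fraction` of the currently unpruned weights."""
    alive = np.abs(weights[mask])
    k = int(fraction * alive.size)
    if k == 0:
        return mask
    threshold = np.sort(alive)[k - 1]
    # Keep only weights whose magnitude exceeds the round's threshold.
    return mask & (np.abs(weights) > threshold)

def iterative_magnitude_pruning(weights, rounds=5, per_round=0.2):
    """Sketch of IMP: repeatedly prune the smallest remaining weights.
    A real IMP pipeline retrains (optionally after rewinding to early
    weights) between pruning rounds; that step is omitted here."""
    mask = np.ones(weights.shape, dtype=bool)
    for _ in range(rounds):
        # retrain(weights, mask) would run here in a full pipeline
        mask = prune_lowest(weights, mask, per_round)
    return mask

w = rng.normal(size=(100, 100))          # stand-in for one layer's weights
mask = iterative_magnitude_pruning(w)
sparsity = 1.0 - mask.mean()             # roughly 1 - 0.8**5
```

Pruning 20% of the surviving weights per round for five rounds leaves about 33% of the weights, i.e. roughly 67% sparsity.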
no code implementations • 28 Oct 2023 • Jennifer Crawford, Haoli Yin, Luke McDermott, Daniel Cummings
Multimodal Re-Identification (ReID) is a popular retrieval task that aims to re-identify objects across diverse data streams, prompting many researchers to integrate multiple modalities into a unified representation.
no code implementations • 25 Oct 2023 • Haoli Yin, Jiayao Li, Eva Schiller, Luke McDermott, Daniel Cummings
Object Re-Identification (ReID) is pivotal in computer vision, with escalating demand for adept multimodal representation learning.
no code implementations • 28 Aug 2023 • Brad Larson, Bishal Upadhyaya, Luke McDermott, Siddha Ganju
Structured pruning asserts that, while large networks enable us to find solutions to complex computer vision problems, a smaller sub-network can be derived from the large neural network that retains model accuracy while significantly improving computational efficiency.
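The idea of deriving a smaller dense sub-network can be sketched as follows: rank whole output channels of a convolutional layer by an importance score (L1 norm of the filter is an assumed, commonly used choice, not necessarily the paper's criterion) and keep only the strongest ones, so the result is a genuinely smaller layer rather than scattered zeros.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy conv-layer weights: (out_channels, in_channels, kernel_h, kernel_w)
w = rng.normal(size=(16, 8, 3, 3))

def structured_prune(weights, keep_ratio):
    """Keep the output channels with the largest L1 norm, producing a
    smaller dense weight tensor (structured, channel-level pruning)."""
    scores = np.abs(weights).reshape(weights.shape[0], -1).sum(axis=1)
    n_keep = max(1, int(keep_ratio * weights.shape[0]))
    keep = np.sort(np.argsort(scores)[-n_keep:])  # preserve channel order
    return weights[keep]

pruned = structured_prune(w, keep_ratio=0.5)  # 16 -> 8 output channels
```

Because entire channels are removed, the pruned tensor maps directly onto a narrower layer that runs faster on standard hardware, unlike unstructured sparsity.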
1 code implementation • 7 Jul 2023 • Luke McDermott, Daniel Cummings
This work introduces a novel approach to pruning deep learning models by using distilled data.