Search Results for author: Matthew Muckley

Found 4 papers, 3 papers with code

Residual Quantization with Implicit Neural Codebooks

1 code implementation · 26 Jan 2024 · Iris Huijben, Matthijs Douze, Matthew Muckley, Ruud Van Sloun, Jakob Verbeek

In this paper, we propose QINCo, a neural RQ variant which predicts specialized codebooks per vector using a neural network that is conditioned on the approximation of the vector from previous steps.

Data Compression · Quantization
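The QINCo idea above can be illustrated with a toy residual-quantization sketch. This is not the paper's implementation: the neural codebook predictor is stood in for by a random linear map of the running approximation, and all sizes and names (`implicit_codebook`, `cond_weights`, `D`, `K`, `M`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

D, K, M = 4, 16, 3  # vector dim, codebook size, number of RQ steps

# Base codebooks (one per step). In QINCo the per-step codebook is predicted
# by a neural network conditioned on the approximation from previous steps;
# here a random linear map (cond_weights) stands in for that network.
base_codebooks = rng.normal(size=(M, K, D))
cond_weights = rng.normal(scale=0.1, size=(M, D, K * D))

def implicit_codebook(step, x_hat):
    """Toy stand-in for the codebook predictor: base codebook plus a
    linear function of the current approximation x_hat (hypothetical)."""
    delta = (x_hat @ cond_weights[step]).reshape(K, D)
    return base_codebooks[step] + delta

def encode(x):
    """Greedy residual quantization with step-specialized codebooks."""
    x_hat = np.zeros(D)
    codes = []
    for m in range(M):
        cb = implicit_codebook(m, x_hat)
        residual = x - x_hat
        idx = int(np.argmin(((cb - residual) ** 2).sum(axis=1)))
        codes.append(idx)
        x_hat = x_hat + cb[idx]
    return codes, x_hat

def decode(codes):
    """Decoder rebuilds the same codebooks from the shared running
    approximation, so the codes alone suffice for reconstruction."""
    x_hat = np.zeros(D)
    for m, idx in enumerate(codes):
        x_hat = x_hat + implicit_codebook(m, x_hat)[idx]
    return x_hat

x = rng.normal(size=D)
codes, x_hat = encode(x)
assert np.allclose(decode(codes), x_hat)
```

The key property the sketch demonstrates is that the codebooks never need to be stored or transmitted per vector: both encoder and decoder derive them from the same running approximation, so the bitstream is just the indices, as in plain RQ.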

Latent Discretization for Continuous-time Sequence Compression

no code implementations · 28 Dec 2022 · Ricky T. Q. Chen, Matthew Le, Matthew Muckley, Maximilian Nickel, Karen Ullrich

We empirically verify our approach on multiple domains involving compression of video and motion capture sequences, showing that our approaches can automatically achieve reductions in bit rates by learning how to discretize.

On learning adaptive acquisition policies for undersampled multi-coil MRI reconstruction

1 code implementation · 30 Mar 2022 · Tim Bakker, Matthew Muckley, Adriana Romero-Soriano, Michal Drozdzal, Luis Pineda

Most current approaches to undersampled multi-coil MRI reconstruction focus on learning the reconstruction model for a fixed, equidistant acquisition trajectory.

MRI Reconstruction · SSIM

COVID-19 Prognosis via Self-Supervised Representation Learning and Multi-Image Prediction

1 code implementation · 13 Jan 2021 · Anuroop Sriram, Matthew Muckley, Koustuv Sinha, Farah Shamout, Joelle Pineau, Krzysztof J. Geras, Lea Azour, Yindalon Aphinyanaphongs, Nafissa Yakubova, William Moore

The first is deterioration prediction from a single image, where our model achieves an area under the receiver operating characteristic curve (AUC) of 0.742 for predicting an adverse event within 96 hours (compared to 0.703 with supervised pretraining) and an AUC of 0.765 for predicting oxygen requirements greater than 6 L a day at 24 hours (compared to 0.749 with supervised pretraining).

Representation Learning · Self-Supervised Learning
