Search Results for author: Alex Dimakis

Found 8 papers, 2 papers with code

SVFT: Parameter-Efficient Fine-Tuning with Singular Vectors

1 code implementation · 30 May 2024 · Vijay Lingam, Atula Tejaswi, Aditya Vavre, Aneesh Shetty, Gautham Krishna Gudur, Joydeep Ghosh, Alex Dimakis, Eunsol Choi, Aleksandar Bojchevski, Sujay Sanghavi

Extensive experiments on language and vision benchmarks show that SVFT recovers up to 96% of full fine-tuning performance while training only 0.006% to 0.25% of the parameters, outperforming existing methods that recover only up to 85% of the performance while using 0.03% to 0.8% of the trainable parameter budget.

Parameter-Efficient Fine-Tuning
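For intuition, a minimal PyTorch sketch of the idea the title points to, not the authors' implementation: freeze a pretrained weight, compute its SVD once, and train only one coefficient per singular pair (the SingularVectorAdapter class and the purely diagonal update below are illustrative assumptions).

    # Illustrative sketch only; the effective weight is W + U diag(delta) V^T,
    # where only `delta` is trained and all SVD factors stay frozen.
    import torch
    import torch.nn as nn

    class SingularVectorAdapter(nn.Module):
        def __init__(self, weight: torch.Tensor):
            super().__init__()
            U, S, Vh = torch.linalg.svd(weight.detach(), full_matrices=False)
            self.register_buffer("weight", weight.detach())  # frozen pretrained weight
            self.register_buffer("U", U)                     # frozen left singular vectors
            self.register_buffer("Vh", Vh)                   # frozen right singular vectors
            self.delta = nn.Parameter(torch.zeros_like(S))   # the only trainable parameters

        def forward(self, x):
            update = (self.U * self.delta) @ self.Vh         # U diag(delta) V^T
            return nn.functional.linear(x, self.weight + update)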

Put Myself in Your Shoes: Lifting the Egocentric Perspective from Exocentric Videos

no code implementations · 11 Mar 2024 · Mi Luo, Zihui Xue, Alex Dimakis, Kristen Grauman

We investigate exocentric-to-egocentric cross-view translation, which aims to generate a first-person (egocentric) view of an actor based on a video recording that captures the actor from a third-person (exocentric) perspective.

Hallucination, Translation

Robust Compressed Sensing MR Imaging with Deep Generative Priors

no code implementations · NeurIPS 2021 Workshop on Deep Learning and Inverse Problems · Ajil Jalal, Marius Arvinte, Giannis Daras, Eric Price, Alex Dimakis, Jonathan Tamir

The CSGM framework (Bora-Jalal-Price-Dimakis'17) has shown that deep generative priors can be powerful tools for solving inverse problems.
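For context, a minimal sketch of the CSGM recovery rule from the cited Bora-Jalal-Price-Dimakis '17 framework, not this paper's MRI pipeline: search the generator's latent space for the code whose output best explains the measurements (G, A, y, and the optimizer settings are placeholders).

    # Sketch of CSGM-style recovery: minimize ||A G(z) - y||^2 over the latent z.
    import torch

    def csgm_recover(G, A, y, latent_dim, steps=1000, lr=1e-2):
        z = torch.randn(1, latent_dim, requires_grad=True)
        opt = torch.optim.Adam([z], lr=lr)
        for _ in range(steps):
            opt.zero_grad()
            residual = A @ G(z).flatten() - y     # measurement mismatch
            loss = (residual ** 2).sum()          # ||A G(z) - y||^2
            loss.backward()
            opt.step()
        return G(z).detach()                      # reconstructed signal G(z*)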

Compressed Sensing with Invertible Generative Models and Dependent Noise

no code implementations · 23 Oct 2020 · Jay Whang, Qi Lei, Alex Dimakis

We study image inverse problems with invertible generative priors, specifically normalizing flow models.

Denoising
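A hedged sketch of how an invertible prior is commonly used for such inverse problems, assuming a flow object that exposes an exact log_prob via the change-of-variables formula; the function and interface below are assumptions, not the paper's exact objective or noise model.

    # Sketch: MAP-style recovery that trades measurement fidelity against the
    # flow's exact log-likelihood. `flow.log_prob(x)` is an assumed interface.
    import torch

    def flow_map_recover(flow, A, y, x_init, steps=500, lr=1e-2, lam=1.0):
        x = x_init.clone().requires_grad_(True)
        opt = torch.optim.Adam([x], lr=lr)
        for _ in range(steps):
            opt.zero_grad()
            fidelity = ((A @ x.flatten() - y) ** 2).sum()  # data-consistency term
            prior = flow.log_prob(x).sum()                 # exact log-likelihood under the flow
            loss = fidelity - lam * prior
            loss.backward()
            opt.step()
        return x.detach()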

Compressed Sensing with Approximate Priors via Conditional Resampling

no code implementations · 23 Oct 2020 · Ajil Jalal, Sushrut Karmalkar, Alex Dimakis, Eric Price

We characterize the measurement complexity of compressed sensing of signals drawn from a known prior distribution, even when the support of the prior is the entire space (rather than, say, sparse vectors).

Diversity
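As the title suggests, the recovery rule here is conditional resampling, i.e. posterior sampling: given measurements y = Ax + η, output a fresh draw from the prior conditioned on those measurements rather than a point estimate,

    \[
    \hat{x} \sim P\bigl(x \mid A x + \eta = y\bigr).
    \]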

Approximate Probabilistic Inference with Composed Flows

no code implementations · 28 Sep 2020 · Jay Whang, Erik Lindgren, Alex Dimakis

We study the problem of probabilistic inference on the joint distribution defined by a normalizing flow model.

Variational Inference
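A hedged sketch of the variational-inference angle flagged by the tag above: fit a second proposal flow over the unobserved variables so that, stitched together with the observed ones, it matches the pretrained flow's joint density. The prior_flow.log_prob and q_flow.sample_with_log_prob interfaces are assumptions, not calls from a specific library.

    # Sketch of one training step of variational conditional inference under a
    # pretrained flow prior; maximizing the ELBO pushes q toward p(x_miss | x_obs).
    import torch

    def fit_conditional_step(prior_flow, q_flow, x_obs, optimizer, n_samples=16):
        # Reparameterized samples of the unobserved block and their densities under q.
        x_miss, log_q = q_flow.sample_with_log_prob(n_samples)
        # Stitch observed and sampled parts into full inputs for the prior flow.
        x_full = torch.cat([x_obs.expand(n_samples, -1), x_miss], dim=-1)
        # ELBO: E_q[ log p(x_obs, x_miss) - log q(x_miss) ].
        elbo = (prior_flow.log_prob(x_full) - log_q).mean()
        loss = -elbo
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return elbo.item()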
