Search Results for author: Joshua Yurtsever

Found 3 papers, 0 papers with code

Reducing Retraining by Recycling Parameter-Efficient Prompts

no code implementations · 10 Aug 2022 · Brian Lester, Joshua Yurtsever, Siamak Shakeri, Noah Constant

Parameter-efficient methods can use a single frozen pre-trained large language model (LLM) to perform many tasks by learning task-specific soft prompts that modulate model behavior when concatenated to the input text.
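The idea above can be sketched in a few lines: the model weights stay frozen, and only a small block of prompt embeddings per task is trainable and gets concatenated in front of the input embeddings. This is a minimal illustrative sketch, not the paper's code; the "model" here is a stand-in matrix, and all names and dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, prompt_len, seq_len = 16, 4, 6

# Frozen pre-trained weights (toy stand-in for the LLM; hypothetical).
W_frozen = rng.normal(size=(d_model, d_model))

# Task-specific soft prompt: the ONLY trainable parameters for this task.
soft_prompt = rng.normal(scale=0.01, size=(prompt_len, d_model))

def forward(input_embeds, prompt):
    # Concatenate the soft prompt in front of the input embeddings,
    # then run the frozen transformation unchanged.
    full = np.concatenate([prompt, input_embeds], axis=0)
    return full @ W_frozen

x = rng.normal(size=(seq_len, d_model))   # embedded input text
out = forward(x, soft_prompt)
print(out.shape)                          # (prompt_len + seq_len, d_model)
```

Swapping tasks then means swapping a small `(prompt_len, d_model)` array rather than retraining or storing a second copy of the model.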

Language Modelling · Large Language Model

Unrolled, model-based networks for lensless imaging

no code implementations · NeurIPS Deep Inverse Workshop 2019 · Kristina Monakhova, Joshua Yurtsever, Grace Kuo, Nick Antipa, Kyrollos Yanny, Laura Waller

Various reconstruction methods are explored, spanning a spectrum from classic iterative approaches (based on the physical imaging model) to deep learned methods with many learned parameters.
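The middle of that spectrum is the unrolled, model-based network: a classic iterative solver truncated to a fixed number of iterations, with each iteration treated as a network layer whose hyperparameters (e.g. step sizes) become learnable. The sketch below unrolls plain gradient descent on a least-squares data term; it is a hedged illustration with a toy forward model, not the paper's architecture, and the learned step sizes are simply initialized rather than trained here.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8
A = rng.normal(size=(n, n))            # toy stand-in for the imaging model
x_true = rng.normal(size=n)
b = A @ x_true                         # simulated measurement

n_layers = 20
step_sizes = np.full(n_layers, 1e-3)   # per-layer parameters; would be
                                       # learned by backprop in practice

x = np.zeros(n)
for t in range(n_layers):
    grad = A.T @ (A @ x - b)           # physics-based gradient of ||Ax - b||^2
    x = x - step_sizes[t] * grad       # layer t of the unrolled network

# After unrolling, the data-fit residual is below the initial residual ||b||.
print(np.linalg.norm(A @ x - b) < np.linalg.norm(b))
```

Because the iteration count is fixed at `n_layers`, the compute cost per reconstruction is bounded and known in advance, unlike a run-to-convergence iterative solver.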

Learned reconstructions for practical mask-based lensless imaging

no code implementations · 30 Aug 2019 · Kristina Monakhova, Joshua Yurtsever, Grace Kuo, Nick Antipa, Kyrollos Yanny, Laura Waller

In this work, we address these limitations using a bounded-compute, trainable neural network to reconstruct the image.

Rolling Shutter Correction
