Search Results for author: Vidushi Vashishth

Found 2 papers, 1 paper with code

DynaQuant: Compressing Deep Learning Training Checkpoints via Dynamic Quantization

no code implementations • 20 Jun 2023 • Amey Agrawal, Sameer Reddy, Satwik Bhattamishra, Venkata Prabhakara Sarath Nookala, Vidushi Vashishth, Kexin Rong, Alexey Tumanov

As Deep Learning (DL) training workloads grow in compute resources and time consumption, the likelihood of encountering in-training failures rises substantially, leading to lost work and wasted resources.

Tasks: Model Compression, Quantization, +1
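The excerpt above only covers the paper's motivation. As a rough illustration of the general idea of quantizing checkpoint tensors (a minimal sketch, not DynaQuant's actual dynamic scheme; the function names, 8-bit setting, and NumPy-based checkpoint dictionary are assumptions for illustration):

```python
import numpy as np

def quantize_tensor(t: np.ndarray, num_bits: int = 8):
    """Uniform symmetric quantization of one checkpoint tensor.

    Returns an int8 representation plus the per-tensor scale needed
    to reconstruct an approximation of the original values.
    NOTE: illustrative only; not the scheme proposed in DynaQuant.
    """
    qmax = 2 ** (num_bits - 1) - 1          # e.g. 127 for int8
    scale = float(np.abs(t).max()) / qmax   # per-tensor dynamic range
    if scale == 0.0:
        scale = 1.0                          # avoid division by zero for all-zero tensors
    q = np.clip(np.round(t / scale), -qmax, qmax).astype(np.int8)
    return q, scale

def dequantize_tensor(q: np.ndarray, scale: float) -> np.ndarray:
    """Map the int8 values back to an approximate float32 tensor."""
    return q.astype(np.float32) * scale

# Example: compress a toy "checkpoint" (a dict of weight arrays).
checkpoint = {"layer1.weight": np.random.randn(256, 128).astype(np.float32)}
compressed = {k: quantize_tensor(v) for k, v in checkpoint.items()}
restored = {k: dequantize_tensor(q, s) for k, (q, s) in compressed.items()}
```

Storing the int8 values and one scale per tensor cuts the checkpoint size roughly 4x relative to float32; the paper's contribution lies in choosing quantization configurations dynamically so that training can resume from the compressed checkpoints without degradation.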

Beyond Prompts: Exploring the Design Space of Mixed-Initiative Co-Creativity Systems

1 code implementation • 3 May 2023 • Zhiyu Lin, Upol Ehsan, Rohan Agarwal, Samihan Dani, Vidushi Vashishth, Mark Riedl

We find that MI-CC systems with more extensive coverage of the design space are rated higher or on par on a variety of creative and goal-completion metrics, demonstrating that wider coverage of the design space can improve user experience and achievement; that preference varies greatly between expertise groups, suggesting the need for adaptive, personalized MI-CC systems; and that participants identified new design space dimensions, including scrutability (the ability to poke and prod at models) and explainability.
