no code implementations • 17 Feb 2025 • Tian Jin, Ellie Y. Cheng, Zack Ankner, Nikunj Saunshi, Blake M. Elias, Amir Yazdanbakhsh, Jonathan Ragan-Kelley, Suvinay Subramanian, Michael Carbin
We present PASTA, a learning-based system that teaches LLMs to identify semantic independence and express parallel decoding opportunities in their own responses.
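The core idea — decoding semantically independent parts of a response concurrently rather than strictly left-to-right — can be sketched as follows. This is an illustrative toy, not PASTA's actual implementation: the stand-in `decode_segment` function and the pre-annotated span list are assumptions; in PASTA the LLM itself learns to mark independent spans in its own output.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for per-span decoding; a real system would run
# autoregressive generation for each independent span.
def decode_segment(span: str) -> str:
    return span.upper()

# Spans the model has annotated as semantically independent (illustrative).
independent_spans = ["list the pros", "list the cons", "give a verdict"]

# Because the spans are independent, they can be decoded concurrently
# and joined back in their original order.
with ThreadPoolExecutor() as pool:
    results = list(pool.map(decode_segment, independent_spans))

print(results)
```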
no code implementations • 21 Aug 2024 • Ellie Y. Cheng, Eric Atkinson, Guillaume Baudart, Louis Mandel, Michael Carbin
In this work, we present inference plans, a programming interface that enables developers to control the partitioning of random variables during hybrid particle filtering.
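A minimal sketch of what such an interface might look like, under loose assumptions: here an "inference plan" is just a mapping from each random variable to how the hybrid filter should treat it — `"sample"` (represent by particles) or `"exact"` (keep a closed-form distribution, as in Rao-Blackwellized filtering). The names `plan` and `handle_variable` are hypothetical; the paper's actual interface may differ.

```python
import random

# Developer-specified partitioning of random variables (illustrative).
plan = {"position": "sample", "noise_scale": "exact"}

def handle_variable(name: str, plan: dict):
    """Treat a variable per the plan: draw a particle value, or keep it symbolic."""
    if plan[name] == "sample":
        return ("particle", random.gauss(0.0, 1.0))  # concrete sampled value
    # Exact branch: carry the distribution itself instead of a sample.
    return ("symbolic", {"dist": "gauss", "mu": 0.0, "sigma": 1.0})

state = {name: handle_variable(name, plan) for name in plan}
print(state["position"][0], state["noise_scale"][0])
```

The point of exposing this choice is that moving a variable between the sampled and exact partitions trades variance against tractability, and the best split is often application-specific.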
no code implementations • 19 Oct 2021 • Ellie Y. Cheng, Todd Millstein, Guy Van Den Broeck, Steven Holtzen
Many of today's probabilistic programming languages (PPLs) have brittle inference performance: the performance of the underlying inference algorithm is very sensitive to the precise way in which the probabilistic program is written.
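The brittleness claim can be made concrete with a toy example (a sketch of the general phenomenon, not the paper's system): two programs that denote the same distribution — the number of heads in n fair coin flips — but whose natural inference strategies differ exponentially in cost. A backend that misses the independence structure enumerates all 2^n worlds; a structure-aware one factors the computation.

```python
from itertools import product

def p_heads_naive(n: int, k: int) -> float:
    """Enumerate all 2**n coin-flip worlds and count those with k heads."""
    return sum(1 for world in product([0, 1], repeat=n) if sum(world) == k) / 2**n

def p_heads_factored(n: int, k: int) -> float:
    """Exploit independence: propagate P(j heads so far) one coin at a time."""
    probs = [1.0]  # probs[j] = P(j heads after the coins seen so far)
    for _ in range(n):
        probs = [0.5 * (probs[j] if j < len(probs) else 0.0)
                 + 0.5 * (probs[j - 1] if j > 0 else 0.0)
                 for j in range(len(probs) + 1)]
    return probs[k]

# Same distribution, vastly different cost: O(2**n) vs O(n**2) work.
print(p_heads_naive(10, 5), p_heads_factored(10, 5))
```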