Search Results for author: David Broman

Found 7 papers, 5 papers with code

Learning Formal Mathematics From Intrinsic Motivation

2 code implementations • 30 Jun 2024 • Gabriel Poesia, David Broman, Nick Haber, Noah D. Goodman

We propose novel methods for hindsight relabeling on proof search trees to significantly improve the agent's sample efficiency in both tasks, conjecturing and theorem proving.

Automated Theorem Proving • Language Modelling • +1
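A minimal sketch of the hindsight-relabeling idea mentioned in the excerpt above, under an assumed proof-search-tree representation (the `ProofNode` class, its fields, and `hindsight_relabel` are illustrative names, not the paper's code): any subgoal that was closed during an otherwise unsuccessful search is relabeled as a goal that was successfully proved and turned into a training example.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class ProofNode:
    """One node in a proof search tree (hypothetical representation)."""
    goal: str                       # statement the prover tried to prove here
    tactic: Optional[str] = None    # tactic applied at this node, if any
    closed: bool = False            # True if this subgoal was fully proved
    children: List["ProofNode"] = field(default_factory=list)

def extract_proof(node: ProofNode) -> List[Tuple[str, Optional[str]]]:
    """Collect the (goal, tactic) steps of the subtree rooted at a closed node."""
    steps = [(node.goal, node.tactic)]
    for child in node.children:
        steps.extend(extract_proof(child))
    return steps

def hindsight_relabel(root: ProofNode) -> List[Tuple[str, list]]:
    """Turn every closed subgoal in the tree into a (goal, proof) training
    example, even if the original target goal was never proved."""
    examples = []
    stack = [root]
    while stack:
        node = stack.pop()
        if node.closed:
            # Relabel: treat this subgoal as if it had been the intended goal.
            examples.append((node.goal, extract_proof(node)))
        stack.extend(node.children)
    return examples
```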

Optimizing Instructions and Demonstrations for Multi-Stage Language Model Programs

1 code implementation • 17 Jun 2024 • Krista Opsahl-Ong, Michael J Ryan, Josh Purtell, David Broman, Christopher Potts, Matei Zaharia, Omar Khattab

To make this tractable, we factorize our problem into optimizing the free-form instructions and few-shot demonstrations of every module and introduce several strategies to craft task-grounded instructions and navigate credit assignment across modules.

Language Modelling • Navigate
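A rough sketch of the factorized search described in the excerpt above, not the paper's actual optimizer or any library API: each module of a multi-stage program gets its own pool of candidate instructions and few-shot demonstration sets, and a simple random search (standing in for the paper's proposal and surrogate-model strategies) scores full-program configurations end to end, so credit is assigned to per-module choices only through the final metric.

```python
import random
from typing import Callable, Dict, List, Tuple

def optimize_program(
    modules: Dict[str, Dict[str, list]],                  # per-module candidates
    run_program: Callable[[Dict[str, Tuple[str, list]], dict], str],
    metric: Callable[[str, dict], float],
    valset: List[dict],
    num_trials: int = 50,
    seed: int = 0,
) -> Dict[str, Tuple[str, list]]:
    """Pick one instruction and one demo set per module by scoring the whole
    program on a validation set (hypothetical helper, for illustration only)."""
    rng = random.Random(seed)
    best_config, best_score = None, float("-inf")
    for _ in range(num_trials):
        # Sample a full configuration: one (instruction, demos) pair per module.
        config = {
            name: (rng.choice(c["instructions"]), rng.choice(c["demo_sets"]))
            for name, c in modules.items()
        }
        score = sum(metric(run_program(config, ex), ex) for ex in valset) / len(valset)
        if score > best_score:
            best_config, best_score = config, score
    return best_config
```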

Scorch: A Library for Sparse Deep Learning

no code implementations • 27 May 2024 • Bobby Yan, Alexander J. Root, Trevor Gale, David Broman, Fredrik Kjolstad

To bridge this gap, we introduce Scorch, a library that seamlessly integrates efficient sparse tensor computation into the PyTorch ecosystem, with an initial focus on inference workloads on CPUs.

Deep Learning
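Scorch's own API is not shown in this listing, so the sketch below only illustrates the kind of sparse CPU inference workload the excerpt describes, using PyTorch's built-in sparse tensors (torch.sparse_coo_tensor, torch.sparse.mm) rather than Scorch itself.

```python
import torch

# A sparse weight matrix in COO format: only nonzero coordinates and values
# are stored, which is the storage regime a sparse library is built around.
indices = torch.tensor([[0, 1, 2], [2, 0, 1]])   # (row, col) of the nonzeros
values = torch.tensor([0.5, -1.0, 2.0])
weight = torch.sparse_coo_tensor(indices, values, size=(3, 3))

# Dense input batch (features x batch).
x = torch.randn(3, 4)

# Sparse-dense matrix multiply: only nonzero weights contribute to the result.
with torch.no_grad():            # inference-only workload, as in the excerpt
    y = torch.sparse.mm(weight, x)
print(y.shape)                    # torch.Size([3, 4])
```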

Exact Worst-Case Execution-Time Analysis for Implicit Model Predictive Control

no code implementations • 23 Apr 2023 • Daniel Arnström, David Broman, Daniel Axehill

For such solvers, we leverage a previously proposed complexity certification framework to generate a finite set of archetypal optimization problems. We prove that these archetypal problems form an execution-time-equivalent cover of all possible problems; that is, they capture the execution time for solving any optimization problem that can be encountered online.

Model Predictive Control
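A toy sketch of how such a cover is used once it exists; the two "archetypal" QPs, the projected-gradient solver, and the per-iteration cycle cost below are all placeholders for illustration, not the paper's certification framework or certified solver. Each archetypal problem is solved with an iteration-counted solver, and the worst case yields a bound that, by the cover property, holds for every problem the controller can face online.

```python
import numpy as np

def solve_box_qp(H, f, lb, ub, tol=1e-8, max_iter=10_000):
    """Projected gradient for min 0.5 x'Hx + f'x s.t. lb <= x <= ub.
    Returns (solution, iteration count); a stand-in for a certified QP solver."""
    x = np.clip(np.zeros_like(f), lb, ub)
    step = 1.0 / np.linalg.norm(H, 2)              # 1/L step size
    for k in range(1, max_iter + 1):
        x_new = np.clip(x - step * (H @ x + f), lb, ub)
        if np.linalg.norm(x_new - x) < tol:
            return x_new, k
        x = x_new
    return x, max_iter

# Placeholder archetypal problems standing in for the certified finite cover.
archetypes = [
    (np.array([[2.0, 0.0], [0.0, 1.0]]), np.array([-1.0, 1.0])),
    (np.array([[4.0, 1.0], [1.0, 3.0]]), np.array([0.5, -2.0])),
]
lb, ub = np.array([-1.0, -1.0]), np.array([1.0, 1.0])

worst_iters = max(solve_box_qp(H, f, lb, ub)[1] for H, f in archetypes)
CYCLES_PER_ITER = 1200   # hypothetical per-iteration cost on the target CPU
print("WCET bound:", worst_iters * CYCLES_PER_ITER, "cycles")
```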

Delayed Sampling and Automatic Rao-Blackwellization of Probabilistic Programs

5 code implementations • 25 Aug 2017 • Lawrence M. Murray, Daniel Lundén, Jan Kudlicka, David Broman, Thomas B. Schön

For inference with Sequential Monte Carlo, this automatically yields improvements such as locally-optimal proposals and Rao-Blackwellization.

Probabilistic Programming
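A hand-worked sketch of what the Rao-Blackwellization in the excerpt buys in the simplest conjugate case (a Gaussian prior with a Gaussian observation), written as plain Python rather than the paper's probabilistic-programming implementation: instead of sampling the latent eagerly and weighting by the observation likelihood, the latent is kept marginalized, the particle is weighted by the exact marginal likelihood, and the latent is drawn from its analytic posterior only if a later part of the program forces it.

```python
import math
import random

def eager(y, mu0=0.0, s0=1.0, s_obs=1.0):
    """Eager SMC-style step: sample x from the prior, weight by p(y | x)."""
    x = random.gauss(mu0, s0)
    log_w = -0.5 * ((y - x) / s_obs) ** 2 - math.log(s_obs * math.sqrt(2 * math.pi))
    return x, log_w

def delayed(y, mu0=0.0, s0=1.0, s_obs=1.0):
    """Delayed step: keep x marginalized, weight by the exact marginal
    p(y) = N(y; mu0, s0^2 + s_obs^2), and sample x from p(x | y) only when
    (and if) its concrete value is actually needed downstream."""
    var_y = s0 ** 2 + s_obs ** 2
    log_w = -0.5 * (y - mu0) ** 2 / var_y - 0.5 * math.log(2 * math.pi * var_y)

    # Analytic posterior p(x | y): the Rao-Blackwellized, locally-optimal draw.
    post_var = 1.0 / (1.0 / s0 ** 2 + 1.0 / s_obs ** 2)
    post_mean = post_var * (mu0 / s0 ** 2 + y / s_obs ** 2)

    def sample_x():
        return random.gauss(post_mean, math.sqrt(post_var))

    return sample_x, log_w
```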

Co-simulation: State of the art

2 code implementations • 1 Feb 2017 • Cláudio Gomes, Casper Thule, David Broman, Peter Gorm Larsen, Hans Vangheluwe

It is essential to find new ways of enabling experts in different disciplines to collaborate more efficiently in the development of ever more complex systems, under increasing market pressures.

Systems and Control • 65Y10 • I.6.1; I.6.7
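The excerpt above is motivational, but a minimal example may help pin down what "co-simulation" means here: a master algorithm advances several black-box simulation units in lockstep and exchanges their outputs at fixed communication points. The two-unit, fixed-step Jacobi-style master below is a generic illustration, not an algorithm taken from the survey.

```python
class FirstOrderUnit:
    """Toy black-box simulation unit: dx/dt = a*x + b*u, explicit Euler inside."""
    def __init__(self, a, b, x0):
        self.a, self.b, self.x, self.u = a, b, x0, 0.0
    def set_input(self, u):
        self.u = u
    def output(self):
        return self.x
    def step(self, t, h):
        self.x += h * (self.a * self.x + self.b * self.u)

def cosimulate(sim_a, sim_b, t_end, h):
    """Fixed-step Jacobi master: exchange outputs at each communication point,
    then let both units advance by h with those inputs held constant."""
    t = 0.0
    trace = [(t, sim_a.output(), sim_b.output())]
    while t < t_end:
        y_a, y_b = sim_a.output(), sim_b.output()
        sim_a.set_input(y_b)      # coupling: each unit is driven by the other
        sim_b.set_input(y_a)
        sim_a.step(t, h)
        sim_b.step(t, h)
        t += h
        trace.append((t, sim_a.output(), sim_b.output()))
    return trace

# Two coupled first-order units, simulated to t = 1.0 with step 0.1.
trace = cosimulate(FirstOrderUnit(-1.0, 0.5, 1.0),
                   FirstOrderUnit(-2.0, 1.0, 0.0), 1.0, 0.1)
print(trace[-1])
```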

Sparse Partially Collapsed MCMC for Parallel Inference in Topic Models

1 code implementation • 11 Jun 2015 • Måns Magnusson, Leif Jonsson, Mattias Villani, David Broman

We propose a parallel sparse partially collapsed Gibbs sampler and compare its speed and efficiency to state-of-the-art samplers for topic models on five well-known text corpora of differing sizes and properties.

Topic Models
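A compact sketch of the partially collapsed idea behind the sampler described above, not the paper's sparse, parallel implementation: the topic-word matrix Phi is sampled explicitly, which makes documents conditionally independent given Phi (the source of document-level parallelism), while the document-topic proportions remain collapsed out. The function and variable names are illustrative.

```python
import numpy as np

def pc_gibbs_sweep(docs, z, Phi, n_dk, K, alpha, beta, V, rng):
    """One sweep of a partially collapsed Gibbs sampler for LDA.

    docs : list of word-id lists; z : per-document topic assignments;
    Phi  : K x V topic-word matrix (sampled explicitly, not collapsed);
    n_dk : D x K per-document topic counts (theta is collapsed out).
    """
    # 1) Resample topic assignments. Given Phi, documents are conditionally
    #    independent, so this loop could run in parallel over documents.
    for d, words in enumerate(docs):
        for i, w in enumerate(words):
            k_old = z[d][i]
            n_dk[d, k_old] -= 1
            p = (n_dk[d] + alpha) * Phi[:, w]     # collapsed-theta term x Phi
            k_new = rng.choice(K, p=p / p.sum())
            z[d][i] = int(k_new)
            n_dk[d, k_new] += 1

    # 2) Resample Phi from its Dirichlet posterior given the assignments.
    n_kw = np.zeros((K, V))
    for d, words in enumerate(docs):
        for i, w in enumerate(words):
            n_kw[z[d][i], w] += 1
    for k in range(K):
        Phi[k] = rng.dirichlet(n_kw[k] + beta)
    return z, Phi, n_dk

# Toy usage: 2 documents, vocabulary of 5 words, K = 2 topics.
rng = np.random.default_rng(0)
docs = [[0, 1, 2, 1], [3, 4, 0]]
K, V, alpha, beta = 2, 5, 0.1, 0.01
z = [[int(rng.integers(K)) for _ in d] for d in docs]
n_dk = np.zeros((len(docs), K))
for d, zs in enumerate(z):
    for k in zs:
        n_dk[d, k] += 1
Phi = rng.dirichlet(np.full(V, beta), size=K)
z, Phi, n_dk = pc_gibbs_sweep(docs, z, Phi, n_dk, K, alpha, beta, V, rng)
```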
