Probabilistic Programming
87 papers with code • 0 benchmarks • 0 datasets
Probabilistic programming languages (PPLs) are designed to describe probabilistic models and then perform inference in those models. PPLs are closely related to graphical models and Bayesian networks, but are more expressive and flexible.
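To make the idea concrete, here is a minimal, hypothetical toy example not tied to any particular PPL: the generative model is written as ordinary code, and a generic inference routine (simple rejection sampling here) recovers the posterior over the model's latent variable.

```python
import random

def model():
    """Generative program: a latent coin bias with a Uniform(0, 1) prior,
    followed by 10 Bernoulli coin flips."""
    bias = random.random()
    heads = sum(random.random() < bias for _ in range(10))
    return bias, heads

def posterior_mean(observed_heads, num_samples=100_000):
    """Generic inference by rejection sampling: keep prior draws whose
    simulated data matches the observation; the survivors are samples
    from the posterior p(bias | heads)."""
    accepted = [bias for bias, heads in (model() for _ in range(num_samples))
                if heads == observed_heads]
    return sum(accepted) / len(accepted)

random.seed(0)
estimate = posterior_mean(observed_heads=8)
# The analytic posterior is Beta(9, 3), whose mean is 0.75; the
# Monte Carlo estimate should land close to that value.
```

Real PPLs separate these two roles the same way: the user writes only the model, and the system supplies general-purpose inference (MCMC, SMC, variational methods) behind the scenes.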
(Image credit: Michael Betancourt)
Libraries
Use these libraries to find Probabilistic Programming models and implementations.
Most implemented papers
Automatic structured variational inference
However, the performance of the variational approach depends on the choice of an appropriate variational family.
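A variational family is the set of candidate distributions the optimizer searches over, and a poor choice caps how well the true posterior can be approximated. As a self-contained sketch of the generic idea (not the paper's structured method; the target and step size are illustrative choices): fitting a single Gaussian family q = N(mu, sigma^2) to an unnormalized log-density by stochastic gradient ascent on the ELBO with the reparameterization trick.

```python
import math, random

def elbo_step(mu, log_sigma, log_p, lr=0.05, n=64):
    """One stochastic gradient-ascent step on the ELBO for the variational
    family q = N(mu, sigma^2), using the reparameterization z = mu + sigma*eps."""
    sigma = math.exp(log_sigma)
    g_mu, g_ls = 0.0, 0.0
    for _ in range(n):
        eps = random.gauss(0.0, 1.0)
        z = mu + sigma * eps
        h = 1e-4  # finite-difference gradient of log p at z
        dlogp = (log_p(z + h) - log_p(z - h)) / (2 * h)
        g_mu += dlogp
        g_ls += dlogp * sigma * eps
    g_mu /= n
    g_ls = g_ls / n + 1.0  # +1 is the entropy term's gradient w.r.t. log_sigma
    return mu + lr * g_mu, log_sigma + lr * g_ls

random.seed(0)
log_p = lambda z: -0.5 * z * z      # unnormalized standard-normal target
mu, log_sigma = 2.0, math.log(0.3)  # deliberately poor initialization
for _ in range(2000):
    mu, log_sigma = elbo_step(mu, log_sigma, log_p)
# mu drifts toward 0 and sigma toward 1, the target's true parameters.
```

Because the family here is exactly the target's shape, the fit is essentially perfect; when the posterior is skewed or multimodal, the same procedure converges to the best Gaussian in the family, which is precisely the limitation structured variational families aim to relax.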
DynamicPPL: Stan-like Speed for Dynamic Probabilistic Models
Since DynamicPPL is a modular, stand-alone library, any probabilistic programming system written in Julia, such as Turing.jl, can use DynamicPPL to specify models and trace their model parameters.
πVAE: a stochastic process prior for Bayesian deep learning with MCMC
We show that our framework can accurately learn not only expressive function classes such as Gaussian processes, but also properties of functions that enable statistical inference (such as the integral of a log Gaussian process).
Scenic: A Language for Scenario Specification and Data Generation
We design a domain-specific language, Scenic, for describing scenarios that are distributions over scenes and the behaviors of their agents over time.
Sequential Monte Carlo Steering of Large Language Models using Probabilistic Programs
Even after fine-tuning and reinforcement learning, large language models (LLMs) can be difficult, if not impossible, to control reliably with prompts alone.
SymbolicAI: A framework for logic-based approaches combining generative models and solvers
We conclude by introducing a quality measure and its empirical score for evaluating these computational graphs, and propose a benchmark that compares various state-of-the-art LLMs across a set of complex workflows.
Dataflow Matrix Machines as a Generalization of Recurrent Neural Networks
Dataflow matrix machines are a powerful generalization of recurrent neural networks.
Probabilistic Data Analysis with Probabilistic Programming
This paper introduces composable generative population models (CGPMs), a computational abstraction that extends directed graphical models and can be used to describe and compose a broad class of probabilistic data analysis techniques.
Deep Amortized Inference for Probabilistic Programs
This paper proposes a system for amortized inference in PPLs.
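The core idea of amortization is to pay the inference cost once, up front: train a function on samples from the model's joint distribution so that, at test time, it maps data directly to approximate posterior quantities. A minimal self-contained sketch under assumed toy choices (a beta-binomial coin model, and plain least-squares regression standing in for the inference network; `simulate` is an illustrative name, not the paper's API):

```python
import random

def simulate():
    """Draw one (data, latent) pair from the joint distribution:
    bias ~ Uniform(0, 1), heads ~ Binomial(10, bias)."""
    bias = random.random()
    heads = sum(random.random() < bias for _ in range(10))
    return heads, bias

# Fit f(heads) = a*heads + b by least squares on joint samples; minimizing
# E[(bias - f(heads))^2] over the joint makes f approximate the posterior
# mean E[bias | heads], with no per-dataset inference at test time.
random.seed(0)
data = [simulate() for _ in range(100_000)]
mx = sum(h for h, _ in data) / len(data)
my = sum(b for _, b in data) / len(data)
cov = sum((h - mx) * (b - my) for h, b in data) / len(data)
var = sum((h - mx) ** 2 for h, _ in data) / len(data)
a = cov / var
b = my - a * mx
# For this conjugate model the exact posterior mean is (heads + 1) / 12,
# so the learned coefficients should approach a ≈ 1/12 and b ≈ 1/12.
```

In realistic amortized-inference systems the linear regressor is replaced by a neural network and the squared-error objective by a variational one, but the train-once, query-many structure is the same.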
Detecting Dependencies in Sparse, Multivariate Databases Using Probabilistic Programming and Non-parametric Bayes
Datasets with hundreds of variables and many missing values are commonplace.