2 code implementations • CVPR 2020 • Peyman Bateni, Raghav Goyal, Vaden Masrani, Frank Wood, Leonid Sigal
Few-shot learning is a fundamental task in computer vision that carries the promise of alleviating the need for exhaustively labeled data.
Ranked #2 on Few-Shot Image Classification on Mini-Imagenet 10-way (5-shot), using extra training data
2 code implementations • 13 Jan 2022 • Peyman Bateni, Jarred Barber, Raghav Goyal, Vaden Masrani, Jan-Willem van de Meent, Leonid Sigal, Frank Wood
The first method, Simple CNAPS, employs a hierarchically regularized Mahalanobis-distance-based classifier combined with a state-of-the-art neural adaptive feature extractor to achieve strong performance on the Meta-Dataset, mini-ImageNet, and tiered-ImageNet benchmarks.
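As a rough illustration of this style of classifier (not the paper's exact formulation), the sketch below scores query features by regularized class-conditional Mahalanobis distances; the shrinkage toward a pooled covariance is a simplified stand-in for the hierarchical regularization, and the adaptive feature extractor is assumed to be given.

```python
import numpy as np

def mahalanobis_classify(query_feats, support_feats, support_labels, reg=1.0):
    """Toy Mahalanobis-distance classifier over pre-extracted features.

    query_feats:    (Q, D) features for query examples
    support_feats:  (N, D) features for labelled support examples
    support_labels: (N,)   integer class labels
    reg:            shrinkage strength toward the pooled covariance
                    (simplified stand-in for hierarchical regularization)
    """
    classes = np.unique(support_labels)
    dim = support_feats.shape[1]
    pooled_cov = np.cov(support_feats, rowvar=False) + 1e-6 * np.eye(dim)
    dists = np.zeros((query_feats.shape[0], len(classes)))
    for i, c in enumerate(classes):
        feats_c = support_feats[support_labels == c]
        mu_c = feats_c.mean(axis=0)
        cov_c = np.cov(feats_c, rowvar=False) if len(feats_c) > 1 else np.zeros_like(pooled_cov)
        # Shrink the per-class covariance toward the pooled estimate.
        lam = len(feats_c) / (len(feats_c) + reg)
        cov = lam * cov_c + (1 - lam) * pooled_cov + 1e-6 * np.eye(dim)
        diff = query_feats - mu_c
        dists[:, i] = np.einsum("qd,dk,qk->q", diff, np.linalg.inv(cov), diff)
    # Softmax over negative squared distances yields class probabilities.
    logits = -dists
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    return probs / probs.sum(axis=1, keepdims=True)
```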
1 code implementation • 23 May 2022 • William Harvey, Saeid Naderiparizi, Vaden Masrani, Christian Weilbach, Frank Wood
We present a framework for video modeling based on denoising diffusion probabilistic models that produces long-duration video completions in a variety of realistic environments.
1 code implementation • 30 Mar 2020 • Frank Wood, Andrew Warrington, Saeid Naderiparizi, Christian Weilbach, Vaden Masrani, William Harvey, Adam Scibior, Boyan Beronov, John Grefenstette, Duncan Campbell, Ali Nasseri
In this work we demonstrate how to automate parts of the infectious disease-control policy-making process by performing inference in existing epidemiological models.
1 code implementation • WS 2017 • Vaden Masrani, Gabriel Murray, Thalia Field, Giuseppe Carenini
We investigate whether writers with dementia can be automatically distinguished from those without by analyzing linguistic markers in written text, in the form of blog posts.
1 code implementation • NeurIPS 2019 • Vaden Masrani, Tuan Anh Le, Frank Wood
We introduce the thermodynamic variational objective (TVO) for learning in both continuous and discrete deep generative models.
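For context, the TVO can be written as a left Riemann-sum approximation to a thermodynamic-integration identity for the log evidence; the sketch below uses standard notation (q for the variational distribution, pi_beta for the geometric path between q(z|x) and p(x, z)) and is a paraphrase rather than a verbatim restatement of the paper.

```latex
% Geometric path indexed by beta in [0, 1]:
%   \pi_\beta(z) \propto q(z \mid x)^{1-\beta} \, p(x, z)^{\beta}
\log p(x)
  = \int_0^1 \mathbb{E}_{\pi_\beta}\!\left[\log \frac{p(x, z)}{q(z \mid x)}\right] \mathrm{d}\beta
  \;\ge\; \sum_{k=0}^{K-1} (\beta_{k+1} - \beta_k)\,
          \mathbb{E}_{\pi_{\beta_k}}\!\left[\log \frac{p(x, z)}{q(z \mid x)}\right],
  \qquad 0 = \beta_0 < \beta_1 < \dots < \beta_K = 1 .
```

The term at beta = 0 is the usual ELBO, and because the integrand is non-decreasing in beta, the left Riemann sum is a lower bound on the log evidence.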
1 code implementation • 1 Jul 2020 • Rob Brekelmans, Vaden Masrani, Frank Wood, Greg Ver Steeg, Aram Galstyan
We propose to choose intermediate distributions using equal spacing in the moment parameters of our exponential family, which matches grid search performance and allows the schedule to adaptively update over the course of training.
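A minimal sketch of this scheduling idea, assuming estimates of the moment parameter (the expected log importance weight under pi_beta) are available on a fine grid of beta values; the linear interpolation used to invert the beta-to-moment map is an illustrative choice, not necessarily the paper's exact procedure.

```python
import numpy as np

def moment_spaced_schedule(betas_grid, moments_grid, K):
    """Choose K+1 discretization points whose moment parameters are equally spaced.

    betas_grid:   fine, increasing grid of beta values spanning [0, 1]
    moments_grid: estimates of E_{pi_beta}[log p(x,z) / q(z|x)] at each grid point
                  (non-decreasing in beta, so the beta -> moment map is invertible)
    K:            number of partitions in the Riemann-sum objective
    """
    targets = np.linspace(moments_grid[0], moments_grid[-1], K + 1)
    # Invert the monotone beta -> moment map by linear interpolation.
    return np.interp(targets, moments_grid, betas_grid)
```

Re-running this as the variational parameters change corresponds to the adaptive updating of the schedule over the course of training described above.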
1 code implementation • NeurIPS 2020 • Vu Nguyen, Vaden Masrani, Rob Brekelmans, Michael A. Osborne, Frank Wood
Achieving the full promise of the Thermodynamic Variational Objective (TVO), a recently proposed variational lower bound on the log evidence involving a one-dimensional Riemann integral approximation, requires choosing a "schedule" of sorted discretization points.
2 code implementations • NeurIPS Workshop DL-IG 2020 • Rob Brekelmans, Vaden Masrani, Thang Bui, Frank Wood, Aram Galstyan, Greg Ver Steeg, Frank Nielsen
Annealed importance sampling (AIS) is the gold standard for estimating partition functions or marginal likelihoods, corresponding to importance sampling over a path of distributions between a tractable base and an unnormalized target.
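As a reminder of this construction (standard AIS notation, not specific to this paper), the geometric path and the accumulated importance weight are:

```latex
% Unnormalized geometric path between a tractable base \pi_0 and the target:
\tilde{\pi}_{\beta}(z) = \pi_0(z)^{1-\beta}\, \tilde{\pi}_1(z)^{\beta},
\qquad 0 = \beta_0 < \beta_1 < \dots < \beta_K = 1 .
% AIS importance weight accumulated along one run z_0, z_1, \dots, z_{K-1}:
w = \prod_{k=1}^{K} \frac{\tilde{\pi}_{\beta_k}(z_{k-1})}{\tilde{\pi}_{\beta_{k-1}}(z_{k-1})},
\qquad \mathbb{E}[w] = \frac{Z_1}{Z_0} .
```

Here z_0 is sampled from the base distribution and each subsequent z_k from a Markov kernel that leaves pi_{beta_k} invariant, so averaging w over independent runs gives an unbiased estimate of the ratio of normalizing constants.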
1 code implementation • 28 Mar 2024 • Mohsen Gholami, Mohammad Akbari, Cindy Hu, Vaden Masrani, Z. Jane Wang, Yong Zhang
Knowledge distillation from LLMs is essential for the efficient deployment of language models.
no code implementations • WS 2017 • Jordon Johnson, Vaden Masrani, Giuseppe Carenini, Raymond Ng
We define and motivate the problem of summarizing partial email threads.
no code implementations • ICML 2020 • Rob Brekelmans, Vaden Masrani, Frank Wood, Greg Ver Steeg, Aram Galstyan
While the Evidence Lower Bound (ELBO) has become a ubiquitous objective for variational inference, the recently proposed Thermodynamic Variational Objective (TVO) leverages thermodynamic integration to provide a tighter and more general family of bounds.
1 code implementation • 1 Jul 2021 • Vaden Masrani, Rob Brekelmans, Thang Bui, Frank Nielsen, Aram Galstyan, Greg Ver Steeg, Frank Wood
Many common machine learning methods involve the geometric annealing path, a sequence of intermediate densities between two distributions of interest, constructed by taking pointwise geometric averages of the endpoint densities.
no code implementations • 1 Jul 2021 • Vaden Masrani
In this short note I restate and simplify the proof of the impossibility of probabilistic induction from Popper (1992).