Search Results for author: Peter Hayes

Found 8 papers, 1 paper with code

Mafin: Enhancing Black-Box Embeddings with Model Augmented Fine-Tuning

no code implementations • 19 Feb 2024 • Mingtian Zhang, Shawn Lan, Peter Hayes, David Barber

Our results demonstrate that Mafin significantly enhances the performance of the black-box embeddings by only requiring the training of a small augmented model.

Retrieval

Active Preference Learning for Large Language Models

no code implementations • 12 Feb 2024 • William Muldrew, Peter Hayes, Mingtian Zhang, David Barber

A key consideration for aligning these models is how to most effectively use human resources, or model resources in the case where LLMs themselves are used as oracles.

Active Learning • Language Modelling

Towards Healing the Blindness of Score Matching

no code implementations • 15 Sep 2022 • Mingtian Zhang, Oscar Key, Peter Hayes, David Barber, Brooks Paige, François-Xavier Briol

Score-based divergences have been widely used in machine learning and statistics applications.

Density Estimation

Integrated Weak Learning

no code implementations • 19 Jun 2022 • Peter Hayes, Mingtian Zhang, Raza Habib, Jordan Burgess, Emine Yilmaz, David Barber

We introduce a label model that can learn to aggregate weak supervision sources differently for different datapoints, and that takes the performance of the end-model into consideration during training.

Generalization Gap in Amortized Inference

1 code implementation • 23 May 2022 • Mingtian Zhang, Peter Hayes, David Barber

The ability of likelihood-based probabilistic models to generalize to unseen data is central to many machine learning applications such as lossless compression.

Sample Efficient Model Evaluation

no code implementations • 24 Sep 2021 • Emine Yilmaz, Peter Hayes, Raza Habib, Jordan Burgess, David Barber

Labelling data is a major practical bottleneck in training and testing classifiers.

Estimating the Uncertainty of Neural Network Forecasts for Influenza Prevalence Using Web Search Activity

no code implementations • 26 May 2021 • Michael Morris, Peter Hayes, Ingemar J. Cox, Vasileios Lampos

In this paper, we demonstrate how Bayesian Neural Networks (BNNs) can be used to provide both a forecast and a corresponding uncertainty estimate, without significant loss in forecasting accuracy compared to traditional NNs.

Decision Making

Spread Divergence

no code implementations • 21 Nov 2018 • Mingtian Zhang, Peter Hayes, Tom Bird, Raza Habib, David Barber

For distributions $\mathbb{P}$ and $\mathbb{Q}$ with different supports or undefined densities, the divergence $\textrm{D}(\mathbb{P}||\mathbb{Q})$ may not exist.
