3 code implementations • 25 Mar 2020 • Shuai Tang, Wesley J. Maddox, Charlie Dickens, Tom Diethe, Andreas Damianou
A suitable similarity index for comparing learnt neural networks plays an important role in understanding the behaviour of these highly non-linear functions, and can provide insights for further theoretical analysis and empirical studies.
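As an illustration of what such a similarity index can look like, a widely used choice (not necessarily the index proposed in this paper) is linear Centered Kernel Alignment (CKA), which compares two activation matrices recorded over the same examples:

```python
import numpy as np

def linear_cka(X, Y):
    """Linear Centered Kernel Alignment between two activation matrices
    of shape (n_examples, n_features). Returns a similarity in [0, 1];
    1 means identical representations up to rotation and isotropic scaling."""
    X = X - X.mean(axis=0, keepdims=True)   # centre each feature
    Y = Y - Y.mean(axis=0, keepdims=True)
    cross = np.linalg.norm(Y.T @ X, "fro") ** 2
    norm = np.linalg.norm(X.T @ X, "fro") * np.linalg.norm(Y.T @ Y, "fro")
    return cross / norm

# Hypothetical activations from two layers, evaluated on the same 100 inputs.
rng = np.random.default_rng(0)
layer_a = rng.normal(size=(100, 32))
layer_b = layer_a @ rng.normal(size=(32, 32))   # a linear transform of layer_a
score = linear_cka(layer_a, layer_b)
```

All names here (`layer_a`, `layer_b`) are illustrative; CKA is one of several indices in the literature, each with different invariance properties.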
1 code implementation • 26 May 2019 • Brendan Avent, Javier Gonzalez, Tom Diethe, Andrei Paleyes, Borja Balle
Differential privacy is a mathematical framework for privacy-preserving data analysis.
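As a minimal sketch of the framework (the classical Laplace mechanism, not this paper's contribution), a numeric query can be released with epsilon-differential privacy by adding noise scaled to the query's sensitivity:

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Release a noisy statistic satisfying epsilon-differential privacy
    by adding Laplace noise with scale sensitivity / epsilon."""
    rng = np.random.default_rng() if rng is None else rng
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Example: privately release a count query; the sensitivity of a count is 1,
# since adding or removing one individual changes it by at most 1.
data = [1, 0, 1, 1, 0, 1]
true_count = sum(data)
noisy_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
```

Smaller epsilon means stronger privacy but larger noise; the paper's setting is more involved than this single-query sketch.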
1 code implementation • 18 Oct 2023 • Chen Jin, Ryutaro Tanno, Amrutha Saseendran, Tom Diethe, Philip Teare
To address this challenge, we introduce a framework for Multi-Concept Prompt Learning (MCPL), where multiple new "words" are simultaneously learned from a single sentence-image pair.
1 code implementation • 20 Oct 2019 • Oluwaseyi Feyisetan, Borja Balle, Thomas Drake, Tom Diethe
We conduct privacy audit experiments against two baseline models and utility experiments on three datasets to demonstrate the trade-off between privacy and utility for varying values of epsilon on different task types.
1 code implementation • 7 Aug 2019 • Tom Diethe, Meelis Kull, Niall Twomey, Kacper Sokol, Hao Song, Miquel Perello-Nieto, Emma Tonkin, Peter Flach
This paper describes HyperStream, a large-scale, flexible and robust software package, written in the Python language, for processing streaming data with workflow creation capabilities.
1 code implementation • 10 Mar 2019 • Yu Chen, Telmo Silva Filho, Ricardo B. C. Prudêncio, Tom Diethe, Peter Flach
Item Response Theory (IRT) aims to assess latent abilities of respondents based on the correctness of their answers in aptitude test items with different difficulty levels.
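A minimal sketch of the simplest IRT variant, the one-parameter logistic (Rasch) model, in which the probability of a correct answer depends only on the gap between the respondent's latent ability and the item's difficulty (the paper itself goes beyond this basic form):

```python
import math

def rasch_correct_probability(ability, difficulty):
    """One-parameter logistic (Rasch) IRT model: probability that a
    respondent with the given latent ability answers an item of the
    given difficulty correctly."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# A respondent whose ability exactly matches the item difficulty
# has a 50% chance of answering correctly; easier items raise the odds.
p_matched = rasch_correct_probability(0.0, 0.0)
p_easy_item = rasch_correct_probability(1.0, -1.0)
```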
1 code implementation • 25 Jan 2024 • Talip Ucar, Aubin Ramon, Dino Oglic, Rebecca Croasdale-Wood, Tom Diethe, Pietro Sormanni
We investigate the potential of patent data for improving antibody humanness prediction using a multi-stage, multi-loss training process.
no code implementations • 4 Feb 2017 • Tom Diethe, Niall Twomey, Meelis Kull, Peter Flach, Ian Craddock
There is a widely-accepted need to revise current forms of health-care provision, with particular interest in sensing systems in the home.
no code implementations • 25 Feb 2015 • Tom Diethe
We present a derivation of the Kullback-Leibler (KL) divergence (also known as relative entropy) for the von Mises-Fisher (vMF) distribution in $d$ dimensions.
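For reference, a sketch of the closed form that follows from the standard vMF identities (taking the density as $p(x) = C_d(\kappa)\exp(\kappa \mu^\top x)$ and using $\mathbb{E}_p[x] = \frac{I_{d/2}(\kappa)}{I_{d/2-1}(\kappa)}\,\mu$; see the paper for the full derivation):

```latex
\mathrm{KL}\!\left(\mathrm{vMF}(\mu_1,\kappa_1)\,\|\,\mathrm{vMF}(\mu_2,\kappa_2)\right)
= \log\frac{C_d(\kappa_1)}{C_d(\kappa_2)}
+ \left(\kappa_1 - \kappa_2\,\mu_1^\top\mu_2\right)\frac{I_{d/2}(\kappa_1)}{I_{d/2-1}(\kappa_1)},
\qquad
C_d(\kappa) = \frac{\kappa^{d/2-1}}{(2\pi)^{d/2}\, I_{d/2-1}(\kappa)}
```

Here $I_\nu$ denotes the modified Bessel function of the first kind; the cross term uses the fact that the mean direction of a vMF sample is $\mu_1$ scaled by the Bessel-function ratio.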
no code implementations • 12 Mar 2019 • Tom Diethe, Tom Borchert, Eno Thereska, Borja Balle, Neil Lawrence
This paper describes a reference architecture for self-maintaining systems that can learn continually, as data arrives.
no code implementations • 26 Mar 2019 • Oluwaseyi Feyisetan, Thomas Drake, Borja Balle, Tom Diethe
Active learning holds the promise of significantly reducing data annotation costs while maintaining reasonable model performance.
no code implementations • 24 Apr 2019 • Yu Chen, Tom Diethe, Neil Lawrence
Conventional models tend to forget the knowledge of previous tasks while learning a new task, a phenomenon known as catastrophic forgetting.
no code implementations • 15 May 2019 • Hao Song, Tom Diethe, Meelis Kull, Peter Flach
We are concerned with obtaining well-calibrated output distributions from regression models.
no code implementations • 20 Oct 2019 • Oluwaseyi Feyisetan, Tom Diethe, Thomas Drake
In this work, we explore word representations in Hyperbolic space as a means of preserving privacy in text.
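For context, a common model of hyperbolic space for embeddings is the Poincaré ball; its distance function (a generic sketch, not this paper's privacy mechanism) grows rapidly as points approach the boundary:

```python
import numpy as np

def poincare_distance(u, v):
    """Geodesic distance in the Poincaré ball model of hyperbolic space.
    Inputs are points with Euclidean norm strictly less than 1."""
    u, v = np.asarray(u, dtype=float), np.asarray(v, dtype=float)
    sq_diff = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return float(np.arccosh(1.0 + 2.0 * sq_diff / denom))

origin = [0.0, 0.0]
near = poincare_distance(origin, [0.5, 0.0])   # equals ln 3 from the origin
far = poincare_distance(origin, [0.9, 0.0])    # much larger near the boundary
```

This boundary blow-up is what makes hyperbolic space attractive for hierarchical word representations.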
no code implementations • ICML 2020 • Jeremias Knoblauch, Hisham Husain, Tom Diethe
Continual Learning (CL) algorithms incrementally learn a predictor or representation across multiple sequentially observed tasks.
1 code implementation • 19 Jun 2020 • Yu Chen, Tom Diethe, Peter Flach
The use of episodic memory in continual learning has demonstrated effectiveness for alleviating catastrophic forgetting.
no code implementations • 4 Aug 2020 • Charlie Dickens, Eric Meissner, Pablo G. Moreno, Tom Diethe
Anomaly detection at scale is an extremely challenging problem of great practical importance.
no code implementations • 9 Mar 2021 • Yu Chen, Song Liu, Tom Diethe, Peter Flach
To the best of our knowledge, there is no existing method that can evaluate generative models in continual learning without storing samples from the original distribution.
no code implementations • 25 Sep 2019 • Yu Chen, Song Liu, Tom Diethe, Peter Flach
We propose Continual Density Ratio Estimation (CDRE), a new method that can estimate density ratios between a target distribution of real samples and the distribution of samples generated by a model, even while the model changes over time and data from the target distribution is unavailable after a certain time point.
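For background, the classical (non-continual) density-ratio trick estimates $p(x)/q(x)$ with a probabilistic classifier trained to separate samples from the two distributions: with equal sample sizes, $p(x)/q(x) = P(\text{real}\mid x)/(1 - P(\text{real}\mid x))$. The sketch below shows only this standard baseline idea, not CDRE itself, with all data and hyperparameters assumed:

```python
import numpy as np

# Samples from the real distribution p = N(0, 1) and a model q = N(1, 1).
rng = np.random.default_rng(0)
real = rng.normal(0.0, 1.0, size=2000)
fake = rng.normal(1.0, 1.0, size=2000)

x_all = np.concatenate([real, fake])
y_all = np.concatenate([np.ones(2000), np.zeros(2000)])  # 1 = real, 0 = fake

# Fit a 1-D logistic regression classifier by plain gradient descent.
w, b = 0.0, 0.0
for _ in range(2000):
    prob = 1.0 / (1.0 + np.exp(-(w * x_all + b)))
    w -= 0.1 * ((prob - y_all) * x_all).mean()
    b -= 0.1 * (prob - y_all).mean()

def density_ratio(x):
    """Estimated p(x)/q(x) = P(real|x) / P(fake|x) = exp(w*x + b)."""
    prob = 1.0 / (1.0 + np.exp(-(w * x + b)))
    return prob / (1.0 - prob)
```

For these two unit Gaussians the true log-ratio is $0.5 - x$, so the estimate should be near 1 at $x = 0.5$; CDRE's contribution is making such estimation work continually, without stored samples from the original distribution.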
no code implementations • 28 Sep 2020 • Yu Chen, Tom Diethe, Peter Flach
The use of episodic memories in continual learning has been shown to be effective in terms of alleviating catastrophic forgetting.
no code implementations • 21 Sep 2023 • Sylwia Majchrowska, Anders Hildeman, Philip Teare, Tom Diethe
Supervised training of deep learning models for medical imaging applications requires a significant amount of labeled data.
no code implementations • 9 Apr 2024 • Seunghoi Kim, Chen Jin, Tom Diethe, Matteo Figini, Henry F. J. Tregidgo, Asher Mullokandov, Philip Teare, Daniel C. Alexander
We hypothesize that such hallucinations result from local out-of-distribution (OOD) regions in the conditional images.