1 code implementation • 15 Jul 2022 • Guoxuan Xia, Christos-Savvas Bouganis
However, the performance of detection methods is generally evaluated on the task in isolation, rather than in tandem with potential downstream tasks.
1 code implementation • ICCV 2023 • Guoxuan Xia, Christos-Savvas Bouganis
Experiments on ImageNet-scale data across a number of network architectures and uncertainty tasks show that the proposed window-based early-exit approach is able to achieve a superior uncertainty-computation trade-off compared to scaling single models.
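One plausible reading of the window-based early-exit rule (the exact criterion used in the paper is not given in the excerpt, so this is an assumption): samples whose small-model confidence is very high (accept the prediction) or very low (likely to be rejected downstream anyway) exit at the first model, and only samples whose confidence falls inside a window are routed to the larger model. A minimal NumPy sketch with hypothetical window bounds `lo`/`hi`:

```python
import numpy as np

def max_prob(logits):
    """Maximum softmax probability per sample (numerically stable)."""
    z = logits - logits.max(axis=-1, keepdims=True)
    p = np.exp(z)
    p = p / p.sum(axis=-1, keepdims=True)
    return p.max(axis=-1)

def cascade_predict(logits_small, logits_large, lo, hi):
    """Two-stage cascade: defer to the large model only for samples whose
    small-model confidence lies inside the ambiguity window [lo, hi)."""
    conf = max_prob(logits_small)
    defer = (conf >= lo) & (conf < hi)  # only the ambiguous middle band
    out = logits_small.copy()
    out[defer] = logits_large[defer]    # large model runs only on deferred samples
    return out, defer

# Usage: confident (exit), ambiguous (defer), very uncertain (exit)
logits_small = np.array([[5.0, 0.0], [0.85, 0.0], [0.0, 0.0]])
logits_large = np.array([[1.0, 0.0], [0.0, 3.0], [2.0, 0.0]])
out, defer = cascade_predict(logits_small, logits_large, lo=0.6, hi=0.95)
```

The compute saving comes from the `defer` mask: the expensive model is evaluated only on the fraction of inputs inside the window, rather than on every sample.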
1 code implementation • 9 Oct 2023 • Zehui Li, Yuhao Ni, Tim August B. Huygelen, Akashaditya Das, Guoxuan Xia, Guy-Bart Stan, Yiren Zhao
On the other hand, Diffusion Models are a promising new class of generative models that are not burdened with these problems, enabling them to reach the state-of-the-art in domains such as image generation.
1 code implementation • 31 Oct 2023 • Guoxuan Xia, Duolikun Danier, Ayan Das, Stathi Fotiadis, Farhang Nabiei, Ushnish Sengupta, Alberto Bernacchia
As a simple fix, we propose to instead reparameterise the score (at inference) by dividing it by the average absolute value of previous score estimates at that time step collected from offline high NFE generations.
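A minimal sketch of the reparameterisation described above: the score estimate at each time step is divided by the average absolute value of score estimates collected offline at that step from high-NFE generations (function names and the dictionary layout are assumptions):

```python
import numpy as np

def calibrate_mean_abs(score_history):
    """score_history[t]: array of score estimates gathered offline from
    high-NFE generations at time step t. Returns per-step average magnitudes."""
    return {t: float(np.mean(np.abs(s))) for t, s in score_history.items()}

def reparam_score(score, t, mean_abs):
    """Divide the raw score estimate at step t by the offline average magnitude."""
    return score / mean_abs[t]

# Usage: one calibration pass offline, then cheap rescaling at inference
history = {0: np.array([2.0, -2.0]), 1: np.array([1.0, -3.0])}
mean_abs = calibrate_mean_abs(history)
rescaled = reparam_score(np.array([4.0]), 0, mean_abs)
```

The calibration statistics are computed once offline, so the fix adds only a per-step division at inference time.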
1 code implementation • NeurIPS Workshop ICBINB 2021 • Guoxuan Xia, Sangwon Ha, Tiago Azevedo, Partha Maji
We show that this robustness can be partially explained by the calibration behavior of modern CNNs, and may be improved with overconfidence.
1 code implementation • 15 Jul 2022 • Guoxuan Xia, Christos-Savvas Bouganis
As such, we show that, in practice, even better OOD detection performance can be achieved for Deep Ensembles by averaging task-specific detection scores, such as Energy, over the ensemble.
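For illustration, a minimal NumPy sketch of the averaging described above, using the standard Energy score E(x) = -T·logsumexp(z/T) computed per ensemble member and then averaged over members, rather than scored on the averaged prediction (function names are assumptions):

```python
import numpy as np

def energy(logits, T=1.0):
    """Energy OOD score: -T * logsumexp(logits / T), numerically stable.
    Lower (more negative) values indicate more in-distribution samples."""
    m = logits.max(axis=-1)
    return -(m + T * np.log(np.exp((logits - m[..., None]) / T).sum(axis=-1)))

def ensemble_energy(member_logits, T=1.0):
    """Average the task-specific detection score over ensemble members,
    instead of computing the score on the ensemble-averaged prediction."""
    return np.mean([energy(l, T) for l in member_logits], axis=0)

# Usage with two hypothetical members' logits for one sample
members = [np.array([[2.0, 0.0]]), np.array([[1.0, 1.0]])]
score = ensemble_energy(members)
```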
no code implementations • 17 May 2023 • Yassir Fathullah, Guoxuan Xia, Mark Gales
Efficiently and reliably estimating uncertainty is an important objective in deep learning.
no code implementations • 8 Feb 2024 • Zehui Li, Yuhao Ni, William A V Beardall, Guoxuan Xia, Akashaditya Das, Guy-Bart Stan, Yiren Zhao
This paper introduces a novel framework for DNA sequence generation, comprising two key components: DiscDiff, a Latent Diffusion Model (LDM) tailored for generating discrete DNA sequences, and Absorb-Escape, a post-training algorithm designed to refine these sequences.
no code implementations • 19 Mar 2024 • Guoxuan Xia, Olivier Laurent, Gianni Franchi, Christos-Savvas Bouganis
We first demonstrate empirically across a range of tasks and architectures that LS leads to a consistent degradation in SC.
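For context, label smoothing (LS) replaces one-hot targets with (1 − α)·one_hot + α/K, which caps the achievable softmax confidence at 1 − α + α/K and so compresses the confidence range that selective classification (SC) uses to rank samples; the link to SC degradation here is a reading of the excerpt, not a derivation from it. A minimal sketch (names are assumptions):

```python
import numpy as np

def smooth_labels(y, num_classes, alpha=0.1):
    """Label-smoothed targets: (1 - alpha) * one_hot + alpha / K.
    The maximum target value is capped at 1 - alpha + alpha / K."""
    one_hot = np.eye(num_classes)[y]
    return (1 - alpha) * one_hot + alpha / num_classes

# Usage: class 2 of 5, alpha = 0.1 -> peak target 0.92 instead of 1.0
targets = smooth_labels(np.array([2]), num_classes=5, alpha=0.1)
```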