Search Results for author: Adam Prugel-Bennett

Found 8 papers, 2 papers with code

Concept-Based Explainable Artificial Intelligence: Metrics and Benchmarks

no code implementations · 31 Jan 2025 · Halil Ibrahim Aysel, Xiaohao Cai, Adam Prugel-Bennett

Concept-based explanation methods, such as concept bottleneck models (CBMs), aim to improve the interpretability of machine learning models by linking their decisions to human-understandable concepts, under the critical assumption that such concepts can be accurately attributed to the network's feature space.
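The concept bottleneck idea described above can be sketched as a two-stage predictor in which the class decision is computed only from intermediate concept scores. The layer sizes, random weights, and sigmoid concept head below are illustrative assumptions, not the architecture studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 16 input features, 4 human-readable
# concepts (e.g. "has wings", "has beak"), 3 output classes.
n_features, n_concepts, n_classes = 16, 4, 3

# A concept bottleneck model factors the prediction into two maps:
# features -> concept scores -> class logits.
W_concepts = rng.normal(size=(n_features, n_concepts))
W_classes = rng.normal(size=(n_concepts, n_classes))

def predict(x):
    # Concept probabilities: this is the human-interpretable bottleneck.
    concepts = 1.0 / (1.0 + np.exp(-(x @ W_concepts)))
    # The final decision depends on the input only through the concepts.
    logits = concepts @ W_classes
    return concepts, logits

x = rng.normal(size=(n_features,))
concepts, logits = predict(x)
print(concepts.shape, logits.shape)  # (4,) (3,)
```

Because the class logits are a function of the concept scores alone, each prediction can be inspected or intervened on at the concept level; the paper's point is that this interpretability rests on the concepts actually being recoverable from the network's features.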

Explainable Artificial Intelligence

Revisiting Cross-Domain Problem for LiDAR-based 3D Object Detection

no code implementations · 22 Aug 2024 · RuiXiao Zhang, Juheon Lee, Xiaohao Cai, Adam Prugel-Bennett

Deep learning models such as convolutional neural networks and transformers have been widely applied to solve 3D object detection problems in the domain of autonomous driving.

3D Object Detection · Autonomous Driving +2

Penny-Wise and Pound-Foolish in Deepfake Detection

1 code implementation · 15 Aug 2024 · Yabin Wang, Zhiwu Huang, Su Zhou, Adam Prugel-Bennett, Xiaopeng Hong

This paper critiques the overly specialized approach of fine-tuning pre-trained models solely with a penny-wise objective on a single deepfake dataset, while disregarding the pound-wise balance for generalization and knowledge retention.

DeepFake Detection · Face Swapping +1

Detect Closer Surfaces that can be Seen: New Modeling and Evaluation in Cross-domain 3D Object Detection

1 code implementation · 4 Jul 2024 · RuiXiao Zhang, Yihong Wu, Juheon Lee, Adam Prugel-Bennett, Xiaohao Cai

This raises a fundamental question related to the evaluation of the 3D object detection models' cross-domain performance: Do we really need models to maintain excellent performance in their original 3D bounding boxes after being applied across domains?

3D Object Detection · Autonomous Driving +2

Orthogonalising gradients to speedup neural network optimisation

no code implementations · 29 Sep 2021 · Mark Tuddenham, Adam Prugel-Bennett, Jonathon Hare

The optimisation of neural networks can be sped up by orthogonalising the gradients before the optimisation step, ensuring the diversification of the learned representations.
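One standard way to orthogonalise a gradient matrix is to replace it with the orthogonal factor of its singular value decomposition, which keeps the directions of the update while equalising its singular values. This sketch illustrates that idea; it is an assumption for illustration and not necessarily the exact procedure used in the paper.

```python
import numpy as np

def orthogonalise(grad):
    """Replace a gradient matrix by its nearest orthogonal factor
    (U @ V^T from the SVD), equalising its singular values so no
    single direction dominates the update."""
    u, _, vt = np.linalg.svd(grad, full_matrices=False)
    return u @ vt

rng = np.random.default_rng(0)
g = rng.normal(size=(4, 3))   # a layer's gradient, shape (fan_out, fan_in)
g_orth = orthogonalise(g)

# The columns of the orthogonalised gradient are orthonormal.
print(np.allclose(g_orth.T @ g_orth, np.eye(3)))  # True
```

In practice this step would be applied per layer, just before the optimiser update, so that the learned representations are pushed toward being diverse rather than collapsing onto a few dominant directions.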

Generalisation and the Geometry of Class Separability

no code implementations · NeurIPS Workshop DL-IG 2020 · Dominic Belcher, Adam Prugel-Bennett, Srinandan Dasmahapatra

Recent results in deep learning show that considering only the capacity of machines does not adequately explain the generalisation performance we can observe.

Deep Learning

Linear Disentangled Representations and Unsupervised Action Estimation

no code implementations · NeurIPS 2020 · Matthew Painter, Jonathon Hare, Adam Prugel-Bennett

In this work we empirically show that linear disentangled representations are not generally present in standard VAE models and that they instead require altering the loss landscape to induce them.

Disentanglement

Extended Formulations for Online Linear Bandit Optimization

no code implementations · 20 Nov 2013 · Shaona Ghosh, Adam Prugel-Bennett

Online linear optimization over combinatorial action sets (d-dimensional actions) with bandit feedback is known to have complexity on the order of the dimension of the problem.

Efficient Exploration
