Search Results for author: Jack Furby

Found 3 papers, 1 paper with code

Can we Constrain Concept Bottleneck Models to Learn Semantically Meaningful Input Features?

no code implementations • 1 Feb 2024 • Jack Furby, Daniel Cunnington, Dave Braines, Alun Preece

We hypothesise that this occurs when concept annotations are inaccurate or how input features should relate to concepts is unclear.

Towards a Deeper Understanding of Concept Bottleneck Models Through End-to-End Explanation

1 code implementation • 7 Feb 2023 • Jack Furby, Daniel Cunnington, Dave Braines, Alun Preece

Concept Bottleneck Models (CBMs) first map raw input(s) to a vector of human-defined concepts, before using this vector to predict a final classification.
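The two-stage structure described in this abstract (input → concepts → label) can be sketched as below; the layer sizes, weight matrices (`W_g`, `W_f`), and the helper `cbm_forward` are illustrative assumptions, not details from the paper:

```python
import numpy as np

# Minimal sketch of a Concept Bottleneck Model forward pass: raw input
# features are first mapped to a vector of human-defined concepts, and
# only that concept vector is used to predict the final class.
# All sizes and weights below are illustrative placeholders.

rng = np.random.default_rng(0)
n_features, n_concepts, n_classes = 8, 4, 3

W_g = rng.normal(size=(n_features, n_concepts))  # concept predictor g
W_f = rng.normal(size=(n_concepts, n_classes))   # label predictor f

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cbm_forward(x):
    concepts = sigmoid(x @ W_g)  # bottleneck: each entry is one concept score in (0, 1)
    logits = concepts @ W_f      # classification sees only the concept vector
    return concepts, logits

x = rng.normal(size=(n_features,))
concepts, logits = cbm_forward(x)
print(concepts.shape, logits.shape)  # (4,) (3,)
```

Because the label predictor receives only the concept vector, inspecting (or intervening on) `concepts` is what makes CBM predictions interpretable.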

An Experimentation Platform for Explainable Coalition Situational Understanding

no code implementations • 27 Oct 2020 • Katie Barrett-Powell, Jack Furby, Liam Hiley, Marc Roig Vilamala, Harrison Taylor, Federico Cerutti, Alun Preece, Tianwei Xing, Luis Garcia, Mani Srivastava, Dave Braines

We present an experimentation platform for coalition situational understanding research that highlights capabilities in explainable artificial intelligence/machine learning (AI/ML) and integration of symbolic and subsymbolic AI/ML approaches for event processing.

Tasks: BIG-bench Machine Learning, Explainable Artificial Intelligence
