no code implementations • 1 Feb 2024 • Jack Furby, Daniel Cunnington, Dave Braines, Alun Preece
We hypothesise that this occurs when concept annotations are inaccurate or when the relationship between input features and concepts is unclear.
1 code implementation • 7 Feb 2023 • Jack Furby, Daniel Cunnington, Dave Braines, Alun Preece
Concept Bottleneck Models (CBMs) first map raw input(s) to a vector of human-defined concepts, before using this vector to predict a final classification.
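The two-stage structure described above can be sketched as follows. This is a minimal illustration with random placeholder weights and arbitrary dimensions, not the models from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Stage 1 (g): raw input features -> human-defined concept activations.
# Weight matrices here are random placeholders standing in for trained layers.
W_concept = rng.normal(size=(10, 5))   # 10 input features, 5 concepts
# Stage 2 (f): concept vector -> final class scores.
W_label = rng.normal(size=(5, 3))      # 5 concepts, 3 classes

def cbm_forward(x):
    concepts = sigmoid(x @ W_concept)  # predicted concepts, each in [0, 1]
    logits = concepts @ W_label        # classification depends on concepts only
    return concepts, logits

x = rng.normal(size=(2, 10))           # a batch of 2 inputs
concepts, logits = cbm_forward(x)
```

Because the label predictor sees only the concept vector, a practitioner can inspect (or intervene on) `concepts` to understand why a given classification was made.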
no code implementations • 27 Oct 2020 • Katie Barrett-Powell, Jack Furby, Liam Hiley, Marc Roig Vilamala, Harrison Taylor, Federico Cerutti, Alun Preece, Tianwei Xing, Luis Garcia, Mani Srivastava, Dave Braines
We present an experimentation platform for coalition situational understanding research that highlights capabilities in explainable artificial intelligence/machine learning (AI/ML) and integration of symbolic and subsymbolic AI/ML approaches for event processing.
Tasks: BIG-bench Machine Learning • Explainable Artificial Intelligence