1 code implementation • 23 Feb 2021 • Konrad Kollnig, Siddhartha Datta, Max Van Kleek
Dark patterns in mobile apps take advantage of cognitive biases of end-users and can have detrimental effects on people's lives.
no code implementations • 20 Jan 2021 • Nitin Agrawal, Reuben Binns, Max Van Kleek, Kim Laine, Nigel Shadbolt
Homomorphic encryption, secure multi-party computation, and differential privacy are part of an emerging class of Privacy Enhancing Technologies which share a common promise: to preserve privacy whilst also obtaining the benefits of computational analysis.
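One of the named PETs, differential privacy, can be illustrated with a minimal sketch: releasing a count perturbed by Laplace noise. This is a generic textbook mechanism, not the method of the paper above; the function name and parameters are illustrative.

```python
import math
import random

def dp_count(values, predicate, epsilon=1.0):
    """Release a count with epsilon-differential privacy via the
    Laplace mechanism: the true count plus Laplace(1/epsilon) noise
    (the sensitivity of a counting query is 1)."""
    true_count = sum(1 for v in values if predicate(v))
    # Inverse-CDF sampling of a zero-mean Laplace(1/epsilon) variate.
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return true_count + noise
```

Smaller `epsilon` means stronger privacy but noisier answers; the analyst trades accuracy for a formal privacy guarantee, which is the "common promise" the abstract refers to.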
no code implementations • 19 May 2020 • Petar Radanliev, David De Roure, Kevin Page, Max Van Kleek, Omar Santos, La Treall Maddox, Pete Burnap, Eirini Anthi, Carsten Maple
This paper surveys deep learning algorithms, IoT cyber-security and risk models, and established mathematical formulas to identify the best approach for developing a dynamic, self-adapting system for predictive cyber-risk analytics, supported by Artificial Intelligence, Machine Learning, and real-time intelligence in edge computing.
no code implementations • 16 Mar 2018 • Michael Veale, Reuben Binns, Max Van Kleek
In this short paper, we consider the roles of HCI in enabling the better governance of consequential machine learning systems using the rights and obligations laid out in the 2016 EU General Data Protection Regulation (GDPR), a law that involves heavy interaction with people and systems.
no code implementations • 3 Feb 2018 • Michael Veale, Max Van Kleek, Reuben Binns
Calls for heightened consideration of fairness and accountability in algorithmically informed public decisions, such as taxation, justice, and child protection, are now commonplace.
1 code implementation • 5 Jul 2017 • Reuben Binns, Michael Veale, Max Van Kleek, Nigel Shadbolt
This paper presents exploratory methods for measuring the normative biases of algorithmic content moderation systems, demonstrated through a case study using an existing dataset of comments labelled for offence.
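One crude way to surface such normative bias is to compare how often different annotator groups label the same kind of content as offensive. This sketch is an illustrative proxy, not the paper's actual methodology; the function and data shapes are hypothetical.

```python
def label_rate_gap(labels_a, labels_b):
    """Difference in the proportion of comments labelled offensive (1)
    versus not (0) between two annotator groups. A nonzero gap is a
    crude signal that the groups apply different norms of offence."""
    rate_a = sum(labels_a) / len(labels_a)
    rate_b = sum(labels_b) / len(labels_b)
    return rate_a - rate_b
```

For example, if group A flags 2 of 4 comments and group B flags 1 of 4, the gap is 0.25; a moderation classifier trained on one group's labels would inherit that group's norms.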