In our cryptographic protocol, the server first computes a proof that the model was trained on a dataset~$D$.
We contribute a formal analysis of why the robustness of the PoL protocol against spoofing adversaries can neither be proven nor disproven.
Federated learning (FL), in which data remains at the clients and only gradient updates are shared with a central aggregator, was assumed to be private.
The application of machine learning (ML) in computer systems introduces not only many benefits but also risks to society.
In the white-box setting, we instantiate this class with a joint, multi-stage optimization attack.
In particular, our analyses and experiments show that an adversary seeking to illegitimately manufacture a proof-of-learning needs to perform *at least* as much work as is needed for gradient descent itself.
We propose an unsupervised anomaly-detection framework based on internal DNN layer representations, formulated as a meta-algorithm with configurable components.
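One way such a framework could be instantiated is sketched below. The specific choice of a Gaussian model with Mahalanobis-distance scoring over layer representations is an illustrative assumption, not the paper's actual configuration; the function names are hypothetical.

```python
import numpy as np

def fit_detector(train_reps):
    """Fit a simple Gaussian model of 'normal' layer representations.

    train_reps: array of shape (n_samples, n_features), e.g. penultimate-layer
    activations collected on clean training data.
    """
    mu = train_reps.mean(axis=0)
    # Regularize the covariance so the inverse is well-defined.
    cov = np.cov(train_reps, rowvar=False) + 1e-6 * np.eye(train_reps.shape[1])
    return mu, np.linalg.inv(cov)

def anomaly_score(rep, mu, inv_cov):
    """Mahalanobis distance of one representation; large scores flag anomalies."""
    d = rep - mu
    return float(d @ inv_cov @ d)
```

A different density model or distance metric could be swapped in as the "configurable component" without changing the surrounding meta-algorithm.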
We implement and evaluate Face-Off, finding that it deceives three commercial face recognition services from Microsoft, Amazon, and Face++.
Such pairs are watermarks, which are not sampled from the task distribution and are only known to the defender.
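Because the watermark pairs are known only to the defender, ownership can be checked by querying a suspect model on them and measuring how often it returns the defender-chosen labels. The helper below is a minimal sketch of that check; the function name and threshold are assumptions for illustration.

```python
def watermark_match_rate(model_fn, watermark_pairs):
    """Fraction of watermark inputs on which a suspect model returns the
    defender-chosen label. Since watermarks are not sampled from the task
    distribution, a high match rate is unlikely by chance and suggests the
    model was derived from the watermarked one."""
    hits = sum(1 for x, y in watermark_pairs if model_fn(x) == y)
    return hits / len(watermark_pairs)
```

A defender would compare this rate against a threshold calibrated on independently trained models.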
In this work, we study the feasibility of an attack-agnostic defense relying on artifacts that are common to all poisoning attacks.
Once users have shared their data online, it is generally difficult for them to revoke access and ask for the data to be deleted.
In this paper, we explore semantic adversarial examples (SAEs), in which an attacker perturbs the semantic space representing the environment that produces inputs for the ML model.
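As a concrete illustration of a semantic-space perturbation (an assumed example, not necessarily the attack used in the paper), one can modify a scene-level parameter such as hue rather than individual pixels:

```python
import numpy as np

def hue_shift(image_hsv, delta):
    """Perturb only the hue channel of an HSV image in [0, 1).

    This changes a semantic property of the scene (its color) while leaving
    saturation and value untouched; hue wraps around modulo 1.
    """
    out = image_hsv.copy()
    out[..., 0] = (out[..., 0] + delta) % 1.0
    return out
```

The perturbed image remains a plausible rendering of the same scene, which is what distinguishes SAEs from pixel-level adversarial noise.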
and how to design a classification paradigm that leverages these invariances to improve the robustness-accuracy trade-off?
This has resulted in the surge of Machine Learning-as-a-Service (MLaaS): cloud services that provide (a) tools and resources to train a model, and (b) a user-friendly query interface to access it.