no code implementations • 7 Jun 2022 • Virendra J. Marathe, Pallika Kanani, Daniel W. Peterson, Guy Steele Jr.
We formally prove the subject-level DP guarantee for our algorithms, and also quantify their impact on model utility.
no code implementations • 7 Jun 2022 • Anshuman Suri, Pallika Kanani, Virendra J. Marathe, Daniel W. Peterson
Using these attacks, we estimate subject membership inference risk on real-world data for single-party models as well as FL scenarios.
no code implementations • 12 Mar 2021 • Pallika Kanani, Virendra J. Marathe, Daniel Peterson, Rave Harpaz, Steve Bright
Users can indirectly contribute to, and directly benefit from, a much larger aggregate data corpus used to train the global model.
no code implementations • 20 Dec 2019 • Laura Dietz, Bhaskar Mitra, Jeremy Pickens, Hana Anber, Sandeep Avula, Asia Biega, Adrian Boteanu, Shubham Chatterjee, Jeff Dalton, Shiri Dori-Hacohen, John Foley, Henry Feild, Ben Gamari, Rosie Jones, Pallika Kanani, Sumanta Kashyapi, Widad Machmouchi, Matthew Mitsui, Steve Nole, Alexandre Tachard Passos, Jordan Ramsdell, Adam Roegiest, David Smith, Alessandro Sordoni
The vision of HIPstIR is that early-stage information retrieval (IR) researchers get together to develop a future for non-mainstream ideas and research agendas in IR.
no code implementations • 13 Dec 2019 • Daniel Peterson, Pallika Kanani, Virendra J. Marathe
Federated Learning (FL) is a distributed machine learning (ML) paradigm that enables multiple parties to jointly re-train a shared model without sharing their data with any other parties, offering advantages in both scale and privacy.
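The FL paradigm described above can be sketched with a minimal federated-averaging (FedAvg-style) loop: each party trains on its private data locally, and only model weights, never raw data, are sent to the server for aggregation. This is an illustrative toy (linear regression, synthetic client data, made-up hyperparameters), not the algorithm from any of the papers listed here.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One party's local training: a few gradient-descent steps for
    linear regression on its private data, which never leaves the client."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_averaging(global_w, client_data, rounds=20):
    """Server loop: broadcast the global model, collect locally trained
    weights from each party, and average them weighted by data size."""
    for _ in range(rounds):
        updates, sizes = [], []
        for X, y in client_data:
            updates.append(local_update(global_w, X, y))
            sizes.append(len(y))
        global_w = np.average(updates, axis=0, weights=np.array(sizes, float))
    return global_w

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
# Three parties, each holding a private dataset drawn from the same model.
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + 0.01 * rng.normal(size=50)
    clients.append((X, y))

w = federated_averaging(np.zeros(2), clients)
print(np.round(w, 2))  # converges toward the true weights [2.0, -1.0]
```

The server only ever sees weight vectors, which is the privacy advantage the abstract refers to; the membership-inference and DP papers above study what can still leak through those shared updates.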