no code implementations • 21 Aug 2024 • Jane Castleman, Aleksandra Korolova
Our study evaluates the effectiveness of Meta's "See less" ad control and the actionability of ad targeting explanations following the shift to AI-mediated targeting.
no code implementations • 14 Feb 2024 • Siddartha Devic, Aleksandra Korolova, David Kempe, Vatsal Sharan
However, when predictors trained for classification tasks have intrinsic uncertainty, it is not obvious how this uncertainty should be represented in the derived rankings.
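One hedged way to make this concrete (a sketch, not the paper's mechanism): treat each item's predicted score as a distribution rather than a point estimate, and report the induced distribution over rankings. The means, standard deviations, and Gaussian sampling below are illustrative assumptions:

```python
# Illustrative sketch (not the paper's method): turning uncertain
# classifier scores into a distribution over rankings by sampling.
from collections import Counter
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-item predictions: mean score and uncertainty (std).
means = np.array([0.9, 0.6, 0.55])
stds = np.array([0.02, 0.2, 0.05])

def sample_ranking():
    # Draw one plausible score vector, then rank by it (descending).
    scores = rng.normal(means, stds)
    return tuple(np.argsort(-scores))

# Estimate how often each item lands in the top position; a point
# estimate would instead commit to a single deterministic ranking.
top_counts = Counter(sample_ranking()[0] for _ in range(10_000))
print({item: count / 10_000 for item, count in top_counts.items()})
```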
no code implementations • 8 Feb 2023 • Siddartha Devic, David Kempe, Vatsal Sharan, Aleksandra Korolova
The prevalence and importance of algorithmic two-sided marketplaces have drawn attention to the issue of fairness in such settings.
1 code implementation • 24 Jun 2022 • Marc Juarez, Aleksandra Korolova
As with traditional machine learning, models trained with federated learning may exhibit disparate performance across demographic groups.
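A minimal illustration of the kind of disparity at issue, under assumed inputs (toy labels, predictions, and per-example group ids):

```python
# Illustrative check for disparate performance across demographic
# groups: compare per-group accuracy of a model's predictions.
import numpy as np

y = np.array([1, 0, 1, 1, 0, 1, 0, 0])       # true labels (toy data)
yhat = np.array([1, 0, 0, 1, 0, 1, 1, 0])    # model predictions
group = np.array([0, 0, 0, 0, 1, 1, 1, 1])   # demographic group ids

per_group_acc = {
    int(g): float((yhat[group == g] == y[group == g]).mean())
    for g in np.unique(group)
}
gap = max(per_group_acc.values()) - min(per_group_acc.values())
print(per_group_acc, gap)  # a large gap signals disparate performance
```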
no code implementations • NeurIPS 2021 • Zeyu Shen, Lodewijk Gelauff, Ashish Goel, Aleksandra Korolova, Kamesh Munagala
We show in a formal sense that the Nash Welfare rule, which maximizes the product of agent values, is uniquely positioned to be robust when diversity constraints are introduced, while almost all other natural allocation rules fail this criterion.
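As a concrete illustration, the sketch below brute-forces the Nash Welfare allocation, the one maximizing the product of agent values, on a toy instance; the valuations are hypothetical and exhaustive search is only feasible at this scale:

```python
# A minimal sketch of the Nash Welfare rule: among all allocations of
# indivisible items to agents, pick one maximizing the product of
# agent values. Toy valuations; brute force for illustration only.
from itertools import product
from math import prod

values = [  # values[agent][item]
    [4, 1, 2],
    [2, 3, 1],
]
n_agents, n_items = len(values), len(values[0])

best_alloc, best_welfare = None, -1.0
for assignment in product(range(n_agents), repeat=n_items):
    agent_value = [0] * n_agents
    for item, agent in enumerate(assignment):
        agent_value[agent] += values[agent][item]
    welfare = prod(agent_value)  # Nash welfare: product of agent values
    if welfare > best_welfare:
        best_alloc, best_welfare = assignment, welfare

print(best_alloc, best_welfare)
```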
no code implementations • 18 Dec 2019 • Amos Beimel, Aleksandra Korolova, Kobbi Nissim, Or Sheffet, Uri Stemmer
Motivated by the desire to bridge the utility gap between the local and trusted-curator models of differential privacy for practical applications, we initiate the theoretical study of a hybrid model introduced by "Blender" [Avent et al., USENIX Security '17], in which differentially private local-model protocols run by n agents are assisted by a differentially private curator that has access to the data of m additional users.
9 code implementations • 10 Dec 2019 • Peter Kairouz, H. Brendan McMahan, Brendan Avent, Aurélien Bellet, Mehdi Bennis, Arjun Nitin Bhagoji, Kallista Bonawitz, Zachary Charles, Graham Cormode, Rachel Cummings, Rafael G. L. D'Oliveira, Hubert Eichner, Salim El Rouayheb, David Evans, Josh Gardner, Zachary Garrett, Adrià Gascón, Badih Ghazi, Phillip B. Gibbons, Marco Gruteser, Zaid Harchaoui, Chaoyang He, Lie He, Zhouyuan Huo, Ben Hutchinson, Justin Hsu, Martin Jaggi, Tara Javidi, Gauri Joshi, Mikhail Khodak, Jakub Konečný, Aleksandra Korolova, Farinaz Koushanfar, Sanmi Koyejo, Tancrède Lepoint, Yang Liu, Prateek Mittal, Mehryar Mohri, Richard Nock, Ayfer Özgür, Rasmus Pagh, Mariana Raykova, Hang Qi, Daniel Ramage, Ramesh Raskar, Dawn Song, Weikang Song, Sebastian U. Stich, Ziteng Sun, Ananda Theertha Suresh, Florian Tramèr, Praneeth Vepakomma, Jianyu Wang, Li Xiong, Zheng Xu, Qiang Yang, Felix X. Yu, Han Yu, Sen Zhao
Federated learning (FL) embodies the principles of focused data collection and minimization, and can mitigate many of the systemic privacy risks and costs resulting from traditional, centralized machine learning and data science approaches.
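For readers unfamiliar with the FL training loop, here is a minimal FedAvg-style sketch under illustrative assumptions (a toy linear-regression task with synthetic client data); only model parameters, never raw examples, reach the server:

```python
# Minimal FedAvg-style sketch: each client refines the global model on
# its own data, and the server averages the resulting models. The task
# and data are illustrative assumptions, not any production setup.
import numpy as np

rng = np.random.default_rng(0)
d = 5
global_w = np.zeros(d)

# Hypothetical client datasets (features X, targets y) held on-device.
clients = [(rng.normal(size=(20, d)), rng.normal(size=20)) for _ in range(4)]

def local_update(w, X, y, lr=0.1, epochs=5):
    # A few steps of gradient descent on the client's local data.
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # squared-error gradient
        w = w - lr * grad
    return w

for _ in range(10):  # communication rounds
    local_ws = [local_update(global_w.copy(), X, y) for X, y in clients]
    # Server averages client models; raw data never leaves the clients.
    global_w = np.mean(local_ws, axis=0)

print(global_w)
```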
no code implementations • 3 Apr 2019 • Michael P. Kim, Aleksandra Korolova, Guy N. Rothblum, Gal Yona
We introduce and study a new notion of preference-informed individual fairness (PIIF) that is a relaxation of both individual fairness and envy-freeness.
no code implementations • 29 Nov 2018 • Brendan Avent, Yatharth Dubey, Aleksandra Korolova
We explore the power of the hybrid model of differential privacy (DP), in which some users desire the guarantees of the local model of DP while others are content with trusted-curator model guarantees.
no code implementations • 11 Jul 2018 • Vasyl Pihur, Aleksandra Korolova, Frederick Liu, Subhash Sankuratripati, Moti Yung, Dachuan Huang, Ruogu Zeng
In this work, we propose a novel framework for privacy-preserving client-distributed machine learning.
no code implementations • 8 Sep 2017 • Jun Tang, Aleksandra Korolova, Xiaolong Bai, Xueqiang Wang, Xiao-Feng Wang
We discover and describe Apple's set-up for differentially private data processing, including the overall data pipeline, the parameters used for differentially private perturbation of each piece of data, and the frequency with which such data is sent to Apple's servers.
no code implementations • 2 May 2017 • Brendan Avent, Aleksandra Korolova, David Zeber, Torgeir Hovden, Benjamin Livshits
We propose a hybrid model of differential privacy that considers a combination of regular and opt-in users who desire the differential privacy guarantees of the local privacy model and the trusted curator model, respectively.
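The sketch below shows the two components of the hybrid model side by side under toy assumptions: a trusted curator adds Laplace noise to the opt-in users' exact count, while regular users randomize locally before reporting. The population sizes and privacy parameter are illustrative, and the actual Blender system combines estimates more carefully (e.g., weighting by variance):

```python
# Hedged sketch of the hybrid DP idea: central DP (Laplace mechanism)
# for opt-in users, local DP (randomized response) for regular users.
import numpy as np

rng = np.random.default_rng(0)
eps = np.log(3.0)  # illustrative privacy parameter for both mechanisms

opt_in = rng.integers(0, 2, size=1_000)    # bits shared with the curator
regular = rng.integers(0, 2, size=9_000)   # bits randomized on-device

# Central model: curator sees raw opt-in data and adds Laplace noise
# calibrated to the sensitivity-1 counting query.
central_count = opt_in.sum() + rng.laplace(scale=1.0 / eps)

# Local model: each regular user reports truthfully w.p. e^eps/(1+e^eps),
# and the server debiases the aggregate of the noisy reports.
p = np.exp(eps) / (1 + np.exp(eps))
reports = np.where(rng.random(regular.size) < p, regular, 1 - regular)
local_count = (reports.sum() - (1 - p) * regular.size) / (2 * p - 1)

# Adding the two unbiased estimates recovers the population-wide count.
estimate = central_count + local_count
print(estimate, opt_in.sum() + regular.sum())
```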
1 code implementation • 25 Jul 2014 • Úlfar Erlingsson, Vasyl Pihur, Aleksandra Korolova
Randomized Aggregatable Privacy-Preserving Ordinal Response, or RAPPOR, is a technology for crowdsourcing statistics from end-user client software, anonymously, with strong privacy guarantees.
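As a rough illustration of RAPPOR's core step (the open-source implementation lives at github.com/google/rappor), the sketch below encodes a string into a Bloom filter and applies the permanent randomized response: each bit is flipped to 1 with probability f/2, to 0 with probability f/2, and kept otherwise. The full protocol also adds instantaneous randomization and cohort-based server-side decoding, which this sketch omits:

```python
# Simplified sketch of RAPPOR's Bloom-filter encoding plus permanent
# randomized response; parameters K, H, F are illustrative.
import hashlib
import random

K, H = 32, 2   # Bloom filter size and number of hash functions
F = 0.5        # longitudinal privacy parameter

def bloom_bits(value: str) -> set[int]:
    # Map a string onto H bit positions of the K-bit Bloom filter.
    return {
        int.from_bytes(hashlib.sha256(f"{i}:{value}".encode()).digest(), "big") % K
        for i in range(H)
    }

def permanent_randomized_response(value: str) -> list[int]:
    bits = bloom_bits(value)
    report = []
    for i in range(K):
        b = 1 if i in bits else 0
        r = random.random()
        if r < F / 2:        # flip to 1 with probability F/2
            report.append(1)
        elif r < F:          # flip to 0 with probability F/2
            report.append(0)
        else:                # keep the true bit with probability 1 - F
            report.append(b)
    return report

print(permanent_randomized_response("example.com"))
```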