no code implementations • 14 May 2024 • Chendi Wang, Yuqing Zhu, Weijie J. Su, Yu-Xiang Wang
A recent study by De et al. (2022) has reported that large-scale representation learning through pre-training on a public dataset significantly enhances differentially private (DP) learning in downstream tasks, despite the high dimensionality of the feature space.
no code implementations • 27 Apr 2024 • Wenzhen Yue, Xianghua Ying, Ruohao Guo, Dongdong Chen, Ji Shi, Bowei Xing, Yuqing Zhu, Taiyan Chen
By focusing the attention on the sub-adjacent areas, we make the reconstruction of anomalies more challenging, thereby enhancing their detectability.
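The sentence above only gestures at the mechanism. As a rough, hypothetical illustration (not the paper's implementation), one way to restrict attention to sub-adjacent positions is a mask that forbids each time step from attending to its immediate neighbourhood, so reconstruction must rely on farther-away context; the function names and the `band` parameter below are assumptions.

```python
import torch

def sub_adjacent_attention_mask(seq_len: int, band: int = 2) -> torch.Tensor:
    """Hypothetical mask: disallow attention to each position's immediate
    neighbourhood (|i - j| <= band), so reconstruction must rely on
    sub-adjacent context only."""
    idx = torch.arange(seq_len)
    dist = (idx[:, None] - idx[None, :]).abs()
    return dist > band  # True where attention is allowed

def masked_attention(q, k, v, band: int = 2):
    # q, k, v: (batch, seq_len, dim)
    scores = q @ k.transpose(-2, -1) / (q.shape[-1] ** 0.5)
    allow = sub_adjacent_attention_mask(q.shape[1], band).to(scores.device)
    scores = scores.masked_fill(~allow, float("-inf"))
    return torch.softmax(scores, dim=-1) @ v
```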
no code implementations • CVPR 2024 • Yuan Gao, Yuqing Zhu, Xinjun Li, Yimin Du, Tianzhu Zhang
To address these challenges, a novel event-based keypoint detection method (SD2Event) is proposed that learns dynamic detectors and contextual descriptors in a self-supervised manner, including a contextual feature descriptor learning (CFDL) module and a dynamic keypoint detector learning (DKDL) module.
no code implementations • 30 Aug 2023 • Jiachen T. Wang, Yuqing Zhu, Yu-Xiang Wang, Ruoxi Jia, Prateek Mittal
Data valuation aims to quantify the usefulness of individual data sources in training machine learning (ML) models, and is a critical aspect of data-centric ML research.
1 code implementation • 12 Jun 2023 • Yuqing Zhu, Xuandong Zhao, Chuan Guo, Yu-Xiang Wang
Most existing approaches to differentially private (DP) machine learning focus on private training.
no code implementations • 31 Dec 2022 • Rachel Redberg, Yuqing Zhu, Yu-Xiang Wang
The "Propose-Test-Release" (PTR) framework is a classic recipe for designing differentially private (DP) algorithms that are data-adaptive, i.e., those that add less noise when the input dataset is nice.
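For background, a minimal sketch of the generic PTR recipe (not this paper's data-adaptive extensions) might look like the following; `dist_to_unsafe` and the other names are hypothetical helpers: propose a sensitivity bound, privately test that the dataset is far from any dataset violating it, and only then release with noise calibrated to the proposed bound.

```python
import numpy as np

def propose_test_release(data, query, proposed_sensitivity, dist_to_unsafe,
                         eps_test, eps_release, delta,
                         rng=np.random.default_rng()):
    """Minimal sketch of the generic PTR recipe (not this paper's specific
    algorithm). `dist_to_unsafe(data)` is an assumed helper returning the
    distance (in number of record changes) from `data` to the nearest dataset
    whose local sensitivity exceeds `proposed_sensitivity`."""
    # Test step: privately check that the dataset is far from the "unsafe" region.
    noisy_dist = dist_to_unsafe(data) + rng.laplace(scale=1.0 / eps_test)
    threshold = np.log(1.0 / (2.0 * delta)) / eps_test
    if noisy_dist <= threshold:
        return None  # abort rather than risk releasing with too little noise
    # Release step: noise calibrated to the proposed (data-dependent) bound.
    return query(data) + rng.laplace(scale=proposed_sensitivity / eps_release)
```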
no code implementations • 30 Mar 2022 • Yuqing Zhu, Yu-Xiang Wang
We provide an end-to-end Renyi DP-based framework for differentially private top-$k$ selection.
1 code implementation • 16 Jun 2021 • Yuqing Zhu, Jinshuo Dong, Yu-Xiang Wang
Characterizing the privacy degradation over compositions, i.e., privacy accounting, is a fundamental topic in differential privacy (DP) with many applications to differentially private machine learning and federated learning.
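For background, a generic Renyi-DP accountant (not the tighter, optimal accounting developed in this paper) can be sketched as below for sensitivity-1 Gaussian releases; the function names and parameters are hypothetical. RDP composes additively over releases and is then converted to an $(\epsilon, \delta)$ guarantee.

```python
import math

def gaussian_rdp(alpha: float, sigma: float) -> float:
    """Renyi DP of the Gaussian mechanism with noise multiplier sigma:
    eps_RDP(alpha) = alpha / (2 * sigma^2) for sensitivity-1 queries."""
    return alpha / (2.0 * sigma ** 2)

def compose_and_convert(sigma: float, steps: int, delta: float,
                        alphas=range(2, 64)) -> float:
    """RDP composes additively over `steps` releases; convert to
    (eps, delta)-DP via eps <= eps_RDP(alpha) + log(1/delta)/(alpha - 1)."""
    best = float("inf")
    for a in alphas:
        total_rdp = steps * gaussian_rdp(a, sigma)
        best = min(best, total_rdp + math.log(1.0 / delta) / (a - 1))
    return best

# Example: 1000 Gaussian releases with sigma = 5 at delta = 1e-5.
print(compose_and_convert(sigma=5.0, steps=1000, delta=1e-5))
```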
no code implementations • 16 Jan 2021 • Mengying Guo, Tao Yi, Yuqing Zhu, Yungang Bao
Although AutoML methods have been applied to the hyperparameter tuning of NE algorithms, the problem of how to tune hyperparameters within a given time budget has not previously been studied for NE algorithms.
no code implementations • NeurIPS 2020 • Yuqing Zhu, Yu-Xiang Wang
The Sparse Vector Technique (SVT) is one of the most fundamental algorithmic tools in differential privacy (DP).
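For context, the textbook AboveThreshold/Sparse variant of SVT (shown below as a generic illustration, not the improved variants analyzed in this paper) answers a stream of sensitivity-1 queries by comparing noisy answers against a noisy threshold and halting after a fixed number of positive responses; the function names are hypothetical.

```python
import numpy as np

def above_threshold(queries, data, threshold, eps, cutoff=1,
                    rng=np.random.default_rng()):
    """Textbook Sparse Vector Technique sketch: report which queries exceed
    the threshold, stopping after `cutoff` positive answers. Each query is
    assumed to have sensitivity 1; the whole run satisfies eps-DP."""
    answers = []
    noisy_threshold = threshold + rng.laplace(scale=2.0 * cutoff / eps)
    found = 0
    for q in queries:
        noisy_answer = q(data) + rng.laplace(scale=4.0 * cutoff / eps)
        if noisy_answer >= noisy_threshold:
            answers.append("above")
            found += 1
            if found >= cutoff:
                break
            # Re-randomize the threshold after each positive answer.
            noisy_threshold = threshold + rng.laplace(scale=2.0 * cutoff / eps)
        else:
            answers.append("below")
    return answers
```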
no code implementations • 6 Nov 2020 • Chong Liu, Yuqing Zhu, Kamalika Chaudhuri, Yu-Xiang Wang
The Private Aggregation of Teacher Ensembles (PATE) framework is one of the most promising recent approaches in differentially private learning.
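As background only, the basic Laplace "noisy max" aggregation at the core of the original PATE papers can be sketched as follows (this is not the extension studied in this paper); the noise calibration shown is one common choice and the function names are hypothetical.

```python
import numpy as np

def pate_noisy_argmax(teacher_votes, num_classes, eps,
                      rng=np.random.default_rng()):
    """Minimal PATE aggregation sketch: each teacher, trained on a disjoint
    partition of the private data, votes for a label; Laplace noise is added
    to the vote histogram before taking the argmax. The scale 2/eps is one
    common per-query calibration; exact accounting differs across variants."""
    counts = np.bincount(teacher_votes, minlength=num_classes).astype(float)
    counts += rng.laplace(scale=2.0 / eps, size=num_classes)
    return int(np.argmax(counts))

# Example: 250 teachers voting over 10 classes for one student query.
votes = np.random.default_rng(0).integers(0, 10, size=250)
print(pate_noisy_argmax(votes, num_classes=10, eps=2.0))
```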
no code implementations • 9 Oct 2020 • Yuqing Zhu, Xiang Yu, Yi-Hsuan Tsai, Francesco Pittaluga, Masoud Faraki, Manmohan Chandraker, Yu-Xiang Wang
Differentially Private Federated Learning (DPFL) is an emerging field with many applications.
no code implementations • CVPR 2020 • Yuqing Zhu, Xiang Yu, Manmohan Chandraker, Yu-Xiang Wang
With increasing ethical and legal concerns about privacy for deep models in visual recognition, differential privacy has emerged as a mechanism to disguise the membership of sensitive data in training datasets.
no code implementations • 12 Oct 2019 • Yuqing Zhu, Jianxun Liu
Performance tuning can improve system performance and thus reduce the cloud computing resources needed to support an application.
1 code implementation • 10 Oct 2017 • Yuqing Zhu, Jianxun Liu, Mengying Guo, Yungang Bao, Wenlong Ma, Zhuoyue Liu, Kunpeng Song, Yingchun Yang
To help users tap the performance potential of systems, we present BestConfig, a system for automatically finding a best configuration setting within a resource limit for a deployed system under a given application workload.
Performance • Databases • Distributed, Parallel, and Cluster Computing • Software Engineering
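As a rough illustration of budget-limited configuration search (a plain random sampler, not BestConfig's own sampling-and-search algorithm), one might write something like the sketch below; `param_space` and `benchmark` are hypothetical placeholders for the parameter ranges and the deployed-system measurement step.

```python
import random

def search_best_config(param_space, benchmark, budget):
    """Illustrative budget-limited configuration search: try at most
    `budget` sampled settings and keep the best one. `benchmark(config)`
    is an assumed helper that deploys the setting under the target
    workload and returns the measured performance (higher is better)."""
    best_config, best_perf = None, float("-inf")
    for _ in range(budget):
        config = {name: random.choice(values)
                  for name, values in param_space.items()}
        perf = benchmark(config)
        if perf > best_perf:
            best_config, best_perf = config, perf
    return best_config, best_perf
```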
1 code implementation • 4 Aug 2017 • Yuqing Zhu, Jianxun Liu, Mengying Guo, Wenlong Ma, Yungang Bao
To support the variety of Big Data use cases, many Big Data related systems expose a large number of user-specifiable configuration parameters.
Distributed, Parallel, and Cluster Computing