Search Results for author: Xiaoyun Li

Found 28 papers, 0 papers with code

Stochastic Controlled Averaging for Federated Learning with Communication Compression

no code implementations16 Aug 2023 Xinmeng Huang, Ping Li, Xiaoyun Li

The existing approaches either cannot accommodate arbitrary data heterogeneity or partial participation, or require stringent conditions on compression.

Federated Learning

CSPM: A Contrastive Spatiotemporal Preference Model for CTR Prediction in On-Demand Food Delivery Services

no code implementations10 Aug 2023 Guyu Jiang, Xiaoyun Li, Rongrong Jing, Ruoqi Zhao, Xingliang Ni, Guodong Cao, Ning Hu

Click-through rate (CTR) prediction is a crucial task in the context of an online on-demand food delivery (OFD) platform for precisely estimating the probability of a user clicking on food items.

Click-Through Rate Prediction, Contrastive Learning, +1

Differentially Private One Permutation Hashing and Bin-wise Consistent Weighted Sampling

no code implementations13 Jun 2023 Xiaoyun Li, Ping Li

Minwise hashing (MinHash) is a standard algorithm widely used in industry for large-scale search and learning applications with the binary (0/1) Jaccard similarity.
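
For background, here is a minimal sketch of classical MinHash (not the differentially private OPH or bin-wise CWS variants proposed in this paper), assuming K independent permutations of the coordinate range; the fraction of matching hash values estimates the Jaccard similarity. All sizes and names below are illustrative.

```python
import numpy as np

def minhash(indices, perms):
    """MinHash signature of a set given by its nonzero coordinate indices."""
    # For each permutation (row of perms), the hash is the smallest permuted index.
    return perms[:, indices].min(axis=1)

rng = np.random.default_rng(0)
D, K = 1000, 256
perms = np.array([rng.permutation(D) for _ in range(K)])   # K independent permutations

a = rng.choice(D, size=80, replace=False)                                        # set A
b = np.unique(np.concatenate([a[:40], rng.choice(D, size=40, replace=False)]))   # set B

est = (minhash(a, perms) == minhash(b, perms)).mean()
exact = len(np.intersect1d(a, b)) / len(np.union1d(a, b))
print(f"estimated Jaccard {est:.3f} vs exact {exact:.3f}")
```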

Differential Privacy with Random Projections and Sign Random Projections

no code implementations22 May 2023 Ping Li, Xiaoyun Li

Among the presented algorithms, iDP-SignRP is remarkably effective under the setting of ``individual differential privacy'' (iDP), based on sign random projections (SignRP).

Information Retrieval, Quantization, +1
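
As context for SignRP, a small illustration of plain sign random projections, without any of the (i)DP mechanisms studied in the paper; the matrix, dimensions, and variable names are ours. The collision rate of the signs recovers the angle between two vectors.

```python
import numpy as np

rng = np.random.default_rng(1)
D, k = 512, 2048                          # data dimension, number of projections

W = rng.standard_normal((k, D))           # Gaussian random projection matrix
x = rng.standard_normal(D)
y = 0.8 * x + 0.2 * rng.standard_normal(D)   # a correlated pair

sx, sy = np.sign(W @ x), np.sign(W @ y)
collision = (sx == sy).mean()             # for sign random projections, P(collision) = 1 - theta/pi
theta_hat = np.pi * (1.0 - collision)     # estimated angle between x and y
cos_true = x @ y / (np.linalg.norm(x) * np.linalg.norm(y))
print(f"cosine from signs {np.cos(theta_hat):.3f} vs exact {cos_true:.3f}")
```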

Building K-Anonymous User Cohorts with Consecutive Consistent Weighted Sampling (CCWS)

no code implementations26 Apr 2023 Xinyi Zheng, Weijie Zhao, Xiaoyun Li, Ping Li

To retrieve personalized campaigns and creatives while protecting user privacy, digital advertising is shifting from member-based identity to cohort-based identity.

OPORP: One Permutation + One Random Projection

no code implementations7 Feb 2023 Ping Li, Xiaoyun Li

We show that the estimation variance is essentially: $(s-1)A + \frac{D-k}{D-1}\frac{1}{k}\left[ (1-\rho^2)^2 -2A\right]$, where $A\geq 0$ is a function of the data ($u, v$).

Retrieval
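
A rough sketch of how a "one permutation + one random projection" estimator can be read from the title, using dense ±1 projection entries (roughly the s = 1 case) and fixed-length bins; this is our illustrative interpretation and omits the paper's normalization options and variance analysis.

```python
import numpy as np

rng = np.random.default_rng(2)
D, k = 1024, 64                           # D is a multiple of k (fixed-length bins)

perm = rng.permutation(D)                 # the single permutation
signs = rng.choice([-1.0, 1.0], size=D)   # the single +/-1 projection vector

def oporp(x):
    z = (x[perm] * signs).reshape(k, D // k)   # permute, apply signs, cut into k bins
    return z.sum(axis=1)                       # one aggregated value per bin

x = rng.standard_normal(D)
y = 0.7 * x + 0.3 * rng.standard_normal(D)

zx, zy = oporp(x), oporp(y)
rho_hat = zx @ zy / (np.linalg.norm(zx) * np.linalg.norm(zy))   # normalized estimate
rho = x @ y / (np.linalg.norm(x) * np.linalg.norm(y))
print(f"estimated cosine {rho_hat:.3f} vs exact {rho:.3f}")
```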

Analysis of Error Feedback in Federated Non-Convex Optimization with Biased Compression

no code implementations25 Nov 2022 Xiaoyun Li, Ping Li

Moreover, we develop a new analysis of error feedback (EF) under partial client participation, which is an important scenario in FL.

Federated Learning
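
To illustrate the error feedback (EF) mechanism studied here, the toy loop below applies a biased top-k compressor to gradients and keeps the compression residual in memory; it is a single-worker caricature with made-up sizes, not the federated algorithm analyzed in the paper.

```python
import numpy as np

def topk(v, k):
    """Biased compressor: keep the k largest-magnitude entries, zero the rest."""
    out = np.zeros_like(v)
    keep = np.argsort(np.abs(v))[-k:]
    out[keep] = v[keep]
    return out

rng = np.random.default_rng(3)
A, b = rng.standard_normal((200, 50)), rng.standard_normal(200)
x, e = np.zeros(50), np.zeros(50)         # iterate and error-feedback memory
lr, k = 0.1, 5

for _ in range(500):
    g = A.T @ (A @ x - b) / len(b)        # least-squares gradient
    c = topk(g + e, k)                    # compress gradient plus accumulated residual
    e = g + e - c                         # remember what the compressor dropped
    x -= lr * c                           # update with the compressed direction
print("final least-squares loss:", 0.5 * np.mean((A @ x - b) ** 2))
```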

$k$-Median Clustering via Metric Embedding: Towards Better Initialization with Differential Privacy

no code implementations26 Jun 2022 Chenglin Fan, Ping Li, Xiaoyun Li

When designing clustering algorithms, the choice of initial centers is crucial for the quality of the learned clusters.

Clustering

On Distributed Adaptive Optimization with Gradient Compression

no code implementations ICLR 2022 Xiaoyun Li, Belhal Karimi, Ping Li

We study COMP-AMS, a distributed optimization framework based on gradient averaging and the adaptive AMSGrad algorithm.

Distributed Optimization
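
A stripped-down sketch of the two ingredients named in the abstract: workers send compressed stochastic gradients, and the server averages them and applies an AMSGrad-style update. The toy objective and the scaled-sign compressor are our own stand-ins, not the paper's choices.

```python
import numpy as np

rng = np.random.default_rng(4)
dim, n_workers, lr = 20, 4, 0.05
beta1, beta2, eps = 0.9, 0.99, 1e-8
target = rng.standard_normal(dim)         # minimizer of the toy objective

def sign_compress(g):                     # stand-in compressor: scaled sign
    return np.sign(g) * np.mean(np.abs(g))

x = np.zeros(dim)
m, v, v_hat = np.zeros(dim), np.zeros(dim), np.zeros(dim)

for _ in range(300):
    # each worker sends a compressed noisy gradient of 0.5 * ||x - target||^2
    grads = [sign_compress((x - target) + 0.1 * rng.standard_normal(dim))
             for _ in range(n_workers)]
    g = np.mean(grads, axis=0)            # server-side gradient averaging
    m = beta1 * m + (1 - beta1) * g       # first moment
    v = beta2 * v + (1 - beta2) * g * g   # second moment
    v_hat = np.maximum(v_hat, v)          # the max step that makes this AMSGrad, not Adam
    x -= lr * m / (np.sqrt(v_hat) + eps)
print("distance to optimum:", np.linalg.norm(x - target))
```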

Communication-Efficient TeraByte-Scale Model Training Framework for Online Advertising

no code implementations5 Jan 2022 Weijie Zhao, Xuewu Jiao, Mingqing Hu, Xiaoyun Li, Xiangyu Zhang, Ping Li

In this paper, we propose a hardware-aware training workflow that couples the hardware topology into the algorithm design.

Click-Through Rate Prediction

C-OPH: Improving the Accuracy of One Permutation Hashing (OPH) with Circulant Permutations

no code implementations18 Nov 2021 Xiaoyun Li, Ping Li

Note that C-MinHash is different from the well-known work on "One Permutation Hashing (OPH)" published in NIPS'12.

Layer-wise and Dimension-wise Locally Adaptive Federated Learning

no code implementations1 Oct 2021 Belhal Karimi, Ping Li, Xiaoyun Li

In the emerging paradigm of Federated Learning (FL), a large number of clients, such as mobile devices, are used to train possibly high-dimensional models on their respective data.

Federated Learning

k-Median Clustering via Metric Embedding: Towards Better Initialization with Privacy

no code implementations29 Sep 2021 Chenglin Fan, Ping Li, Xiaoyun Li

Our method, named the HST initialization, can also be easily extended to the setting of differential privacy (DP) to generate private initial centers.

Clustering

Revisiting Locality-Sensitive Binary Codes from Random Fourier Features

no code implementations29 Sep 2021 Xiaoyun Li, Ping Li

We show the locality-sensitivity of SignRFF, and propose a new measure, called ranking efficiency, to theoretically compare different Locality-Sensitive Hashing (LSH) methods with practical implications.

Information Retrieval, Quantization, +1
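
As background for the construction in the title, the snippet below takes the sign of random Fourier features to form binary codes and compares pairs by Hamming distance; the ranking-efficiency measure proposed in the paper is not reproduced, and the kernel width and code length are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(5)
D, m, gamma = 64, 1024, 0.5               # input dim, code length, kernel exp(-gamma ||x-y||^2)

W = rng.standard_normal((m, D)) * np.sqrt(2 * gamma)   # RFF frequencies for that kernel
b = rng.uniform(0, 2 * np.pi, m)

def sign_rff_code(x):
    return np.sign(np.cos(W @ x + b))     # binary code: sign of the random Fourier features

x = rng.standard_normal(D)
y_close = x + 0.1 * rng.standard_normal(D)
y_far = rng.standard_normal(D)

cx = sign_rff_code(x)
ham_close = np.mean(cx != sign_rff_code(y_close))      # normalized Hamming distance
ham_far = np.mean(cx != sign_rff_code(y_far))
print(f"Hamming distance: close pair {ham_close:.3f}, far pair {ham_far:.3f}")
```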

C-MinHash: Improving Minwise Hashing with Circulant Permutation

no code implementations29 Sep 2021 Xiaoyun Li, Ping Li

Minwise hashing (MinHash) is an important and practical algorithm for generating random hashes to approximate the Jaccard (resemblance) similarity in massive binary (0/1) data.

Toward Communication Efficient Adaptive Gradient Method

no code implementations10 Sep 2021 Xiangyi Chen, Xiaoyun Li, Ping Li

While adaptive gradient methods have been proven effective for training neural nets, the study of adaptive gradient methods in federated learning is scarce.

BIG-bench Machine Learning, Distributed Optimization, +1

C-MinHash: Practically Reducing Two Permutations to Just One

no code implementations10 Sep 2021 Xiaoyun Li, Ping Li

That is, one single permutation is used for both the initial pre-processing step to break the structures in the data and the circulant hashing step to generate $K$ hashes.

Vocal Bursts Valence Prediction
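
A rough illustration of the circulant idea described above, with our own simplifications: one permutation is applied, and K hash values come from circular shifts of the permuted values rather than K independent permutations. It does not reproduce the paper's construction details or proofs.

```python
import numpy as np

rng = np.random.default_rng(6)
D, K = 1000, 128
sigma = rng.permutation(D)                # the single permutation

def c_minhash(indices):
    """K hashes from one permutation via circular shifts of the permuted values."""
    base = sigma[indices]                 # permuted locations of the set's elements
    return np.array([np.min((base + k) % D) for k in range(K)])

a = rng.choice(D, size=100, replace=False)
b = np.unique(np.concatenate([a[:60], rng.choice(D, size=40, replace=False)]))

est = (c_minhash(a) == c_minhash(b)).mean()
exact = len(np.intersect1d(a, b)) / len(np.union1d(a, b))
print(f"estimated Jaccard {est:.3f} vs exact {exact:.3f}")
```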

C-MinHash: Rigorously Reducing $K$ Permutations to Two

no code implementations7 Sep 2021 Xiaoyun Li, Ping Li

Unlike classical MinHash, these $K$ hashes are obviously correlated, but we are able to provide rigorous proofs that we still obtain an unbiased estimate of the Jaccard similarity and the theoretical variance is uniformly smaller than that of the classical MinHash with $K$ independent permutations.

Vocal Bursts Valence Prediction

Quantization Algorithms for Random Fourier Features

no code implementations25 Feb 2021 Xiaoyun Li, Ping Li

Closely related to random projections (RP), the method of random Fourier features (RFF) has also become popular for approximating the Gaussian kernel.

Dimensionality Reduction, Quantization
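
For reference, the basic (unquantized) random Fourier feature map for the Gaussian kernel looks roughly like this; the quantization schemes analyzed in the paper are not shown, and the bandwidth and feature count are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(7)
D, m, sigma = 32, 4096, 1.0               # input dim, number of features, kernel bandwidth

W = rng.standard_normal((m, D)) / sigma   # frequencies for exp(-||x-y||^2 / (2 sigma^2))
b = rng.uniform(0, 2 * np.pi, m)

def rff(x):
    # feature map whose inner products approximate the Gaussian kernel
    return np.sqrt(2.0 / m) * np.cos(W @ x + b)

x = rng.standard_normal(D)
y = x + 0.2 * rng.standard_normal(D)

approx = rff(x) @ rff(y)
exact = np.exp(-np.linalg.norm(x - y) ** 2 / (2 * sigma ** 2))
print(f"RFF kernel estimate {approx:.3f} vs exact {exact:.3f}")
```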

FedSKETCH: Communication-Efficient and Private Federated Learning via Sketching

no code implementations11 Aug 2020 Farzin Haddadpour, Belhal Karimi, Ping Li, Xiaoyun Li

Communication complexity and privacy are the two key challenges in Federated Learning, where the goal is to perform distributed learning across a large number of devices.

Federated Learning
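
To give a concrete feel for the sketching primitive this line of work builds on, the snippet compresses a sparse vector with a count sketch and recovers its heavy coordinates by a median decode; the actual FedSKETCH protocols and their privacy guarantees are more involved and are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(8)
D, rows, cols = 10_000, 5, 200            # original dimension, sketch shape

bucket = rng.integers(0, cols, size=(rows, D))   # bucket hash of each coordinate, per row
sign = rng.choice([-1, 1], size=(rows, D))       # sign hash of each coordinate, per row

g = np.zeros(D)
g[rng.choice(D, size=20, replace=False)] = 10 * rng.standard_normal(20)  # sparse "gradient"

# compress: every row accumulates signed coordinate values into its buckets
S = np.zeros((rows, cols))
for r in range(rows):
    np.add.at(S[r], bucket[r], sign[r] * g)

# decompress: median across rows of the signed bucket values
est = np.median(S[np.arange(rows)[:, None], bucket] * sign, axis=0)
top = np.argsort(np.abs(g))[-5:]
print("largest true coordinates :", np.round(g[top], 2))
print("recovered from the sketch:", np.round(est[top], 2))
```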

Randomized Kernel Multi-view Discriminant Analysis

no code implementations2 Apr 2020 Xiaoyun Li, Jie Gui, Ping Li

In this paper, we propose the kernel version of multi-view discriminant analysis, called kernel multi-view discriminant analysis (KMvDA).

Object Recognition

Random Projections with Asymmetric Quantization

no code implementations NeurIPS 2019 Xiaoyun Li, Ping Li

The method of random projection has been a popular tool for data compression, similarity search, and machine learning.

Data Compression, Quantization

Generalization Error Analysis of Quantized Compressive Learning

no code implementations NeurIPS 2019 Xiaoyun Li, Ping Li

In this paper, we consider the learning problem where the projected data is further compressed by scalar quantization, which is called quantized compressive learning.

Quantization
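
To make the quantized compressive pipeline concrete, the toy example below projects data with a Gaussian random matrix, applies a uniform scalar quantizer to the projections, and fits a simple classifier; the learners and generalization analysis in the paper are not captured by this sketch, and all sizes are invented.

```python
import numpy as np

rng = np.random.default_rng(9)
n, D, k, step = 200, 100, 20, 0.5         # samples, input dim, projected dim, quantizer step

X = rng.standard_normal((n, D))
w_true = rng.standard_normal(D)
y = np.sign(X @ w_true)                   # labels from a linear rule

R = rng.standard_normal((D, k)) / np.sqrt(k)   # Gaussian random projection matrix
Zq = step * np.round((X @ R) / step)           # scalar (uniform) quantization of projections

w = np.linalg.lstsq(Zq, y, rcond=None)[0]      # least-squares classifier on quantized data
acc = np.mean(np.sign(Zq @ w) == y)
print(f"training accuracy on quantized projections: {acc:.2f}")
```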

Zeroth Order Optimization by a Mixture of Evolution Strategies

no code implementations25 Sep 2019 Jun-Kun Wang, Xiaoyun Li, Ping Li

Perhaps the only methods that enjoy convergence guarantees are the ones that sample the perturbed points uniformly from a unit sphere or from a multivariate Gaussian distribution with an isotropic covariance.
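
For context, a common zeroth-order gradient estimator perturbs the current point with Gaussian noise and uses only function-value differences, as sketched below; the mixture of evolution strategies proposed in the paper is not shown, and the objective and step sizes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(10)

def f(x):                                  # black-box objective: only function values
    return np.sum((x - 1.0) ** 2)

def zo_gradient(x, mu=1e-3, samples=50):
    """Gaussian-smoothing gradient estimate built from finite differences."""
    g = np.zeros_like(x)
    for _ in range(samples):
        u = rng.standard_normal(x.shape)
        g += (f(x + mu * u) - f(x)) / mu * u
    return g / samples

x = np.zeros(10)
for _ in range(200):
    x -= 0.05 * zo_gradient(x)
print("f(x) after zeroth-order descent:", f(x))
```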

An Optimistic Acceleration of AMSGrad for Nonconvex Optimization

no code implementations ICLR 2020 Jun-Kun Wang, Xiaoyun Li, Belhal Karimi, Ping Li

We propose a new variant of AMSGrad, a popular adaptive gradient based optimization algorithm widely used for training deep neural networks.
