Search Results for author: Changchang Liu

Found 6 papers, 2 papers with code

Federated Learning for Semantic Parsing: Task Formulation, Evaluation Setup, New Algorithms

1 code implementation · 26 May 2023 · Tianshu Zhang, Changchang Liu, Wei-Han Lee, Yu Su, Huan Sun

By leveraging data from multiple clients, the FL paradigm can be especially beneficial for clients that have too little training data to develop a data-hungry neural semantic parser on their own.

Tasks: Federated Learning, Semantic Parsing, +1
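The FL setup this paper builds on aggregates client-side model updates on a server. A minimal sketch of the standard FedAvg aggregation rule is below; this is illustrative background, not the paper's new algorithms, and all names are made up for the example.

```python
import numpy as np

def fedavg(client_params, client_sizes):
    """Dataset-size-weighted average of client parameter vectors
    (the standard FedAvg rule; illustrative, not the paper's method)."""
    total = sum(client_sizes)
    return sum(p * (n / total) for p, n in zip(client_params, client_sizes))

# Three hypothetical clients with local parameter vectors and dataset sizes.
params = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
sizes = [10, 30, 60]
global_params = fedavg(params, sizes)  # ≈ array([4., 5.])
```

Clients with more data pull the global model harder, which is exactly why data-poor clients benefit from joining the federation.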

Joint Coreset Construction and Quantization for Distributed Machine Learning

no code implementations · 13 Apr 2022 · Hanlin Lu, Changchang Liu, Shiqiang Wang, Ting He, Vijay Narayanan, Kevin S. Chan, Stephen Pasteris

Coresets are small, weighted summaries of larger datasets that aim to provide provable error bounds for machine learning (ML) tasks while significantly reducing communication and computation costs.

Tasks: BIG-bench Machine Learning, Quantization
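The coreset idea in the abstract — a small weighted summary whose weighted statistics approximate the full dataset — can be illustrated with a toy uniform-sampling coreset plus a crude scalar quantizer, mimicking the joint construction. This is a sketch under simplifying assumptions, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def uniform_coreset(X, m):
    """Sample m points uniformly without replacement, weighting each by
    n/m so weighted sums are unbiased estimates of full-data sums."""
    n = len(X)
    idx = rng.choice(n, size=m, replace=False)
    return X[idx], np.full(m, n / m)

def quantize(S, step=0.1):
    """Uniform scalar quantizer: further shrinks what must be transmitted."""
    return np.round(S / step) * step

X = rng.normal(size=(10000, 5))
S, w = uniform_coreset(X, 400)
Sq = quantize(S)
# The weighted mean of the quantized coreset approximates the full-data mean.
approx_mean = (w[:, None] * Sq).sum(axis=0) / w.sum()
```

Transmitting 400 quantized points instead of 10,000 raw ones is the communication saving; the weights keep the downstream estimates (approximately) unbiased.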

Communication-efficient k-Means for Edge-based Machine Learning

no code implementations · 8 Feb 2021 · Hanlin Lu, Ting He, Shiqiang Wang, Changchang Liu, Mehrdad Mahdavi, Vijaykrishnan Narayanan, Kevin S. Chan, Stephen Pasteris

We consider the problem of computing the k-means centers for a large high-dimensional dataset in the context of edge-based machine learning, where data sources offload machine learning computation to nearby edge servers.

Tasks: BIG-bench Machine Learning, Dimensionality Reduction, +1
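The underlying computation here is ordinary Lloyd-style k-means; the paper's contribution is making it communication-efficient, not the clustering loop itself. For context, a bare-bones Lloyd iteration (purely illustrative) looks like this:

```python
import numpy as np

def lloyd_kmeans(X, init_centers, iters=20):
    """Plain Lloyd's algorithm: alternate nearest-center assignment and
    center recomputation. In the edge setting, servers would work on
    small data summaries rather than raw offloaded data."""
    centers = init_centers.astype(float).copy()
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(len(centers)):
            pts = X[labels == j]
            if len(pts):
                centers[j] = pts.mean(axis=0)
    return centers, labels

# Two well-separated synthetic clusters; init with one point from each.
rng = np.random.default_rng(1)
A = rng.normal([0.0, 0.0], 0.5, size=(100, 2))
B = rng.normal([10.0, 10.0], 0.5, size=(100, 2))
X = np.vstack([A, B])
centers, labels = lloyd_kmeans(X, init_centers=X[[0, 100]])
```

Each assignment step needs pairwise point-to-center distances over the full dataset, which is exactly the cost that motivates sending compact summaries to the edge server instead of the raw high-dimensional points.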

Sharing Models or Coresets: A Study based on Membership Inference Attack

no code implementations · 6 Jul 2020 · Hanlin Lu, Changchang Liu, Ting He, Shiqiang Wang, Kevin S. Chan

Distributed machine learning generally aims to train a global model on distributed data without collecting all the data in a centralized location. Two different approaches have been proposed: collecting and aggregating local models (federated learning), and collecting and training over representative data summaries (coresets).

Tasks: Federated Learning, Inference Attack, +1
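The yardstick used to compare the two sharing options is a membership inference attack. Its simplest generic form is a loss threshold — not necessarily the attack instantiated in the paper, and the losses below are hypothetical:

```python
import numpy as np

def loss_threshold_mia(losses, threshold):
    """Predict 'member' for samples whose loss under the shared artifact is
    below a threshold: models typically fit their training data better
    than unseen data."""
    return losses < threshold

# Toy demo with hypothetical per-sample losses.
member_losses = np.array([0.05, 0.12, 0.30])
nonmember_losses = np.array([0.90, 1.40, 2.10])
preds_members = loss_threshold_mia(member_losses, threshold=0.5)
preds_nonmembers = loss_threshold_mia(nonmember_losses, threshold=0.5)
```

If the attacker separates members from non-members much better than chance, the shared artifact — model or coreset — is leaking membership information.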

Overcoming Noisy and Irrelevant Data in Federated Learning

no code implementations · 22 Jan 2020 · Tiffany Tuor, Shiqiang Wang, Bong Jun Ko, Changchang Liu, Kin K. Leung

A challenge is that, among the large variety of data collected at each client, likely only a subset is relevant to a given learning task, while the rest of the data has a negative impact on model training.

Tasks: Federated Learning
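A common recipe for this setting is to score each local sample with a reference model and keep only the low-loss fraction for training; this sketch approximates the idea in the abstract, though the paper's exact selection procedure may differ, and the loss values are hypothetical.

```python
import numpy as np

def keep_relevant(losses, keep_fraction=0.8):
    """Return indices of the lowest-loss samples; high-loss samples are
    treated as noisy or irrelevant and excluded from local training."""
    k = int(len(losses) * keep_fraction)
    return np.sort(np.argsort(losses)[:k])

# Hypothetical per-sample losses from a reference model at one client.
losses = np.array([0.1, 3.2, 0.4, 0.2, 5.0])
idx = keep_relevant(losses, keep_fraction=0.6)  # -> array([0, 2, 3])
```

Each client can run this filter locally before federated training, so irrelevant data never influences the global model and never leaves the device.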

Coupling Random Orthonormal Projection with Gaussian Generative Model for Non-Interactive Private Data Release

1 code implementation · 31 Aug 2017 · Thee Chanyaswad, Changchang Liu, Prateek Mittal

A key challenge in designing differentially private mechanisms for the non-interactive setting is maintaining the utility of the released data.

Category: Cryptography and Security
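The pipeline named in the title can be sketched as: project the data onto a random orthonormal basis, fit a Gaussian generative model in the projected space, and release samples from that model. The sketch below omits the calibrated noise a differentially private release would add, and is an assumption-laden illustration rather than the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(2)

def random_orthonormal_basis(d, k):
    """k orthonormal directions in R^d via QR of a Gaussian random matrix."""
    Q, _ = np.linalg.qr(rng.normal(size=(d, k)))
    return Q  # shape (d, k), columns orthonormal

def gaussian_release(X, k):
    """Fit a Gaussian to the projected data and sample synthetic records.
    A DP version would add calibrated noise to the mean and covariance."""
    P = random_orthonormal_basis(X.shape[1], k)
    Z = X @ P
    mu, cov = Z.mean(axis=0), np.cov(Z, rowvar=False)
    return rng.multivariate_normal(mu, cov, size=len(X))

X = rng.normal(size=(500, 10))
synthetic = gaussian_release(X, k=3)  # 500 synthetic 3-dim records
```

Because only the low-dimensional mean and covariance are estimated (and, in the DP version, perturbed), the released synthetic data can retain useful structure while the non-interactive release happens once, up front.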
