Search Results for author: Lili Su

Found 23 papers, 2 papers with code

Efficient Federated Learning against Heterogeneous and Non-stationary Client Unavailability

no code implementations · 26 Sep 2024 · Ming Xiang, Stratis Ioannidis, Edmund Yeh, Carlee Joe-Wong, Lili Su

Addressing intermittent client availability is critical for the real-world deployment of federated learning algorithms.

Federated Learning

Data-efficient Trajectory Prediction via Coreset Selection

no code implementations · 25 Sep 2024 · Ruining Yang, Lili Su

In this paper, to mitigate data redundancy in the over-represented driving scenarios and to reduce the bias rooted in the data scarcity of complex ones, we propose a novel data-efficient training method based on coreset selection.

motion prediction Trajectory Prediction
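
A minimal sketch of coreset selection in this spirit: greedy k-center (farthest-point) selection over per-trajectory feature vectors. The feature construction and the budget `k` are assumptions for illustration, not the paper's selection rule.

```python
import numpy as np

def k_center_coreset(feats, k, seed=0):
    """Greedy farthest-point selection, a classic coreset heuristic.

    feats: (n, d) array of per-trajectory features (here: flattened waypoints).
    Returns indices of k trajectories that spread out over the feature space,
    down-weighting over-represented scenarios while keeping rare ones.
    """
    rng = np.random.default_rng(seed)
    chosen = [int(rng.integers(feats.shape[0]))]
    d2 = np.linalg.norm(feats - feats[chosen[0]], axis=1)
    for _ in range(k - 1):
        nxt = int(d2.argmax())                 # farthest point from the current coreset
        chosen.append(nxt)
        d2 = np.minimum(d2, np.linalg.norm(feats - feats[nxt], axis=1))
    return chosen

# Toy usage: 1000 trajectories of 12 waypoints in 2-D, kept at a 50-trajectory budget.
trajs = np.random.default_rng(2).normal(size=(1000, 12, 2))
idx = k_center_coreset(trajs.reshape(1000, -1), k=50)
```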

Building Real-time Awareness of Out-of-distribution in Trajectory Prediction for Autonomous Vehicles

no code implementations · 25 Sep 2024 · Tongfei Guo, Taposh Banerjee, Rui Liu, Lili Su

Trajectory prediction describes the motions of surrounding moving obstacles for an autonomous vehicle; it plays a crucial role in enabling timely decision-making, such as collision avoidance and trajectory replanning.

Autonomous Vehicles Change Point Detection +5
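
Given the Change Point Detection tag, one plausible reading (purely illustrative; not confirmed as the paper's method) is a sequential detector over prediction residuals. A minimal one-sided CUSUM sketch, with `drift` and `threshold` as assumed tuning parameters:

```python
import numpy as np

def cusum_alarm(residuals, drift=0.5, threshold=5.0):
    """Return the first time the cumulative excess of residuals over `drift`
    crosses `threshold`, or None if it never does."""
    s = 0.0
    for t, r in enumerate(residuals):
        s = max(0.0, s + r - drift)
        if s > threshold:
            return t
    return None

rng = np.random.default_rng(7)
res = np.concatenate([rng.normal(0.2, 0.3, 200),   # in-distribution prediction errors
                      rng.normal(1.5, 0.3, 50)])   # OOD segment: errors jump at t=200
print(cusum_alarm(res))                            # alarms shortly after t=200
```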

Collaborative Learning with Shared Linear Representations: Statistical Rates and Optimal Algorithms

no code implementations · 7 Sep 2024 · Xiaochun Niu, Lili Su, Jiaming Xu, Pengkun Yang

In this paper, we identify the optimal statistical rate when clients share a common low-dimensional linear representation.

Transfer Learning
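
One standard way to formalize this setting (the notation below is a gloss, not necessarily the paper's): client $i$'s responses follow a linear model whose parameter lies in a $k$-dimensional subspace shared by all clients,

$$y_{i,j} \;=\; x_{i,j}^\top B^\star w_i^\star + \epsilon_{i,j}, \qquad B^\star \in \mathbb{R}^{d \times k},\quad w_i^\star \in \mathbb{R}^{k},\quad k \ll d,$$

where the representation $B^\star$ is common across clients and only the low-dimensional heads $w_i^\star$ are client-specific.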

On the Convergence Rates of Federated Q-Learning across Heterogeneous Environments

no code implementations · 5 Sep 2024 · Muxing Wang, Pengkun Yang, Lili Su

We prove that, for a wide range of stepsizes, the $\ell_{\infty}$ norm of the error cannot decay faster than $\Theta (E/T)$.

Q-Learning
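
A minimal simulation of the setting, under the assumption that $E$ counts local updates between synchronizations and $T$ the total number of iterations; the $\Theta (E/T)$ floor then reflects the bias introduced by averaging Q-tables trained on heterogeneous environments:

```python
import numpy as np

rng = np.random.default_rng(1)
S, A, K, E, R, gamma, lr = 4, 2, 5, 10, 200, 0.9, 0.1  # T = E * R total iterations

# Each agent has its own transitions and rewards: environment heterogeneity.
P = rng.dirichlet(np.ones(S), size=(K, S, A))   # P[k, s, a] is a distribution over next states
rwd = rng.uniform(size=(K, S, A))

Q = np.zeros((S, A))                            # shared Q-table, synchronized every E steps
for _ in range(R):
    local_tables = []
    for k in range(K):
        Qk, s = Q.copy(), 0
        for _ in range(E):                      # E local Q-learning updates
            a = int(rng.integers(A))            # uniform exploration policy
            s2 = rng.choice(S, p=P[k, s, a])
            Qk[s, a] += lr * (rwd[k, s, a] + gamma * Qk[s2].max() - Qk[s, a])
            s = s2
        local_tables.append(Qk)
    Q = np.mean(local_tables, axis=0)           # server averages the local Q-tables
```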

Fair Concurrent Training of Multiple Models in Federated Learning

no code implementations · 22 Apr 2024 · Marie Siew, Haoran Zhang, Jong-Ik Park, Yuezhou Liu, Yichen Ruan, Lili Su, Stratis Ioannidis, Edmund Yeh, Carlee Joe-Wong

We show how our fairness-based learning and incentive mechanisms impact training convergence, and we evaluate our algorithm with multiple sets of learning tasks on real-world datasets.

Fairness Federated Learning

Empowering Federated Learning with Implicit Gossiping: Mitigating Connection Unreliability Amidst Unknown and Arbitrary Dynamics

no code implementations · 15 Apr 2024 · Ming Xiang, Stratis Ioannidis, Edmund Yeh, Carlee Joe-Wong, Lili Su

A federated learning system consists of a parameter server and a possibly large collection of clients (e.g., in cross-device federated learning) that may operate in congested and changing environments.

Federated Learning

Towards Safe Autonomy in Hybrid Traffic: Detecting Unpredictable Abnormal Behaviors of Human Drivers via Information Sharing

no code implementations · 23 Aug 2023 · Jiangwei Wang, Lili Su, Songyang Han, Dongjin Song, Fei Miao

Through extensive experiments on the SUMO simulator, we then show that our proposed algorithm achieves strong detection performance in both highway and urban traffic.

Autonomous Vehicles Trajectory Prediction

Fast and Robust State Estimation and Tracking via Hierarchical Learning

no code implementations · 29 Jun 2023 · Connor Mclaughlin, Matthew Ding, Deniz Erdogmus, Lili Su

Fast and reliable state estimation and tracking are essential for real-time situation awareness in Cyber-Physical Systems (CPS) operating in tactical environments or complicated civilian environments.

Towards Bias Correction of FedAvg over Nonuniform and Time-Varying Communications

no code implementations · 1 Jun 2023 · Ming Xiang, Stratis Ioannidis, Edmund Yeh, Carlee Joe-Wong, Lili Su

Specifically, in each round $t$, the link between the PS and client $i$ is active with probability $p_i^t$, which is $\textit{unknown}$ to both the PS and the clients.

Federated Learning
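
A toy simulation of this participation model; the Bernoulli link structure comes from the abstract, while the quadratic client losses are an assumption for illustration. With naive aggregation that counts missing updates as zero, FedAvg drifts toward an availability-weighted optimum:

```python
import numpy as np

rng = np.random.default_rng(3)
d, n, T, lr = 5, 10, 2000, 0.1
opt = rng.normal(size=(n, d))       # client i's local optimum (heterogeneous data)
p = np.linspace(0.1, 0.9, n)        # link probabilities p_i, unknown to PS and clients

w = np.zeros(d)
for t in range(T):
    active = rng.random(n) < p                # link to client i active w.p. p_i this round
    g = (w - opt[active]).sum(axis=0) / n     # gradients of 0.5*||w - opt_i||^2; absent clients contribute zero
    w -= lr * g

# The naive scheme's fixed point is the p-weighted mean of the local optima,
# i.e., training is biased toward well-connected clients.
print(np.round(w, 2))
print(np.round((p[:, None] * opt).sum(axis=0) / p.sum(), 2))  # availability-weighted mean
print(np.round(opt.mean(axis=0), 2))                          # unbiased target
```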

Federated Learning in the Presence of Adversarial Client Unavailability

no code implementations · 31 May 2023 · Lili Su, Ming Xiang, Jiaming Xu, Pengkun Yang

Federated learning is a decentralized machine learning framework that enables collaborative model training without revealing raw data.

Federated Learning Selection bias

Distributed Non-Convex Optimization with One-Bit Compressors on Heterogeneous Data: Efficient and Resilient Algorithms

no code implementations · 3 Oct 2022 · Ming Xiang, Lili Su

Federated Learning (FL) is a nascent decentralized learning framework under which a massive collection of heterogeneous clients collaboratively train a model without revealing their local data.

Federated Learning Privacy Preserving
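
For concreteness, a generic one-bit (scaled-sign) compressor paired with error feedback, a standard combination in this literature; the paper's exact compressor and memory rule are not specified in the snippet:

```python
import numpy as np

def one_bit(v):
    """Scaled sign: one bit per coordinate plus a single scalar magnitude."""
    return np.sign(v) * np.abs(v).mean()

rng = np.random.default_rng(4)
g, mem = rng.normal(size=8), np.zeros(8)   # a fixed toy gradient and the feedback memory
for t in range(5):
    c = one_bit(g + mem)                   # transmit the compressed, corrected gradient
    mem = (g + mem) - c                    # keep the untransmitted residual for next round
    print(t, np.round(np.linalg.norm(mem), 3))   # residual stays bounded (compressor is contractive)
```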

Global Convergence of Federated Learning for Mixed Regression

no code implementations · 15 Jun 2022 · Lili Su, Jiaming Xu, Pengkun Yang

This paper studies the problem of model training under Federated Learning when clients exhibit cluster structure.

Federated Learning regression
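
One standard formalization of such cluster structure (our gloss on the setup): each client $i$ carries an unknown cluster label $z_i$ and its data follow that cluster's linear model,

$$y_{i,j} \;=\; x_{i,j}^\top \theta^\star_{z_i} + \epsilon_{i,j}, \qquad z_i \in \{1,\dots,k\},$$

so global convergence means jointly recovering the labels $z_i$ and the cluster parameters $\theta^\star_1,\dots,\theta^\star_k$ from all clients' data.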

A Non-parametric View of FedAvg and FedProx: Beyond Stationary Points

no code implementations · 29 Jun 2021 · Lili Su, Jiaming Xu, Pengkun Yang

We discover that when the data heterogeneity is moderate, a client with limited local data can benefit from a common model with a large federation gain.

Federated Learning regression
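
The snippet does not define "federation gain"; one natural reading (an assumption, not the paper's formal definition) is the factor by which the common model shrinks a client's estimation error relative to training on its local data alone,

$$\mathrm{gain}_i \;=\; \frac{\mathcal{E}\big(\hat{f}_i^{\,\mathrm{local}}\big)}{\mathcal{E}\big(\hat{f}^{\,\mathrm{common}}\big)},$$

which is large exactly when local data are scarce and heterogeneity is moderate, consistent with the sentence above.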

On Learning Over-parameterized Neural Networks: A Functional Approximation Perspective

no code implementations · NeurIPS 2019 · Lili Su, Pengkun Yang

When the network is sufficiently over-parameterized, these matrices individually approximate an integral operator which is determined by the feature vector distribution $\rho$ only.
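
A quick numerical illustration of the concentration behind this statement (the two-layer ReLU architecture, Gaussian weights, and the gradient Gram matrix are assumptions for the sketch): as the width $m$ grows, the empirical Gram matrix stabilizes to a limit that no longer depends on the random draw:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 10
X = rng.normal(size=(n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)   # unit-norm feature vectors

def gram(m):
    """Gram matrix of first-layer gradients of a width-m two-layer ReLU net."""
    W = rng.normal(size=(m, d))                 # hidden weights drawn i.i.d.
    act = (W @ X.T > 0).astype(float)           # ReLU derivative indicators, (m, n)
    return (act.T @ act / m) * (X @ X.T)

for m in (100, 10_000, 1_000_000):
    print(m, np.round(gram(m)[0, :3], 4))       # entries settle as m grows
```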

Collaboratively Learning the Best Option, Using Bounded Memory

no code implementations · 22 Feb 2018 · Lili Su, Martin Zubeldia, Nancy Lynch

We say an individual learns the best option if eventually (as $t \to \infty$) it pulls only the arm with the highest average reward.
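
For intuition about the bounded-memory constraint (a single-agent, purely illustrative rule; the paper's protocol is collaborative and differs in detail), a finite-state strategy whose entire memory is one arm index plus a small counter still concentrates its pulls on the better arm:

```python
import numpy as np

rng = np.random.default_rng(5)
means = [0.4, 0.6]                  # two Bernoulli arms; arm 1 is best

M, arm, ctr = 8, 0, 4               # memory: current arm + a counter in {0, ..., M}
pulls = np.zeros(2, dtype=int)
for t in range(100_000):
    pulls[arm] += 1
    win = rng.random() < means[arm]
    ctr = min(M, ctr + 1) if win else ctr - 1
    if ctr == 0:                    # a run of losses: switch arm, reset counter
        arm, ctr = 1 - arm, 4
print(pulls / pulls.sum())          # the better arm receives the bulk of the pulls
```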

Distributed Statistical Machine Learning in Adversarial Settings: Byzantine Gradient Descent

2 code implementations · 16 May 2017 · Yudong Chen, Lili Su, Jiaming Xu

The total computational complexity of our algorithm is $O((Nd/m) \log N)$ at each working machine and $O(md + kd \log^3 N)$ at the central server, and the total communication cost is $O(md \log N)$.

BIG-bench Machine Learning Federated Learning
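
The aggregation rule behind this paper is a median-of-means idea; a sketch in that spirit, where the batch count $k$ matches the $k$ in the server-side complexity above and Weiszfeld's algorithm is one (assumed) choice of geometric-median solver:

```python
import numpy as np

def geometric_median(pts, iters=100, eps=1e-8):
    """Weiszfeld's algorithm for the geometric median of row vectors."""
    z = pts.mean(axis=0)
    for _ in range(iters):
        w = 1.0 / np.maximum(np.linalg.norm(pts - z, axis=1), eps)
        z = (w[:, None] * pts).sum(axis=0) / w.sum()
    return z

def robust_aggregate(grads, k):
    """Average the m workers' gradients within k batches, then take the
    geometric median of the k batch means: robust to Byzantine outliers."""
    means = np.stack([b.mean(axis=0) for b in np.array_split(grads, k)])
    return geometric_median(means)

# Toy usage: 20 honest workers near the true gradient, 4 Byzantine ones far away.
rng = np.random.default_rng(6)
g = np.vstack([rng.normal(1.0, 0.1, size=(20, 3)), rng.normal(50.0, 1.0, size=(4, 3))])
print(np.round(robust_aggregate(g, k=9), 2))    # close to the honest mean (~1, ~1, ~1)
```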

Defending Non-Bayesian Learning against Adversarial Attacks

no code implementations · 28 Jun 2016 · Lili Su, Nitin H. Vaidya

This paper addresses the problem of non-Bayesian learning over multi-agent networks, where agents repeatedly collect partially informative observations about an unknown state of the world, and try to collaboratively learn the true state.
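
The generic non-Bayesian learning update in this literature combines a local Bayesian step with weighted geometric averaging of neighbors' beliefs; the paper's fault-tolerant rule may differ in detail, so take this as the baseline being defended:

$$\mu_{i,t+1}(\theta) \;=\; \frac{\ell_i(s_{i,t+1}\mid\theta)\,\prod_{j}\mu_{j,t}(\theta)^{A_{ij}}}{\sum_{\theta'}\ell_i(s_{i,t+1}\mid\theta')\,\prod_{j}\mu_{j,t}(\theta')^{A_{ij}}},$$

where $\mu_{i,t}$ is agent $i$'s belief over possible states of the world, $\ell_i$ its local likelihood, and $A$ a stochastic weight matrix supported on the communication network.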
