no code implementations • 22 Apr 2024 • Marie Siew, Haoran Zhang, Jong-Ik Park, Yuezhou Liu, Yichen Ruan, Lili Su, Stratis Ioannidis, Edmund Yeh, Carlee Joe-Wong
We show how our fairness-based learning and incentive mechanisms impact training convergence, and finally evaluate our algorithm on multiple sets of learning tasks over real-world datasets.
no code implementations • 15 Apr 2024 • Ming Xiang, Stratis Ioannidis, Edmund Yeh, Carlee Joe-Wong, Lili Su
It consists of a parameter server and a possibly large collection of clients (e.g., in cross-device federated learning) that may operate in congested and changing environments.
1 code implementation • 12 Mar 2024 • Mingze Wang, Lili Su, Cilin Yan, Sheng Xu, Pengcheng Yuan, XiaoLong Jiang, Baochang Zhang
RSBuilding is designed to enhance cross-scene generalization and task universality.
no code implementations • 23 Aug 2023 • Jiangwei Wang, Lili Su, Songyang Han, Dongjin Song, Fei Miao
Then, through extensive experiments on the SUMO simulator, we show that our proposed algorithm achieves strong detection performance in both highway and urban traffic.
no code implementations • 27 Jul 2023 • Connor Mclaughlin, Matthew Ding, Deniz Erdogmus, Lili Su
On network communication, we consider packet-dropping link failures.
no code implementations • 29 Jun 2023 • Connor Mclaughlin, Matthew Ding, Deniz Erdogmus, Lili Su
In both algorithms, we use a novel hierarchical push-sum consensus component.
no code implementations • 1 Jun 2023 • Ming Xiang, Stratis Ioannidis, Edmund Yeh, Carlee Joe-Wong, Lili Su
Specifically, in each round $t$, the link between the PS and client $i$ is active with probability $p_i^t$, which is $\textit{unknown}$ to both the PS and the clients.
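The round structure described above can be sketched in a few lines. Everything here is an illustrative assumption rather than the paper's algorithm: the scalar toy model, the `local_update` helper, and the learning rate are all hypothetical; the one faithful ingredient is that each client's link is active as a Bernoulli($p_i^t$) event that the server never observes directly.

```python
import random

def local_update(model, data, lr=0.1):
    """Toy local step: model is a scalar, data a list of scalar targets.
    One gradient step on the local least-squares loss."""
    grad = sum(model - y for y in data) / len(data)
    return model - lr * grad

def fl_round_with_intermittent_links(global_model, clients, link_probs):
    """One FedAvg-style round where the uplink of client i is only
    active with probability link_probs[i], unknown to the server."""
    updates = []
    for client_data, p in zip(clients, link_probs):
        if random.random() < p:  # link between PS and client i is active
            updates.append(local_update(global_model, client_data))
    if not updates:              # no client reachable this round
        return global_model
    return sum(updates) / len(updates)  # naive average of received models
```

Because clients with larger $p_i^t$ contribute updates more often, this naive average drifts toward well-connected clients' data; correcting that bias without knowing the $p_i^t$ is exactly what makes the setting nontrivial.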
no code implementations • 31 May 2023 • Lili Su, Ming Xiang, Jiaming Xu, Pengkun Yang
Federated learning is a decentralized machine learning framework that enables collaborative model training without revealing raw data.
no code implementations • 8 Mar 2023 • Muzi Peng, Jiangwei Wang, Dongjin Song, Fei Miao, Lili Su
Deep learning is the method of choice for trajectory prediction for autonomous vehicles.
no code implementations • 3 Oct 2022 • Ming Xiang, Lili Su
Federated Learning (FL) is a nascent decentralized learning framework under which a massive collection of heterogeneous clients collaboratively train a model without revealing their local data.
no code implementations • 15 Jun 2022 • Lili Su, Jiaming Xu, Pengkun Yang
This paper studies the problem of model training under Federated Learning when clients exhibit cluster structure.
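One way such cluster structure can be exploited, sketched minimally here in the spirit of iterative clustering approaches (the scalar models, the squared loss, and all helper names are illustrative assumptions, not the paper's algorithm): each client joins whichever cluster model currently fits its local data best, then updates within that cluster.

```python
def local_loss(model, data):
    """Toy scalar squared loss of a cluster model on a client's data."""
    return sum((model - y) ** 2 for y in data) / len(data)

def assign_cluster(client_data, cluster_models):
    """A client joins the cluster whose model has the lowest local loss."""
    losses = [local_loss(m, client_data) for m in cluster_models]
    return min(range(len(cluster_models)), key=losses.__getitem__)

def clustered_round(cluster_models, clients, lr=0.5):
    """One round: assign each client, take a local gradient step on its
    chosen cluster model, then average within each cluster."""
    buckets = {k: [] for k in range(len(cluster_models))}
    for data in clients:
        k = assign_cluster(data, cluster_models)
        grad = sum(cluster_models[k] - y for y in data) / len(data)
        buckets[k].append(cluster_models[k] - lr * grad)
    return [sum(b) / len(b) if b else m
            for m, b in zip(cluster_models, buckets.values())]
```

With well-separated clusters and reasonable initial models, the assignments stabilize after the first round and each cluster model converges to its own clients' optimum.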
no code implementations • 29 Jun 2021 • Lili Su, Jiaming Xu, Pengkun Yang
We discover that when the data heterogeneity is moderate, a client with limited local data can benefit from a common model with a large federation gain.
no code implementations • NeurIPS 2019 • Lili Su, Pengkun Yang
When the network is sufficiently over-parameterized, these matrices individually approximate {\em an} integral operator which is determined by the feature vector distribution $\rho$ only.
no code implementations • 26 Apr 2018 • Lili Su, Jiaming Xu
Nevertheless, the empirical risk (sample version) is allowed to be non-convex.
no code implementations • 22 Feb 2018 • Lili Su, Martin Zubeldia, Nancy Lynch
We say an individual learns the best option if eventually (as $t \to \infty$) it pulls only the arm with the highest average reward.
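That notion of learning can be illustrated with a single-agent toy (the epsilon-greedy rule and its $1/t$ decay schedule below are generic textbook choices, not the paper's multi-agent protocol): as exploration decays, the fraction of pulls going to suboptimal arms vanishes.

```python
import random

def run_bandit(means, horizon, eps0=1.0, seed=0):
    """Epsilon-greedy play of a stochastic bandit with Bernoulli arms.
    Exploration probability decays as eps0 / t, so in the long run the
    agent pulls (almost) only the arm with the best empirical mean."""
    rng = random.Random(seed)
    counts = [0] * len(means)
    totals = [0.0] * len(means)
    pulls = []
    for t in range(1, horizon + 1):
        if rng.random() < min(1.0, eps0 / t):
            arm = rng.randrange(len(means))  # explore uniformly
        else:
            # exploit: highest empirical mean; unpulled arms get priority
            avgs = [tot / c if c else float("inf")
                    for tot, c in zip(totals, counts)]
            arm = max(range(len(means)), key=avgs.__getitem__)
        reward = 1.0 if rng.random() < means[arm] else 0.0
        counts[arm] += 1
        totals[arm] += reward
        pulls.append(arm)
    return pulls, counts
```

The interesting part of the paper's setting, not modeled here, is that many such agents must reach this behavior collectively under communication constraints.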
2 code implementations • 16 May 2017 • Yudong Chen, Lili Su, Jiaming Xu
The total computational complexity of our algorithm is $O((Nd/m) \log N)$ at each working machine and $O(md + kd \log^3 N)$ at the central server, and the total communication cost is $O(m d \log N)$.
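The server-side cost above comes from robustly aggregating the $m$ machines' $d$-dimensional summaries. A geometric median, computed by Weiszfeld's iteration, is one standard robust aggregator in this line of work; the sketch below is illustrative only (plain Python lists, fixed iteration budget) and is not claimed to match the paper's exact procedure.

```python
def geometric_median(points, iters=100, tol=1e-9):
    """Weiszfeld's iteration: the geometric median minimizes the sum of
    Euclidean distances to the points, so a minority of adversarial
    points cannot drag it far."""
    d = len(points[0])
    # start from the coordinate-wise mean
    y = [sum(p[j] for p in points) / len(points) for j in range(d)]
    for _ in range(iters):
        weights = []
        for p in points:
            dist = sum((pi - yi) ** 2 for pi, yi in zip(p, y)) ** 0.5
            weights.append(1.0 / max(dist, tol))  # guard against dist = 0
        w = sum(weights)
        y_new = [sum(wi * p[j] for wi, p in zip(weights, points)) / w
                 for j in range(d)]
        if sum((a - b) ** 2 for a, b in zip(y_new, y)) ** 0.5 < tol:
            return y_new
        y = y_new
    return y
```

Unlike the plain mean, the output stays near the honest majority even when a few inputs are arbitrarily corrupted.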
no code implementations • 28 Jun 2016 • Lili Su, Nitin H. Vaidya
This paper addresses the problem of non-Bayesian learning over multi-agent networks, where agents repeatedly collect partially informative observations about an unknown state of the world, and try to collaboratively learn the true state.