Search Results for author: Tian Li

Found 32 papers, 18 papers with code

Panoptic Perception: A Novel Task and Fine-grained Dataset for Universal Remote Sensing Image Interpretation

no code implementations • 6 Apr 2024 • Danpei Zhao, Bo Yuan, Ziqiang Chen, Tian Li, Zhuoran Liu, Wentao Li, Yue Gao

Experimental results on FineGrip demonstrate the feasibility of the panoptic perception task and the beneficial effect of multi-task joint optimization on individual tasks.

Image Captioning • Instance Segmentation • +4

Many-Objective Multi-Solution Transport

no code implementations • 6 Mar 2024 • Ziyue Li, Tian Li, Virginia Smith, Jeff Bilmes, Tianyi Zhou

Optimizing the performance of many objectives (instantiated by tasks or clients) jointly with a few Pareto stationary solutions (models) is critical in machine learning.

Federated Learning • Multi-Task Learning

DEED: Dynamic Early Exit on Decoder for Accelerating Encoder-Decoder Transformer Models

no code implementations • 15 Nov 2023 • Peng Tang, Pengkai Zhu, Tian Li, Srikar Appalaraju, Vijay Mahadevan, R. Manmatha

Based on the multi-exit model, we perform step-level dynamic early exit during inference: at each individual decoding step, the model may decide to use fewer decoder layers based on its confidence at the current layer.
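The step-level early-exit loop described above can be sketched roughly as follows. This is an illustrative reconstruction, not the paper's code: the stand-in decoder layers, the shared `lm_head`, and the max-softmax confidence threshold are all assumptions.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def decode_step_with_early_exit(hidden, layers, lm_head, threshold=0.9):
    """Run decoder layers one at a time at a single decoding step;
    stop as soon as the current layer's prediction is confident enough."""
    for depth, layer in enumerate(layers, start=1):
        hidden = layer(hidden)                  # one decoder layer
        probs = softmax(lm_head(hidden))        # shared output head
        if probs.max() >= threshold:            # confident: exit early
            return int(probs.argmax()), depth
    return int(probs.argmax()), depth           # fell through: full depth

# Toy example with hypothetical shapes: 6 stand-in "layers" that
# sharpen the logits, and a random vocabulary projection of size 16.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 16))
layers = [lambda h: h * 1.5 for _ in range(6)]
token, depth_used = decode_step_with_early_exit(
    rng.normal(size=8), layers, lambda h: h @ W)
print(token, depth_used)
```

Easy decoding steps exit after few layers; hard ones fall through to the full decoder, which is where the inference speedup comes from.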


Dehazing-NeRF: Neural Radiance Fields from Hazy Images

no code implementations • 22 Apr 2023 • Tian Li, Lu Li, Wei Wang, Zhangchi Feng

Our method simulates the physical imaging process of hazy images using an atmospheric scattering model, and jointly learns the atmospheric scattering model and a clean NeRF model for both image dehazing and novel view synthesis.

3D Scene Reconstruction • Image Dehazing • +3
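The atmospheric scattering model the abstract refers to is the standard one, I(x) = J(x) t(x) + A (1 - t(x)) with transmission t(x) = exp(-beta d(x)). Below is a minimal numpy sketch of that model and its inversion; the parameter names (`airlight`, `beta`) and toy values are mine, not the paper's.

```python
import numpy as np

def hazy_image(clean, depth, airlight=0.8, beta=1.0):
    """Standard atmospheric scattering model:
    I(x) = J(x) * t(x) + A * (1 - t(x)),  t(x) = exp(-beta * d(x))."""
    t = np.exp(-beta * depth)            # transmission from scene depth
    return clean * t + airlight * (1.0 - t)

def dehaze(hazy, depth, airlight=0.8, beta=1.0):
    """Invert the model to recover the clean scene radiance J(x)."""
    t = np.exp(-beta * depth)
    return (hazy - airlight * (1.0 - t)) / np.clip(t, 1e-6, None)

J = np.array([0.2, 0.5, 0.9])            # toy clean pixel values
d = np.array([1.0, 2.0, 3.0])            # toy per-pixel depths
I = hazy_image(J, d)
print(np.allclose(dehaze(I, d), J))      # round trip recovers J
```

In the paper's setting A and beta are not given; jointly learning them alongside a clean NeRF is what couples dehazing with novel view synthesis.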

Differentially Private Adaptive Optimization with Delayed Preconditioners

1 code implementation • 1 Dec 2022 • Tian Li, Manzil Zaheer, Ken Ziyu Liu, Sashank J. Reddi, H. Brendan McMahan, Virginia Smith

Privacy noise may negate the benefits of using adaptive optimizers in differentially private model training.

Maximizing Global Model Appeal in Federated Learning

no code implementations • 30 May 2022 • Yae Jee Cho, Divyansh Jhunjhunwala, Tian Li, Virginia Smith, Gauri Joshi

We provide convergence guarantees for MaxFL and show that MaxFL achieves a 22-40% and 18-50% test accuracy improvement for the training clients and unseen clients, respectively, compared to a wide range of FL modeling approaches, including those that tackle data heterogeneity, aim to incentivize clients, and learn personalized or fair models.

Federated Learning

PreTraM: Self-Supervised Pre-training via Connecting Trajectory and Map

1 code implementation • 21 Apr 2022 • Chenfeng Xu, Tian Li, Chen Tang, Lingfeng Sun, Kurt Keutzer, Masayoshi Tomizuka, Alireza Fathi, Wei Zhan

It is hard to replicate these approaches in trajectory forecasting due to the lack of adequate trajectory data (e.g., 34K samples in the nuScenes dataset).

Contrastive Learning • Representation Learning • +1

Private Adaptive Optimization with Side Information

1 code implementation • 12 Feb 2022 • Tian Li, Manzil Zaheer, Sashank J. Reddi, Virginia Smith

Adaptive optimization methods have become the default solvers for many machine learning tasks.

Diverse Client Selection for Federated Learning via Submodular Maximization

no code implementations • ICLR 2022 • Ravikumar Balakrishnan, Tian Li, Tianyi Zhou, Nageen Himayat, Virginia Smith, Jeff Bilmes

In every communication round of federated learning, a random subset of clients communicate their model updates back to the server, which then aggregates them all.

Fairness • Federated Learning
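A common way to pick a diverse subset via submodular maximization is the greedy algorithm on a facility-location objective over client similarities. The sketch below is an assumed instance of that recipe, not the paper's exact objective or similarity measure:

```python
import numpy as np

def greedy_facility_location(sim, k):
    """Greedily pick k clients maximizing the facility-location function
    F(S) = sum_i max_{j in S} sim[i, j], a monotone submodular objective
    that rewards a subset representative of all clients."""
    n = sim.shape[0]
    selected, best_cover = [], np.zeros(n)
    for _ in range(k):
        gains = [(np.maximum(best_cover, sim[:, j]).sum() - best_cover.sum(), j)
                 for j in range(n) if j not in selected]
        gain, j = max(gains)                       # largest marginal gain
        selected.append(j)
        best_cover = np.maximum(best_cover, sim[:, j])
    return selected

# Toy: cosine similarities between 5 clients' (hypothetical) updates.
rng = np.random.default_rng(1)
g = rng.normal(size=(5, 8))
sim = g @ g.T
sim = sim / np.sqrt(np.outer(np.diag(sim), np.diag(sim)))
chosen = greedy_facility_location(sim, 2)
print(chosen)
```

The greedy rule enjoys the classic (1 - 1/e) approximation guarantee for monotone submodular maximization, which is what makes this selection strategy principled rather than heuristic.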

On Tilted Losses in Machine Learning: Theory and Applications

1 code implementation • 13 Sep 2021 • Tian Li, Ahmad Beirami, Maziar Sanjabi, Virginia Smith

Finally, we demonstrate that TERM can be used for a multitude of applications in machine learning, such as enforcing fairness between subgroups, mitigating the effect of outliers, and handling class imbalance.

BIG-bench Machine Learning • Fairness • +1

Heterogeneity for the Win: One-Shot Federated Clustering

2 code implementations • 1 Mar 2021 • Don Kurian Dennis, Tian Li, Virginia Smith

In this work, we explore the unique challenges -- and opportunities -- of unsupervised federated learning (FL).

Clustering • Federated Learning
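A one-shot federated clustering scheme in this spirit can be sketched as: each client runs k-means locally and communicates only its centers, once; the server then clusters the pooled centers. This is a hedged reconstruction of the general idea, the function names and toy data are mine:

```python
import numpy as np

def local_kmeans(X, k, iters=20, seed=0):
    """Plain Lloyd's k-means run locally on one client's data."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) if (labels == j).any()
                            else centers[j] for j in range(k)])
    return centers

def one_shot_federated_clustering(client_data, k_local, k_global):
    """Single communication round: clients send only local centers;
    the server clusters the pooled centers into k_global groups."""
    pooled = np.vstack([local_kmeans(X, k_local, seed=i)
                        for i, X in enumerate(client_data)])
    return local_kmeans(pooled, k_global, seed=42)

# Toy: two clients whose data sits around two well-separated points.
rng = np.random.default_rng(2)
clients = [np.vstack([rng.normal(0, .1, (20, 2)), rng.normal(5, .1, (20, 2))]),
           np.vstack([rng.normal(0, .1, (20, 2)), rng.normal(5, .1, (20, 2))])]
centers = one_shot_federated_clustering(clients, k_local=2, k_global=2)
print(np.round(centers))
```

The "heterogeneity for the win" intuition is that when clients hold different clusters, each local solve is easier, and the pooled centers still cover the global structure.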

Experimental study of decoherence of the two-mode squeezed vacuum state via second harmonic generation

no code implementations • 22 Dec 2020 • Fu Li, Tian Li, Girish S. Agarwal

Such a correlation is the most important characteristic of a two-mode squeezed state.

Optics • Quantum Physics

Cross-Domain Sentiment Classification with Contrastive Learning and Mutual Information Maximization

1 code implementation • 30 Oct 2020 • Tian Li, Xiang Chen, Shanghang Zhang, Zhen Dong, Kurt Keutzer

Due to scarcity of labels on the target domain, we introduce mutual information maximization (MIM) apart from CL to exploit the features that best support the final prediction.

Contrastive Learning • General Classification • +3
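Both contrastive learning and mutual information maximization are commonly instantiated with an InfoNCE-style loss, which lower-bounds the mutual information between paired views. The numpy sketch below shows that generic loss, not necessarily the paper's exact objective; the temperature value and toy data are assumptions:

```python
import numpy as np

def info_nce(anchors, positives, temperature=0.1):
    """InfoNCE: each anchor must identify its own positive among all
    positives; the matched pairs sit on the diagonal of the logits."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = (a @ p.T) / temperature             # cosine-similarity logits
    logits -= logits.max(axis=1, keepdims=True)  # stabilize the softmax
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(3)
x = rng.normal(size=(8, 4))
aligned = info_nce(x, x + 0.01 * rng.normal(size=(8, 4)))  # matched views
shuffled = info_nce(x, rng.normal(size=(8, 4)))            # unrelated "positives"
print(aligned, shuffled)
```

Minimizing this loss tightens a lower bound on the mutual information between the two views, which is why the same machinery serves both the CL and MIM terms.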

Tilted Empirical Risk Minimization

2 code implementations • ICLR 2021 • Tian Li, Ahmad Beirami, Maziar Sanjabi, Virginia Smith

Empirical risk minimization (ERM) is typically designed to perform well on the average loss, which can result in estimators that are sensitive to outliers, generalize poorly, or treat subgroups unfairly.
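TERM replaces the average loss of ERM with a tilted aggregate, (1/t) log(mean(exp(t * loss_i))): positive t emphasizes the largest losses (useful for subgroup fairness), negative t suppresses them (useful against outliers), and t -> 0 recovers ERM. A minimal numpy sketch, with toy losses of my own choosing:

```python
import numpy as np

def tilted_loss(losses, t):
    """Tilted empirical risk: (1/t) * log(mean(exp(t * losses))).
    Computed via log-sum-exp for numerical stability."""
    m = t * np.asarray(losses, dtype=float)
    return (np.max(m) + np.log(np.mean(np.exp(m - np.max(m))))) / t

losses = [0.1, 0.2, 5.0]                   # one outlier loss
erm = np.mean(losses)
print(tilted_loss(losses, 10) > erm)       # positive tilt: outlier dominates
print(tilted_loss(losses, -10) < erm)      # negative tilt: outlier suppressed
```

A single scalar t thus interpolates between average-case, worst-case, and best-case aggregation of the per-example losses.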


Enhancing the Privacy of Federated Learning with Sketching

no code implementations • 5 Nov 2019 • Zaoxing Liu, Tian Li, Virginia Smith, Vyas Sekar

Federated learning methods run training tasks directly on user devices and do not share the raw user data with third parties.

Federated Learning

Federated Learning: Challenges, Methods, and Future Directions

1 code implementation • 21 Aug 2019 • Tian Li, Anit Kumar Sahu, Ameet Talwalkar, Virginia Smith

Federated learning involves training statistical models over remote devices or siloed data centers, such as mobile phones or hospitals, while keeping data localized.

BIG-bench Machine Learning • Distributed Optimization • +2

Federated Optimization in Heterogeneous Networks

19 code implementations • 14 Dec 2018 • Tian Li, Anit Kumar Sahu, Manzil Zaheer, Maziar Sanjabi, Ameet Talwalkar, Virginia Smith

Theoretically, we provide convergence guarantees for our framework when learning over data from non-identical distributions (statistical heterogeneity), and while adhering to device-level systems constraints by allowing each participating device to perform a variable amount of work (systems heterogeneity).

Distributed Optimization • Federated Learning
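This framework (FedProx) handles the variable local work described above by adding a proximal term, (mu/2) * ||w - w_global||^2, to each client's local objective, limiting how far heterogeneous clients drift from the global model. A toy numpy sketch with a quadratic client objective of my own choosing:

```python
import numpy as np

def fedprox_local_update(w_global, grad_fn, mu=0.1, lr=0.1, steps=10):
    """Local SGD on f_k(w) + (mu/2) * ||w - w_global||^2; the proximal
    gradient mu * (w - w_global) pulls updates back toward the server model."""
    w = w_global.copy()
    for _ in range(steps):
        w -= lr * (grad_fn(w) + mu * (w - w_global))
    return w

# Toy client objective f_k(w) = 0.5 * ||w - w_star||^2.
w_star = np.array([3.0, -1.0])
grad = lambda w: w - w_star
w_g = np.zeros(2)
w_no_prox = fedprox_local_update(w_g, grad, mu=0.0, steps=100)
w_prox = fedprox_local_update(w_g, grad, mu=1.0, steps=100)
# Larger mu keeps the local solution closer to the global model.
print(np.linalg.norm(w_prox - w_g) < np.linalg.norm(w_no_prox - w_g))
```

With mu = 0 this reduces to FedAvg's local SGD; the proximal term is what makes a variable number of local steps safe under statistical heterogeneity.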

LEAF: A Benchmark for Federated Settings

7 code implementations • 3 Dec 2018 • Sebastian Caldas, Sai Meher Karthik Duddu, Peter Wu, Tian Li, Jakub Konečný, H. Brendan McMahan, Virginia Smith, Ameet Talwalkar

Modern federated networks, such as those composed of wearable devices, mobile phones, or autonomous vehicles, generate massive amounts of data each day.

Autonomous Vehicles • Benchmarking • +3

Towards Multi-tenant Resource Sharing for Machine Learning Workloads

no code implementations • 24 Aug 2017 • Tian Li, Jie Zhong, Ji Liu, Wentao Wu, Ce Zhang

We ask: as a "service provider" managing a shared cluster of machines for users running machine learning workloads, what resource allocation strategy maximizes the global satisfaction of all users?

Bayesian Optimization • BIG-bench Machine Learning • +4

A Multilingual Natural Stress Emotion Database

no code implementations • LREC 2012 • Xin Zuo, Tian Li, Pascale Fung

In this paper, we describe an ongoing effort in collecting and annotating a multilingual speech database of natural stress emotion from university students.

Emotion Recognition • Speech Synthesis
