no code implementations • 9 Mar 2024 • Hengyuan Xu, Liyao Xiang, Xingjun Ma, Borui Yang, Baochun Li
The permutation equivariance ensures minimal interference between these two sets of model weights and thus high fidelity on downstream tasks.
no code implementations • 26 Feb 2024 • Tianhang Zheng, Baochun Li
Federated learning has recently emerged as a decentralized approach to learning a high-performance model without access to user data.
no code implementations • 17 Feb 2024 • Sijia Chen, Baochun Li, Di Niu
The reasoning performance of Large Language Models (LLMs) on a wide range of problems critically relies on chain-of-thought prompting, which involves providing a few chain-of-thought demonstrations as exemplars in prompts.
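The mechanics of few-shot chain-of-thought prompting can be sketched as follows: each exemplar pairs a question with an explicit reasoning trace before the answer, and the exemplars are concatenated ahead of the new query. The exemplar content and prompt format below are illustrative, not taken from the paper.

```python
# A hypothetical exemplar; real prompts would use several task-matched ones.
COT_EXEMPLARS = [
    {
        "question": "A bag has 3 red and 5 blue marbles. How many marbles in total?",
        "reasoning": "There are 3 red marbles and 5 blue marbles, so 3 + 5 = 8.",
        "answer": "8",
    },
]

def build_cot_prompt(exemplars, query):
    """Concatenate reasoning-annotated exemplars ahead of the new question."""
    parts = []
    for ex in exemplars:
        parts.append(
            f"Q: {ex['question']}\nA: {ex['reasoning']} The answer is {ex['answer']}."
        )
    parts.append(f"Q: {query}\nA:")  # the model continues from here
    return "\n\n".join(parts)

prompt = build_cot_prompt(
    COT_EXEMPLARS, "If I read 12 pages a day for 4 days, how many pages?"
)
print(prompt)
```

The key design point is that the exemplars demonstrate *how* to reason, not just what to answer, which is what the final "A:" cue asks the model to imitate.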
no code implementations • 18 Aug 2023 • Sijia Chen, Baochun Li
Specifically, we propose LG-DVG, a language-guided diffusion framework for visual grounding that trains the model to progressively reason about the queried object boxes by denoising a set of noisy boxes under language guidance.
no code implementations • 13 Oct 2022 • Peng Ye, Zhifeng Jiang, Wei Wang, Bo Li, Baochun Li
To address this problem, we develop a novel feature protection scheme against the reconstruction attack that effectively misleads the search to some pre-specified random values.
no code implementations • 18 Jun 2022 • Zhifeng Jiang, Wei Wang, Baochun Li, Bo Li
Current FL systems employ a participant selection strategy to select fast clients with quality data in each iteration.
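A participant selection step of this kind can be sketched as scoring each client by a weighted mix of speed and data quality and taking the top-k; the field names and scoring rule below are hypothetical, not the specific strategy analyzed in the paper.

```python
def select_participants(clients, k, alpha=0.5):
    """Score each client by a weighted mix of speed and data quality,
    then pick the top-k for the next training round."""
    def score(c):
        speed = 1.0 / c["round_time"]  # faster clients score higher
        return alpha * speed + (1 - alpha) * c["data_quality"]
    return sorted(clients, key=score, reverse=True)[:k]

clients = [
    {"id": 0, "round_time": 2.0, "data_quality": 0.9},
    {"id": 1, "round_time": 0.5, "data_quality": 0.4},
    {"id": 2, "round_time": 1.0, "data_quality": 0.8},
]
chosen = select_participants(clients, k=2)
print([c["id"] for c in chosen])  # fastest-and-best clients win
```

Biasing selection toward fast, high-quality clients speeds up each round, which is precisely the behavior whose fairness and privacy implications motivate this line of work.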
1 code implementation • CVPR 2022 • WanYu Lin, Hao Lan, Hao Wang, Baochun Li
This paper proposes a new eXplanation framework, called OrphicX, for generating causal explanations for any graph neural networks (GNNs) based on learned latent causal factors.
no code implementations • 23 Jan 2022 • WanYu Lin, Baochun Li, Cong Wang
It is typical to collect these local views of social graphs and conduct graph learning tasks on them.
1 code implementation • CVPR 2022 • Sijia Chen, Baochun Li
We found that existing VG methods are trapped by a single-stage grounding process that performs a single evaluate-and-rank pass over meticulously prepared regions.
no code implementations • 29 Sep 2021 • Sijia Chen, Baochun Li
In this paper, we point out that this issue can be addressed by balancing information flow from the initial model and training dataset to the local adaptation.
no code implementations • 29 Sep 2021 • Tianhang Zheng, Baochun Li
In this paper, we show that PGD-2 AT with random initialization (PGD-2-RS AT) and attack step size $\alpha=1.25\epsilon/2$ requires only about half the computational cost of FGSM + GradAlign AT and can in fact avoid catastrophic overfitting for large $\ell_\infty$ perturbations.
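The attack described above can be sketched on a toy differentiable loss: draw a random start inside the $\ell_\infty$ ball, then take exactly two signed-gradient steps of size $\alpha=1.25\epsilon/2$, projecting back onto the ball after each step. The quadratic loss and dimensions are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss_grad(x):
    # Gradient of a toy loss L(x) = 0.5 * ||x||^2 standing in for the
    # network's loss gradient with respect to the input.
    return x

def pgd2_rs(x, eps):
    alpha = 1.25 * eps / 2
    # Random start uniformly inside the l_inf ball of radius eps.
    delta = rng.uniform(-eps, eps, size=x.shape)
    for _ in range(2):  # exactly two attack steps (PGD-2)
        g = loss_grad(x + delta)
        delta = delta + alpha * np.sign(g)  # l_inf ascent step
        delta = np.clip(delta, -eps, eps)   # project back onto the ball
    return x + delta

x = np.zeros(4)
x_adv = pgd2_rs(x, eps=8 / 255)
print(np.max(np.abs(x_adv - x)))  # stays within the eps-ball
```

The random start plus projection is what distinguishes this from plain two-step PGD; the paper's claim is that this cheap configuration already suffices to sidestep catastrophic overfitting.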
no code implementations • 29 Sep 2021 • WanYu Lin, Hao Lan, Hao Wang, Baochun Li
This paper proposes a new explanation framework, called OrphicX, for generating causal explanations for any graph neural networks (GNNs) based on learned latent causal factors.
1 code implementation • 14 Apr 2021 • WanYu Lin, Hao Lan, Baochun Li
Specifically, we formulate the problem of providing explanations for the decisions of GNNs as a causal learning task.
no code implementations • International Conference on Distributed Computing Systems 2020 • Lingdong Wang, Liyao Xiang, Jiayu Xu, Jiaju Chen, Xing Zhao, Dixi Yao, Xinbing Wang, Baochun Li
While deep neural networks (DNNs) have led to a paradigm shift, their exorbitant computational requirements have long been a roadblock to deployment at the edge, such as on wearable devices and smartphones.
Ranked #169 on Image Classification on CIFAR-10
no code implementations • 15 May 2020 • Tianhang Zheng, Di Wang, Baochun Li, Jinhui Xu
Based on our framework, we assess the Gaussian and Exponential mechanisms by comparing the magnitude of additive noise required by these mechanisms and the lower bounds (criteria).
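As a concrete point of reference for such a comparison, the classic calibration of the Gaussian mechanism gives the noise scale needed for $(\epsilon, \delta)$-differential privacy; this textbook formula is illustrative and not the paper's exact criteria.

```python
import math

def gaussian_sigma(sensitivity, eps, delta):
    """Classic sufficient noise scale for the (eps, delta)-DP Gaussian
    mechanism: sigma = sensitivity * sqrt(2 * ln(1.25 / delta)) / eps."""
    return sensitivity * math.sqrt(2 * math.log(1.25 / delta)) / eps

# With unit l2 sensitivity, eps = 1, delta = 1e-5:
sigma = gaussian_sigma(sensitivity=1.0, eps=1.0, delta=1e-5)
print(round(sigma, 3))
```

Comparing such a sufficient noise magnitude against a lower bound on the noise any mechanism must add is the shape of the assessment the abstract describes.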
no code implementations • 14 May 2020 • Tianhang Zheng, Sheng Liu, Changyou Chen, Junsong Yuan, Baochun Li, Kui Ren
We first formulate generation of adversarial skeleton actions as a constrained optimization problem by representing or approximating the physiological and physical constraints with mathematical formulations.
no code implementations • Network and Distributed Systems Security (NDSS) Symposium 2020 • Zhongjie Ba, Tianhang Zheng, Xinyu Zhang, Zhan Qin, Baochun Li, Xue Liu, Kui Ren
The second limitation stems from the common belief that these sensors can only pick up a narrow band (85-100Hz) of speech signals due to a sampling ceiling of 200Hz.
1 code implementation • 28 Oct 2019 • Wan-Yu Lin, Zhaolin Gao, Baochun Li
More specifically, we address the problem of graph-based semi-supervised learning in the presence of severely limited labeled samples, and propose a new framework, called {\em Shoestring}, that improves the learning performance through semantic transfer from these very few labeled samples to large numbers of unlabeled samples.
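A minimal baseline for this setting is plain label propagation over the graph, which spreads the few known labels to unlabeled nodes; Shoestring's contribution is the semantic transfer layered on top of such a setup, which this sketch does not reproduce. The toy graph below is illustrative.

```python
import numpy as np

def label_propagation(adj, labels, n_iter=50):
    """Propagate one-hot labels over a row-normalized adjacency matrix,
    clamping the labeled nodes after every iteration."""
    deg = adj.sum(axis=1, keepdims=True)
    P = adj / np.maximum(deg, 1)                      # row-stochastic transition
    labeled = {i for i, y in enumerate(labels) if y is not None}
    n_classes = len({y for y in labels if y is not None})
    F = np.zeros((len(labels), n_classes))
    for i in labeled:
        F[i, labels[i]] = 1.0
    for _ in range(n_iter):
        F = P @ F
        for i in labeled:                             # clamp known labels
            F[i] = 0.0
            F[i, labels[i]] = 1.0
    return F.argmax(axis=1)

# Two triangles joined by a single edge; one labeled node per cluster.
adj = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
])
pred = label_propagation(adj, [0, None, None, None, None, 1])
print(pred.tolist())  # each triangle adopts its single labeled node's class
```

With only one label per cluster the propagation still recovers both clusters here, but on real graphs this baseline degrades sharply as labels become scarce, which is the regime Shoestring targets.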
no code implementations • 25 Sep 2019 • Tianhang Zheng, Di Wang, Baochun Li, Jinhui Xu
We answer the above two questions by first demonstrating that the Gaussian and Exponential mechanisms are the (near-)optimal options to certify $\ell_2$- and $\ell_\infty$-normed robustness.
no code implementations • NeurIPS 2018 • Yuanxiang Gao, Li Chen, Baochun Li
It is critical to place operations in a neural network on these devices in an optimal way, so that the training process can complete within the shortest amount of time.
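The flavor of the placement problem can be sketched with a toy greedy heuristic that balances estimated per-op compute time across devices; the actual papers learn placements (e.g. with reinforcement learning), which this sketch does not attempt, and the op names and costs are made up.

```python
def greedy_placement(op_costs, n_devices):
    """Assign each op (largest cost first) to the currently least-loaded
    device, a classic longest-processing-time heuristic."""
    loads = [0.0] * n_devices
    placement = {}
    for op, cost in sorted(op_costs.items(), key=lambda kv: -kv[1]):
        d = min(range(n_devices), key=lambda i: loads[i])
        placement[op] = d
        loads[d] += cost
    return placement, loads

ops = {"conv1": 4.0, "conv2": 3.0, "fc": 2.0, "softmax": 1.0}
placement, loads = greedy_placement(ops, n_devices=2)
print(placement, loads)
```

Heuristics like this ignore inter-op communication and dependency structure, which is exactly why learned placement over the real dataflow graph can do better.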
no code implementations • ICML 2018 • Yuanxiang Gao, Li Chen, Baochun Li
Training deep neural networks (DNNs) requires an increasing amount of computation resources, and it becomes typical to use a mixture of GPU and CPU devices.
no code implementations • 7 Jun 2018 • Chen Chen, Qizhen Weng, Wei Wang, Baochun Li, Bo Li
Efficient model training requires eliminating such stragglers, yet for modern ML workloads, existing load balancing strategies are inefficient and even infeasible.