Search Results for author: Sicheng Li

Found 9 papers, 3 papers with code

ECNet: Effective Controllable Text-to-Image Diffusion Models

no code implementations • 27 Mar 2024 • Sicheng Li, Keqiang Sun, Zhixin Lai, Xiaoshi Wu, Feng Qiu, Haoran Xie, Kazunori Miyata, Hongsheng Li

To overcome the issue of limited conditional supervision, we introduce Diffusion Consistency Loss (DCL), which applies supervision to the denoised latent code at any given time step.
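The snippet above describes supervising the denoised latent at an arbitrary timestep. As a rough, hypothetical sketch (the paper's actual loss may differ), one can recover an estimate of the clean latent from a noisy one via the standard DDPM identity and penalize its distance to the target latent:

```python
import math

def predict_x0(x_t, eps_pred, alpha_bar):
    """Recover an estimate of the clean latent x0 from a noisy latent x_t
    at any timestep, given the model's noise prediction eps_pred and the
    cumulative schedule value alpha_bar (standard DDPM identity)."""
    s = math.sqrt(alpha_bar)
    n = math.sqrt(1.0 - alpha_bar)
    return [(xt - n * e) / s for xt, e in zip(x_t, eps_pred)]

def diffusion_consistency_loss(x_t, eps_pred, alpha_bar, x0_target):
    """Hypothetical DCL-style sketch: MSE between the denoised estimate
    and the ground-truth latent, applicable at any timestep."""
    x0_hat = predict_x0(x_t, eps_pred, alpha_bar)
    return sum((a - b) ** 2 for a, b in zip(x0_hat, x0_target)) / len(x0_hat)
```

With a perfect noise prediction the reconstructed latent matches the target exactly, so the loss vanishes at every timestep; this is what makes supervision "at any given time step" possible.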

Denoising Text-to-Image Generation

SteerNeRF: Accelerating NeRF Rendering via Smooth Viewpoint Trajectory

no code implementations • CVPR 2023 • Sicheng Li, Hao Li, Yue Wang, Yiyi Liao, Lu Yu

Neural Radiance Fields (NeRF) have demonstrated superior novel view synthesis performance but are slow at rendering.

Novel View Synthesis

Multi-Scale Spatial Temporal Graph Convolutional Network for Skeleton-Based Action Recognition

1 code implementation • 27 Jun 2022 • Zhan Chen, Sicheng Li, Bing Yang, Qinghan Li, Hong Liu

To solve this problem, we present a multi-scale spatial graph convolution (MS-GC) module and a multi-scale temporal graph convolution (MT-GC) module to enrich the receptive field of the model in spatial and temporal dimensions.
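As a loose illustration of the multi-scale idea (a sketch, not the paper's actual MS-GC module), aggregating joint features over 1..K-hop adjacencies enlarges the spatial receptive field of a graph convolution:

```python
def matmul(A, B):
    """Plain-Python matrix multiply for small skeleton graphs."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def binarize(M):
    """Reduce a path-count matrix to a 0/1 reachability adjacency."""
    return [[1 if v > 0 else 0 for v in row] for row in M]

def multi_scale_aggregate(A, X, num_scales):
    """Hypothetical multi-scale spatial aggregation: sum feature
    aggregations over 1..num_scales hop adjacencies, so each joint
    receives information from progressively farther neighbors."""
    out = [[0.0] * len(X[0]) for _ in X]
    Ak = A
    for _ in range(num_scales):
        AX = matmul(Ak, X)
        out = [[o + v for o, v in zip(ro, rv)] for ro, rv in zip(out, AX)]
        Ak = binarize(matmul(Ak, A))  # extend reach by one more hop
    return out
```

On a 3-joint chain, two scales let the end joint see both its direct neighbor and the joint two hops away, which is the receptive-field enrichment the snippet describes.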

Skeleton Based Action Recognition

MIA-Former: Efficient and Robust Vision Transformers via Multi-grained Input-Adaptation

no code implementations • 21 Dec 2021 • Zhongzhi Yu, Yonggan Fu, Sicheng Li, Chaojian Li, Yingyan Lin

ViTs are often too computationally expensive to deploy on real-world resource-constrained devices, due to (1) their complexity growing quadratically with the number of input tokens and (2) their overparameterized self-attention heads and model depth.

LotteryFL: Personalized and Communication-Efficient Federated Learning with Lottery Ticket Hypothesis on Non-IID Datasets

1 code implementation • 7 Aug 2020 • Ang Li, Jingwei Sun, Binghui Wang, Lin Duan, Sicheng Li, Yiran Chen, Hai Li

Rather than learning a shared global model in classic federated learning, each client learns a personalized model via LotteryFL; the communication cost can be significantly reduced due to the compact size of lottery networks.
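The communication saving comes from each client sending only a sparse lottery subnetwork rather than a full dense model. A minimal sketch, assuming simple magnitude pruning (the names and details here are illustrative, not LotteryFL's exact procedure):

```python
def magnitude_prune_mask(weights, keep_ratio):
    """Hypothetical lottery-ticket-style pruning: keep the largest-magnitude
    fraction of weights and zero out the rest."""
    k = max(1, int(len(weights) * keep_ratio))
    threshold = sorted((abs(w) for w in weights), reverse=True)[k - 1]
    return [1 if abs(w) >= threshold else 0 for w in weights]

def sparse_update(weights, mask):
    """Only the surviving (index, value) pairs need to be communicated,
    which is where the per-round communication cost shrinks."""
    return [(i, w) for i, (w, m) in enumerate(zip(weights, mask)) if m]
```

With a 50% keep ratio, each client's upload is roughly half the size of the dense update; smaller keep ratios compress further at the cost of model capacity.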

Federated Learning

MAT: A Multi-strength Adversarial Training Method to Mitigate Adversarial Attacks

no code implementations • 27 May 2017 • Chang Song, Hsin-Pai Cheng, Huanrui Yang, Sicheng Li, Chunpeng Wu, Qing Wu, Hai Li, Yiran Chen

Our experiments show that different adversarial strengths, i.e., perturbation levels of adversarial examples, have different working zones for resisting the attack.
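The notion of perturbation strength can be made concrete with an FGSM-style sketch (hypothetical; the paper's exact attack setup may differ), where each epsilon value produces adversarial examples of a different strength:

```python
def sign(v):
    """Sign of a scalar: -1, 0, or 1."""
    return (v > 0) - (v < 0)

def multi_strength_adversarial_examples(x, grad, epsilons):
    """FGSM-style perturbations at several strengths: for each epsilon,
    shift the input along the sign of the loss gradient."""
    return [[xi + eps * sign(gi) for xi, gi in zip(x, grad)] for eps in epsilons]
```

Training against examples drawn from several epsilon levels at once is the intuition behind multi-strength adversarial training: each strength covers a different "working zone" of the attack.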
