Search Results for author: Peiqin Sun

Found 5 papers, 3 papers with code

Three Guidelines You Should Know for Universally Slimmable Self-Supervised Learning

1 code implementation • CVPR 2023 • Yun-Hao Cao, Peiqin Sun, Shuchang Zhou

We propose universally slimmable self-supervised learning (dubbed US3L) to achieve better accuracy-efficiency trade-offs when deploying self-supervised models across different devices.

Instance Segmentation • object-detection • +3

Synergistic Self-supervised and Quantization Learning

1 code implementation • 12 Jul 2022 • Yun-Hao Cao, Peiqin Sun, Yechang Huang, Jianxin Wu, Shuchang Zhou

In this paper, we propose a method called synergistic self-supervised and quantization learning (SSQL) to pretrain quantization-friendly self-supervised models, facilitating downstream deployment.

Quantization • Self-Supervised Learning

FQ-ViT: Post-Training Quantization for Fully Quantized Vision Transformer

1 code implementation • 27 Nov 2021 • Yang Lin, Tianyu Zhang, Peiqin Sun, Zheng Li, Shuchang Zhou

Network quantization significantly reduces model inference complexity and has been widely used in real-world deployments.

Quantization

Optimal Quantization for Batch Normalization in Neural Network Deployments and Beyond

no code implementations • 30 Aug 2020 • Dachao Lin, Peiqin Sun, Guangzeng Xie, Shuchang Zhou, Zhihua Zhang

Quantized Neural Networks (QNNs) use low bit-width fixed-point numbers to represent weight parameters and activations, and are widely used in real-world applications for their savings in computation resources and reproducibility of results.
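The fixed-point representation this abstract refers to can be illustrated with a minimal symmetric per-tensor quantizer. This is a generic sketch of uniform quantization, not code from the paper; the function names and the 8-bit signed format are illustrative assumptions.

```python
import numpy as np

def quantize_uniform(x, num_bits=8):
    """Symmetric per-tensor quantization of a float array to signed
    fixed-point integers (illustrative, not the paper's scheme)."""
    qmax = 2 ** (num_bits - 1) - 1          # e.g. 127 for 8 bits
    scale = np.max(np.abs(x)) / qmax        # one scale for the whole tensor
    q = np.clip(np.round(x / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Map quantized integers back to approximate float values."""
    return q.astype(np.float32) * scale

w = np.array([-0.5, 0.0, 0.25, 0.5], dtype=np.float32)
q, s = quantize_uniform(w)
w_hat = dequantize(q, s)
# reconstruction error is bounded by half a quantization step
assert np.max(np.abs(w - w_hat)) <= s / 2 + 1e-8
```

Storing `q` plus a single `scale` replaces 32-bit floats with 8-bit integers, which is the source of the computation and memory savings the abstract mentions.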

Quantization
