Search Results for author: Zhenman Fang

Found 6 papers, 2 papers with code

Quasar-ViT: Hardware-Oriented Quantization-Aware Architecture Search for Vision Transformers

no code implementations • 25 Jul 2024 • Zhengang Li, Alec Lu, Yanyue Xie, Zhenglun Kong, Mengshu Sun, Hao Tang, Zhong Jia Xue, Peiyan Dong, Caiwen Ding, Yanzhi Wang, Xue Lin, Zhenman Fang

This work proposes Quasar-ViT, a hardware-oriented quantization-aware architecture search framework for ViTs, to design efficient ViT models for hardware implementation while preserving accuracy.

Quantization
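
The paper ships no code, but the central ingredient of quantization-aware architecture search — training with fake-quantized weights so that bit-width becomes a searchable dimension — can be sketched. Below is a minimal, hypothetical PyTorch illustration (the `FakeQuantLinear` name, per-tensor scaling, and the 4-bit default are assumptions, not Quasar-ViT's actual design):

```python
# Minimal sketch of symmetric uniform fake-quantization with a
# straight-through estimator (STE) -- an illustration of the kind of
# building block a quantization-aware search uses, not Quasar-ViT's code.
import torch
import torch.nn as nn

def fake_quantize(w: torch.Tensor, bits: int) -> torch.Tensor:
    qmax = 2 ** (bits - 1) - 1                    # e.g. 7 for 4-bit signed
    scale = w.abs().max().clamp(min=1e-8) / qmax  # per-tensor scale (assumption)
    q = torch.clamp(torch.round(w / scale), -qmax, qmax) * scale
    # STE: forward uses quantized weights, backward passes gradients through
    return w + (q - w).detach()

class FakeQuantLinear(nn.Linear):
    """Linear layer whose weights are fake-quantized to a searchable bit-width."""
    def __init__(self, in_f: int, out_f: int, bits: int = 4):
        super().__init__(in_f, out_f)
        self.bits = bits  # a per-layer dimension the architecture search can vary

    def forward(self, x):
        return nn.functional.linear(x, fake_quantize(self.weight, self.bits), self.bias)

layer = FakeQuantLinear(64, 64, bits=4)
y = layer(torch.randn(2, 64))  # forward pass with 4-bit fake-quantized weights
```

The straight-through estimator is what makes the non-differentiable rounding trainable, so the bit-width choice can be evaluated inside a search loop.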

HeatViT: Hardware-Efficient Adaptive Token Pruning for Vision Transformers

no code implementations • 15 Nov 2022 • Peiyan Dong, Mengshu Sun, Alec Lu, Yanyue Xie, Kenneth Liu, Zhenglun Kong, Xin Meng, Zhengang Li, Xue Lin, Zhenman Fang, Yanzhi Wang

While vision transformers (ViTs) have continuously achieved new milestones in the field of computer vision, their sophisticated network architectures with high computation and memory costs have impeded their deployment on resource-limited edge devices.

Quantization
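
As a rough illustration of the adaptive token pruning idea — not HeatViT's actual selector, whose head design and hardware mapping differ — here is a minimal PyTorch sketch in which a learned linear head scores patch tokens and only the top-k survive (the `TokenPruner` name and `keep_ratio` knob are hypothetical):

```python
# Minimal sketch of learned token pruning for a ViT block: a lightweight
# head scores each patch token, the top-k are kept, the class token always
# survives. Illustrative only.
import torch
import torch.nn as nn

class TokenPruner(nn.Module):
    def __init__(self, dim: int, keep_ratio: float = 0.7):
        super().__init__()
        self.score = nn.Linear(dim, 1)   # importance score per token
        self.keep_ratio = keep_ratio

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        cls_tok, patches = x[:, :1], x[:, 1:]            # (B,1,D), (B,N,D)
        s = self.score(patches).squeeze(-1)              # (B,N) token scores
        k = max(1, int(patches.shape[1] * self.keep_ratio))
        idx = s.topk(k, dim=1).indices.sort(dim=1).values  # keep original order
        idx = idx.unsqueeze(-1).expand(-1, -1, patches.shape[-1])
        kept = patches.gather(1, idx)                    # (B,k,D)
        return torch.cat([cls_tok, kept], dim=1)         # class token always kept

x = torch.randn(2, 1 + 196, 192)   # class token + 14x14 patch tokens
pruned = TokenPruner(192)(x)       # -> (2, 1 + 137, 192)
```

Dropping tokens early shrinks every downstream attention and MLP computation, which is where the edge-device savings come from.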

SuperYOLO: Super Resolution Assisted Object Detection in Multimodal Remote Sensing Imagery

1 code implementation • 27 Sep 2022 • Jiaqing Zhang, Jie Lei, Weiying Xie, Zhenman Fang, Yunsong Li, Qian Du

Furthermore, we design a simple and flexible SR branch to learn high-resolution (HR) feature representations that can discriminate small objects from vast backgrounds with low-resolution (LR) input, thus further improving the detection accuracy.

Real-Time Object Detection • Small Object Detection • +1
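
The general pattern of an auxiliary super-resolution branch — decode shared backbone features back to an HR image during training, then drop the branch at inference so detection pays no extra runtime cost — can be sketched as follows (a hypothetical illustration; SuperYOLO's actual branch design and loss weighting differ):

```python
# Minimal sketch of a training-only SR branch: it reconstructs an HR image
# from shared low-resolution backbone features and is supervised with an
# L1 loss; at inference the branch is removed. Illustrative only.
import torch
import torch.nn as nn

class SRBranch(nn.Module):
    """Upsamples backbone features (assumed stride 8) back to input resolution."""
    def __init__(self, in_ch: int, scale: int = 8):
        super().__init__()
        self.decode = nn.Sequential(
            nn.Conv2d(in_ch, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 3 * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),   # rearrange channels into HR pixels
        )

    def forward(self, feat):
        return self.decode(feat)

feat = torch.randn(2, 128, 40, 40)       # shared LR backbone feature map
hr_target = torch.randn(2, 3, 320, 320)  # HR ground-truth image
sr_loss = nn.L1Loss()(SRBranch(128)(feat), hr_target)  # auxiliary training loss
```

Because the SR supervision only shapes the shared features, the detector keeps its LR-input speed while gaining HR-aware representations.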

Auto-ViT-Acc: An FPGA-Aware Automatic Acceleration Framework for Vision Transformer with Mixed-Scheme Quantization

no code implementations • 10 Aug 2022 • Zhengang Li, Mengshu Sun, Alec Lu, Haoyu Ma, Geng Yuan, Yanyue Xie, Hao Tang, Yanyu Li, Miriam Leeser, Zhangyang Wang, Xue Lin, Zhenman Fang

Compared with state-of-the-art ViT quantization work (algorithmic approach only, without hardware acceleration), our quantization achieves 0.47% to 1.36% higher Top-1 accuracy under the same bit-width.

Quantization
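
One half of a mixed-scheme quantizer is power-of-two quantization, which turns FPGA multiplications into bit shifts; a minimal sketch of that piece is below (illustrative only — the paper's per-layer scheme assignment and bit-width selection are not shown, and `po2_quantize` is a hypothetical helper):

```python
# Minimal sketch of power-of-two weight quantization: each weight is
# rounded to +/- 2^e within a limited exponent range, so a hardware
# multiply reduces to a shift. Not Auto-ViT-Acc's implementation.
import torch

def po2_quantize(w: torch.Tensor, bits: int = 4) -> torch.Tensor:
    sign = torch.sign(w)
    mag = w.abs().clamp(min=1e-8)
    e = torch.round(torch.log2(mag))                          # nearest exponent
    e = torch.clamp(e, min=e.max() - (2 ** (bits - 1) - 1))   # bounded exponent range
    q = sign * torch.pow(2.0, e)
    return w + (q - w).detach()   # straight-through estimator for training

w = torch.randn(8, 8)
wq = po2_quantize(w)   # every nonzero entry is +/- a power of two
```

Mixing this shift-friendly scheme with ordinary fixed-point quantization lets an FPGA design spend DSP multipliers only where fixed-point is assigned.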

FitAct: Error Resilient Deep Neural Networks via Fine-Grained Post-Trainable Activation Functions

1 code implementation • 27 Dec 2021 • Behnam Ghavami, Mani Sadati, Zhenman Fang, Lesley Shannon

Deep neural networks (DNNs) are increasingly being deployed in safety-critical systems such as personal healthcare devices and self-driving cars.

Self-Driving Cars
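
The underlying idea — bound each activation with a learnable clipping threshold so a single flipped high-order bit cannot propagate an arbitrarily large value — can be sketched minimally (the per-channel granularity and `BoundedReLU` name are assumptions; FitAct's released code uses its own fine-grained formulation):

```python
# Minimal sketch of a bounded activation with a learnable clipping
# threshold: outputs are confined to [0, alpha] per channel, limiting the
# damage a weight/activation bit flip can do. In a post-training setup,
# only `alpha` would be fine-tuned while pretrained weights stay frozen.
import torch
import torch.nn as nn

class BoundedReLU(nn.Module):
    def __init__(self, num_channels: int, init_bound: float = 6.0):
        super().__init__()
        # one learnable bound per channel; finer granularities are possible
        self.alpha = nn.Parameter(torch.full((num_channels,), init_bound))

    def forward(self, x):  # x: (B, C, H, W)
        bound = self.alpha.view(1, -1, 1, 1)
        return torch.minimum(torch.relu(x), bound)

act = BoundedReLU(16)
y = act(torch.randn(2, 16, 8, 8))   # outputs lie in [0, alpha_c] per channel
```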

BDFA: A Blind Data Adversarial Bit-flip Attack on Deep Neural Networks

no code implementations • 7 Dec 2021 • Behnam Ghavami, Mani Sadati, Mohammad Shahidzadeh, Zhenman Fang, Lesley Shannon

Adversarial bit-flip attack (BFA) on Neural Network weights can result in catastrophic accuracy degradation by flipping a very small number of bits.
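
To make the attack model concrete, here is a minimal sketch of injecting a single bit flip into a float32 weight tensor by reinterpreting its storage as int32 (illustrative only; BDFA's contribution is locating the most damaging bits without access to training data, which this snippet does not attempt):

```python
# Minimal sketch of a single weight bit flip: the float32 tensor is viewed
# as int32 so one chosen bit can be XOR-ed in place, mimicking the fault a
# rowhammer-style attack injects.
import torch

def flip_bit(weights: torch.Tensor, index: int, bit: int) -> None:
    """Flip `bit` (0 = LSB) of the flat `index`-th float32 weight, in place."""
    as_int = weights.view(torch.int32).view(-1)  # reinterpret the raw 4-byte floats
    as_int[index] ^= (1 << bit)                  # writes through shared storage

w = torch.randn(4, 4)
print(w.view(-1)[5].item())
flip_bit(w, index=5, bit=30)   # flip a high exponent bit
print(w.view(-1)[5].item())    # value changes drastically
```

A flip in a high exponent bit can change a weight by orders of magnitude, which is why a handful of well-chosen flips suffices for catastrophic degradation.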
