Search Results for author: Zhijun Tu

Found 11 papers, 6 papers with code

Effective Diffusion Transformer Architecture for Image Super-Resolution

1 code implementation • 29 Sep 2024 • Kun Cheng, Lei Yu, Zhijun Tu, Xiao He, Liyu Chen, Yong Guo, Mingrui Zhu, Nannan Wang, Xinbo Gao, Jie Hu

In this work, we design an effective diffusion transformer for image super-resolution (DiT-SR) that achieves the visual quality of prior-based methods, but is trained from scratch.

Image Generation • Image Super-Resolution

One Step Diffusion-based Super-Resolution with Time-Aware Distillation

1 code implementation • 14 Aug 2024 • Xiao He, Huaao Tang, Zhijun Tu, Junchao Zhang, Kun Cheng, Hanting Chen, Yong Guo, Mingrui Zhu, Nannan Wang, Xinbo Gao, Jie Hu

Specifically, we introduce a novel score distillation strategy to align the data distribution between the outputs of the student and teacher models after minor noise perturbation.
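As a rough illustration of distillation under minor noise perturbation, the sketch below perturbs both the student's and the teacher's super-resolved outputs with the same small noise and aligns the teacher denoiser's predictions on each; `teacher_denoiser`, the noise scale, and all shapes are illustrative assumptions, not the paper's actual score-distillation objective.

```python
import torch
import torch.nn.functional as F

def time_aware_distill_loss(student_sr, teacher_sr, teacher_denoiser, t=0.1):
    # Perturb both outputs with the same small Gaussian noise, then
    # align the teacher denoiser's predictions on the two noisy inputs.
    noise = torch.randn_like(teacher_sr)
    noisy_student = student_sr + t * noise
    noisy_teacher = teacher_sr + t * noise
    pred_student = teacher_denoiser(noisy_student, t)
    with torch.no_grad():  # no gradients through the teacher branch
        pred_teacher = teacher_denoiser(noisy_teacher, t)
    return F.mse_loss(pred_student, pred_teacher)

# Toy stand-in: a real teacher would be a pretrained diffusion denoiser.
teacher_denoiser = lambda x, t: x
student_out = torch.rand(1, 3, 64, 64, requires_grad=True)
teacher_out = torch.rand(1, 3, 64, 64)
loss = time_aware_distill_loss(student_out, teacher_out, teacher_denoiser)
loss.backward()
```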

Image Super-Resolution • Knowledge Distillation

U-DiTs: Downsample Tokens in U-Shaped Diffusion Transformers

1 code implementation • 4 May 2024 • Yuchuan Tian, Zhijun Tu, Hanting Chen, Jie Hu, Chao Xu, Yunhe Wang

Diffusion Transformers (DiTs) introduce the transformer architecture to diffusion tasks for latent-space image generation.
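The title's core idea, running self-attention over a downsampled token grid to cut cost, can be sketched generically as below; the block layout, pooling choice, and dimensions are placeholders rather than the actual U-DiT design.

```python
import torch
import torch.nn as nn

class DownsampledSelfAttention(nn.Module):
    """Self-attention over a 2x-downsampled token grid, then upsampled
    back: a generic sketch of attending over fewer tokens, not the
    exact U-DiT block."""
    def __init__(self, dim, num_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.down = nn.AvgPool2d(2)
        self.up = nn.Upsample(scale_factor=2, mode="nearest")

    def forward(self, x):                        # x: (B, C, H, W) token grid
        b, c, h, w = x.shape
        y = self.down(x)                         # (B, C, H/2, W/2)
        tokens = y.flatten(2).transpose(1, 2)    # (B, H*W/4, C)
        tokens, _ = self.attn(tokens, tokens, tokens)
        y = tokens.transpose(1, 2).reshape(b, c, h // 2, w // 2)
        return x + self.up(y)                    # residual at full resolution

block = DownsampledSelfAttention(dim=64)
out = block(torch.rand(2, 64, 16, 16))
print(out.shape)  # torch.Size([2, 64, 16, 16])
```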

Image Generation • Inductive Bias

LIPT: Latency-aware Image Processing Transformer

no code implementations • 9 Apr 2024 • Junbo Qiao, Wei Li, Haizhen Xie, Hanting Chen, Yunshuai Zhou, Zhijun Tu, Jie Hu, Shaohui Lin

Extensive experiments on multiple image processing tasks (e.g., image super-resolution (SR), JPEG artifact reduction, and image denoising) demonstrate the superiority of LIPT in both latency and PSNR.

Image Denoising • Image Super-Resolution

IPT-V2: Efficient Image Processing Transformer using Hierarchical Attentions

no code implementations • 31 Mar 2024 • Zhijun Tu, Kunpeng Du, Hanting Chen, Hailing Wang, Wei Li, Jie Hu, Yunhe Wang

Recent advances have demonstrated the powerful capability of the transformer architecture in image restoration.

Deblurring • Denoising • +3

A Survey on Transformer Compression

no code implementations • 5 Feb 2024 • Yehui Tang, Yunhe Wang, Jianyuan Guo, Zhijun Tu, Kai Han, Hailin Hu, DaCheng Tao

Model compression methods reduce the memory and computational cost of Transformers, a necessary step for deploying large language/vision models on practical devices.

Knowledge Distillation • Mamba • +3

CBQ: Cross-Block Quantization for Large Language Models

no code implementations • 13 Dec 2023 • Xin Ding, Xiaoyu Liu, Zhijun Tu, Yun Zhang, Wei Li, Jie Hu, Hanting Chen, Yehui Tang, Zhiwei Xiong, Baoqun Yin, Yunhe Wang

Post-training quantization (PTQ) has played a key role in compressing large language models (LLMs) at ultra-low cost.
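For context on the basic PTQ step that methods like CBQ build on, here is a minimal symmetric per-tensor weight fake-quantizer; this is a textbook min-max baseline, not CBQ's cross-block reconstruction.

```python
import torch

def quantize_weight_minmax(w, num_bits=4):
    """Symmetric per-tensor min-max fake quantization: round weights to
    a signed integer grid, then dequantize. Returns the fake-quantized
    tensor and its scale."""
    qmax = 2 ** (num_bits - 1) - 1
    scale = w.abs().max() / qmax
    q = torch.clamp(torch.round(w / scale), -qmax - 1, qmax)
    return q * scale, scale

w = torch.randn(4096, 4096)
w_q, scale = quantize_weight_minmax(w, num_bits=4)
print((w - w_q).abs().mean())  # mean quantization error
```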

Quantization

Data Upcycling Knowledge Distillation for Image Super-Resolution

1 code implementation • 25 Sep 2023 • Yun Zhang, Wei Li, Simiao Li, Hanting Chen, Zhijun Tu, Wenjia Wang, BingYi Jing, Shaohui Lin, Jie Hu

Knowledge distillation (KD) compresses deep neural networks by transferring task-related knowledge from cumbersome pre-trained teacher models to compact student models.
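As a baseline picture of what output-level KD looks like for super-resolution, a sketch follows; the L1 losses and the weighting `alpha` are assumptions, and the paper's data-upcycling component is not modeled here.

```python
import torch
import torch.nn.functional as F

def kd_sr_loss(student_sr, teacher_sr, hr_target, alpha=0.5):
    # Reconstruction loss against the ground-truth high-resolution image...
    task_loss = F.l1_loss(student_sr, hr_target)
    # ...plus an imitation loss against the (frozen) teacher output.
    distill_loss = F.l1_loss(student_sr, teacher_sr.detach())
    return (1 - alpha) * task_loss + alpha * distill_loss

student_out = torch.rand(1, 3, 128, 128, requires_grad=True)
teacher_out = torch.rand(1, 3, 128, 128)
hr = torch.rand(1, 3, 128, 128)
kd_sr_loss(student_out, teacher_out, hr).backward()
```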

Image Super-Resolution • Knowledge Distillation • +1

Toward Accurate Post-Training Quantization for Image Super Resolution

2 code implementations • CVPR 2023 • Zhijun Tu, Jie Hu, Hanting Chen, Yunhe Wang

In this paper, we study post-training quantization (PTQ) for image super-resolution using only a few unlabeled calibration images.
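A typical first step in calibration-based PTQ is to pass the few unlabeled images through the network and record per-layer activation ranges for later quantizer setup; the sketch below shows that generic pass (the toy model, layer selection, and ranges dict are illustrative, not the paper's pipeline).

```python
import torch
import torch.nn as nn

@torch.no_grad()
def calibrate_activation_ranges(model, calib_images):
    """Record per-Conv2d activation (min, max) over a few calibration
    images, using forward hooks."""
    ranges = {}
    hooks = []

    def make_hook(name):
        def hook(module, inp, out):
            lo, hi = out.min().item(), out.max().item()
            old = ranges.get(name, (lo, hi))
            ranges[name] = (min(old[0], lo), max(old[1], hi))
        return hook

    for name, m in model.named_modules():
        if isinstance(m, nn.Conv2d):
            hooks.append(m.register_forward_hook(make_hook(name)))
    for img in calib_images:
        model(img)
    for h in hooks:
        h.remove()
    return ranges

# Toy SR-style model and a few random "calibration images".
model = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                      nn.Conv2d(16, 3, 3, padding=1))
calib = [torch.rand(1, 3, 32, 32) for _ in range(4)]
print(calibrate_activation_ranges(model, calib))
```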

Image Super-Resolution • Quantization

AdaBin: Improving Binary Neural Networks with Adaptive Binary Sets

3 code implementations • 17 Aug 2022 • Zhijun Tu, Xinghao Chen, Pengju Ren, Yunhe Wang

Since modern deep neural networks adopt sophisticated designs with complex architectures for the sake of accuracy, the distributions of their weights and activations are highly diverse.
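To make the "adaptive binary sets" idea concrete, here is a simplified forward-pass binarizer that maps weights to {beta − alpha, beta + alpha} rather than the fixed {−1, +1}; taking beta as the mean and alpha as the mean absolute deviation is an illustrative simplification, not how AdaBin actually fits these values.

```python
import torch

def adaptive_binarize(w):
    # Recenter by beta and rescale by alpha so the two binary levels
    # adapt to this tensor's distribution instead of being fixed at +/-1.
    beta = w.mean()
    alpha = (w - beta).abs().mean()
    return beta + alpha * torch.where(w >= beta, 1.0, -1.0)

w = torch.randn(64, 64)
w_bin = adaptive_binarize(w)
print(torch.unique(w_bin))  # exactly two values: beta - alpha, beta + alpha
```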

Classification with Binary Neural Network • Quantization

Superconductivity and normal-state properties of kagome metal RbV3Sb5 single crystals

no code implementations • 25 Jan 2021 • Qiangwei Yin, Zhijun Tu, Chunsheng Gong, Yang Fu, Shaohua Yan, Hechang Lei

We report the discovery of superconductivity and detailed normal-state physical properties of RbV3Sb5 single crystals with a V kagome lattice.

Superconductivity • Materials Science
