Search Results for author: Pingcheng Dong

Found 3 papers, 2 papers with code

Boundary and Relation Distillation for Semantic Segmentation

no code implementations • 24 Jan 2024 • Dong Zhang, Pingcheng Dong, Xinting Hu, Long Chen, Kwang-Ting Cheng

Concurrently, the relation distillation transfers implicit relations from the teacher model to the student model using pixel-level self-relation as a bridge, ensuring that the student's mask has strong target region connectivity.

Implicit Relations · Knowledge Distillation +2
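The pixel-level self-relation described above can be sketched as a pairwise-similarity matrix over pixel features, with the student trained to match the teacher's matrix. This is a minimal illustration in plain Python, assuming cosine-similarity relations and an MSE matching loss; the function names and loss choice are hypothetical, not the paper's exact formulation.

```python
def self_relation(features):
    """Pairwise cosine similarity among a list of pixel feature vectors."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def norm(a):
        # Guard against zero vectors to avoid division by zero.
        return dot(a, a) ** 0.5 or 1.0

    n = len(features)
    return [
        [dot(features[i], features[j]) / (norm(features[i]) * norm(features[j]))
         for j in range(n)]
        for i in range(n)
    ]

def relation_distill_loss(teacher_feats, student_feats):
    """Mean squared difference between teacher and student self-relation matrices."""
    rt = self_relation(teacher_feats)
    rs = self_relation(student_feats)
    n = len(rt)
    return sum(
        (rt[i][j] - rs[i][j]) ** 2 for i in range(n) for j in range(n)
    ) / (n * n)
```

Because the loss compares relations rather than raw features, the student is pushed to reproduce how the teacher's pixels relate to one another, which is what the abstract credits for the student's target-region connectivity.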

LLM-FP4: 4-Bit Floating-Point Quantized Transformers

1 code implementation • 25 Oct 2023 • Shih-Yang Liu, Zechun Liu, Xijie Huang, Pingcheng Dong, Kwang-Ting Cheng

Our method, for the first time, can quantize both weights and activations in the LLaMA-13B to only 4-bit and achieves an average score of 63.1 on the common sense zero-shot reasoning tasks, which is only 5.8 lower than the full-precision model, significantly outperforming the previous state-of-the-art by 12.7 points.

Common Sense Reasoning · Quantization
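The core idea in 4-bit floating-point quantization is rounding each value to the nearest member of a tiny FP grid. A minimal sketch, assuming an E2M1-style format (1 sign, 2 exponent, 1 mantissa bit) and a simple per-tensor scale; LLM-FP4's actual format search and scaling are more involved, and the function names here are hypothetical.

```python
def fp4_grid():
    """Non-negative magnitudes representable in an assumed E2M1-style FP4 format."""
    mags = {0.0, 0.5}  # zero and the subnormal step
    for e in range(1, 4):      # exponent field values 1..3
        for m in (0, 1):       # single mantissa bit
            mags.add((1 + m / 2) * 2.0 ** (e - 1))
    return sorted(mags)  # [0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0]

def fp4_quantize(x, scale=1.0):
    """Round x/scale to the nearest FP4 magnitude, preserving the sign."""
    grid = fp4_grid()
    mag = abs(x) / scale
    q = min(grid, key=lambda g: abs(g - mag))
    return (q if x >= 0 else -q) * scale
```

Note that the representable magnitudes are non-uniformly spaced (denser near zero), which is what distinguishes floating-point quantization from the uniform integer grids of INT4 methods.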
