Search Results for author: Haocheng Xi

Found 3 papers, 2 papers with code

Jetfire: Efficient and Accurate Transformer Pretraining with INT8 Data Flow and Per-Block Quantization

no code implementations • 19 Mar 2024 • Haocheng Xi, Yuxiang Chen, Kang Zhao, Kaijun Zheng, Jianfei Chen, Jun Zhu

Moreover, for a standard transformer block, our method offers an end-to-end training speedup of 1.42x and a 1.49x memory reduction compared to the FP16 baseline.

Quantization
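
To make the per-block quantization idea in the Jetfire title concrete, here is a minimal NumPy sketch of quantizing a tensor to INT8 with one scale per block. The block size of 32 and the symmetric round-to-nearest scheme are illustrative assumptions, not details taken from the paper.

import numpy as np

def quantize_per_block(x: np.ndarray, block_size: int = 32):
    """Quantize a 2-D tensor to INT8 with one scale per (block_size x block_size) block."""
    rows, cols = x.shape
    assert rows % block_size == 0 and cols % block_size == 0
    q = np.empty_like(x, dtype=np.int8)
    scales = np.empty((rows // block_size, cols // block_size), dtype=np.float32)
    for i in range(0, rows, block_size):
        for j in range(0, cols, block_size):
            block = x[i:i + block_size, j:j + block_size]
            # One scale per block limits the blast radius of outliers.
            scale = np.abs(block).max() / 127.0 + 1e-12  # epsilon avoids division by zero
            scales[i // block_size, j // block_size] = scale
            q[i:i + block_size, j:j + block_size] = np.clip(
                np.round(block / scale), -127, 127
            ).astype(np.int8)
    return q, scales

def dequantize_per_block(q: np.ndarray, scales: np.ndarray, block_size: int = 32):
    """Reconstruct an approximate FP32 tensor from INT8 blocks and their scales."""
    x = q.astype(np.float32)
    for i in range(scales.shape[0]):
        for j in range(scales.shape[1]):
            x[i * block_size:(i + 1) * block_size,
              j * block_size:(j + 1) * block_size] *= scales[i, j]
    return x

Compared with a single per-tensor scale, per-block scales keep a large outlier in one block from crushing the resolution of every other block, which is the usual motivation for this granularity.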

T-Rex: Text-assisted Retrosynthesis Prediction

1 code implementation • 26 Jan 2024 • Yifeng Liu, Hanwen Xu, Tangqi Fang, Haocheng Xi, Zixuan Liu, Sheng Zhang, Hoifung Poon, Sheng Wang

As a fundamental task in computational chemistry, retrosynthesis prediction aims to identify a set of reactants to synthesize a target molecule.

Re-Ranking, Retrosynthesis

Training Transformers with 4-bit Integers

1 code implementation • NeurIPS 2023 • Haocheng Xi, Changhao Li, Jianfei Chen, Jun Zhu

To achieve this, we carefully analyze the specific structures of activation and gradients in transformers to propose dedicated quantizers for them.

Image Classification, Machine Translation, +1
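
As a baseline illustration of what training with 4-bit integers involves, here is a simple symmetric INT4 quantizer on the 15 levels [-7, 7]. The paper's dedicated activation and gradient quantizers are more elaborate than this, so treat this only as an assumed minimal sketch of the underlying quantize/dequantize step.

import numpy as np

def quantize_int4(x: np.ndarray):
    """Map a float tensor onto the symmetric INT4 grid [-7, 7]."""
    scale = np.abs(x).max() / 7.0 + 1e-12  # epsilon avoids division by zero
    # Values are stored in int8 containers since NumPy has no native int4 dtype.
    q = np.clip(np.round(x / scale), -7, 7).astype(np.int8)
    return q, scale

def dequantize_int4(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximate FP32 tensor from INT4 codes and their scale."""
    return q.astype(np.float32) * scale

# Usage: measure the round-trip quantization error on a random tensor.
x = np.random.randn(4, 4).astype(np.float32)
q, s = quantize_int4(x)
print(np.abs(dequantize_int4(q, s) - x).max())

With only 15 usable levels, the choice of scale dominates accuracy, which is why the paper analyzes the structure of activations and gradients before designing quantizers for each.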
