Search Results for author: Jingqiao Zhang

Found 5 papers, 2 papers with code

Progressively Stacking 2.0: A Multi-stage Layerwise Training Method for BERT Training Speedup

no code implementations • 27 Nov 2020 • Cheng Yang, Shengnan Wang, Chao Yang, Yuechuan Li, Ru He, Jingqiao Zhang

In BERT training, the backward computation is much more time-consuming than the forward computation, especially in the distributed training setting in which the backward computation time further includes the communication time for gradient synchronization.
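As a rough illustration of the layerwise-stacking idea behind this line of work (a minimal sketch, not the paper's actual implementation): a shallow model is trained first, and its layer weights are duplicated to warm-start a model twice as deep, so the early, expensive epochs are spent on fewer layers. The `progressive_stack` helper and the toy weight matrices below are hypothetical.

```python
import numpy as np

def progressive_stack(layer_weights):
    """Warm-start a model twice as deep by duplicating trained layer weights.

    `layer_weights` is a hypothetical list of per-layer weight arrays from a
    trained shallow model; progressive stacking copies the bottom L layers on
    top of themselves to initialize a 2L-layer model.
    """
    return [w.copy() for w in layer_weights] + [w.copy() for w in layer_weights]

# A toy 3-layer "model" stacked into a 6-layer one.
shallow = [np.random.randn(4, 4) for _ in range(3)]
deep = progressive_stack(shallow)
assert len(deep) == 2 * len(shallow)
assert np.allclose(deep[0], deep[3])  # layer i and layer i+L share weights
```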

CoRe: An Efficient Coarse-refined Training Framework for BERT

no code implementations • 27 Nov 2020 • Cheng Yang, Shengnan Wang, Yuechuan Li, Chao Yang, Ming Yan, Jingqiao Zhang, Fangquan Lin

In the second phase, we transform the trained relaxed BERT model back into the original BERT structure and further retrain the model.

SAS: Self-Augmentation Strategy for Language Model Pre-training

1 code implementation • 14 Jun 2021 • Yifei Xu, Jingqiao Zhang, Ru He, Liangzhu Ge, Chao Yang, Cheng Yang, Ying Nian Wu

In this paper, we propose a self-augmentation strategy (SAS) where a single network is utilized for both regular pre-training and contextualized data augmentation for the training in later epochs.

Data Augmentation · Language Modelling · +2
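To make the single-network idea concrete, here is a minimal, hypothetical sketch: masked positions are filled with the model's own sampled predictions (stubbed here as a callable), producing a corrupted sequence plus replaced-token-detection labels for a later epoch. The function name and `model_sample` argument are assumptions, not the paper's API.

```python
import random

def self_augment(tokens, model_sample, mask_prob=0.15):
    """SAS-style contextualized augmentation sketch.

    `model_sample` is a hypothetical callable (position -> token) standing in
    for the network's own prediction; replaced positions get label 1 for a
    replaced-token-detection objective, kept positions get label 0.
    """
    corrupted, labels = [], []
    for i, tok in enumerate(tokens):
        if random.random() < mask_prob:
            new = model_sample(i)
            corrupted.append(new)
            labels.append(int(new != tok))  # 1 = token was replaced
        else:
            corrupted.append(tok)
            labels.append(0)
    return corrupted, labels

random.seed(0)
toks = ["the", "cat", "sat", "on", "the", "mat"]
corrupted, labels = self_augment(toks, model_sample=lambda i: "dog")
assert len(corrupted) == len(toks) and len(labels) == len(toks)
```

Using one network for both roles avoids training and serving a separate generator, which is the efficiency argument the abstract alludes to.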

Improving Contrastive Learning of Sentence Embeddings with Case-Augmented Positives and Retrieved Negatives

1 code implementation • 6 Jun 2022 • Wei Wang, Liangzhu Ge, Jingqiao Zhang, Cheng Yang

Following SimCSE, contrastive-learning-based methods have achieved state-of-the-art (SOTA) performance in learning sentence embeddings.

Attribute · Contrastive Learning · +5
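For readers unfamiliar with the SimCSE setup this work builds on, here is a minimal NumPy sketch of the standard in-batch contrastive (InfoNCE) loss over sentence embeddings: row i of `positives` is the positive for row i of `anchors`, and every other row in the batch serves as a negative. This is the generic loss only, not the paper's case-augmented or retrieved-negative variants.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.05):
    """SimCSE-style contrastive loss with in-batch negatives.

    anchors, positives: (batch, dim) embedding matrices; the diagonal of the
    similarity matrix holds the positive pairs.
    """
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    sims = a @ p.T / temperature                      # cosine similarity logits
    logits = sims - sims.max(axis=1, keepdims=True)   # numeric stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))               # -log p(positive | anchor)

rng = np.random.default_rng(0)
emb = rng.normal(size=(8, 16))
# Perfectly aligned positive pairs yield a lower loss than random pairs.
loss_aligned = info_nce_loss(emb, emb)
loss_random = info_nce_loss(emb, rng.normal(size=(8, 16)))
assert loss_aligned < loss_random
```

The quality of the positives and negatives fed into this loss is exactly what the paper's case augmentation and retrieval strategies aim to improve.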

GUIM -- General User and Item Embedding with Mixture of Representation in E-commerce

no code implementations • 2 Jul 2022 • Chao Yang, Ru He, Fangquan Lin, Suoyuan Song, Jingqiao Zhang, Cheng Yang

Our goal is to build general representation (embedding) for each user and each product item across Alibaba's businesses, including Taobao and Tmall which are among the world's biggest e-commerce websites.

Contrastive Learning · Marketing
