Search Results for author: Ruikun Luo

Found 2 papers, 0 papers with code

Bi-Granularity Contrastive Learning for Post-Training in Few-Shot Scene

no code implementations • Findings (ACL) 2021 • Ruikun Luo, Guanhuan Huang, Xiaojun Quan

The major paradigm of applying a pre-trained language model to downstream tasks is to fine-tune it on labeled task data, which often suffers from instability and low performance when labeled examples are scarce. One way to alleviate this problem is to apply post-training on unlabeled task data before fine-tuning, adapting the pre-trained model to target domains via contrastive learning that considers either token-level or sequence-level similarity.

Contrastive Learning • Data Augmentation • +2
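The sequence-level contrastive objective mentioned in the abstract can be sketched with a generic in-batch InfoNCE loss. This is an illustrative reconstruction, not necessarily the paper's exact formulation; the function name and shapes are assumptions:

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """Generic sequence-level contrastive (InfoNCE) loss: each anchor's
    positive is the matching row of `positives`; the remaining rows in
    the batch serve as negatives. Illustrative sketch only."""
    # L2-normalize so the dot product is cosine similarity
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature                 # (B, B) similarities
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # cross-entropy where the diagonal (matched pair) is the correct class
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
loss_aligned = info_nce_loss(x, x)                      # identical pairs
loss_random = info_nce_loss(x, rng.normal(size=(4, 8)))  # unrelated pairs
```

Aligned anchor/positive pairs should yield a much lower loss than random pairings, which is the signal post-training exploits on unlabeled task data.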

Multi-Task Regularization with Covariance Dictionary for Linear Classifiers

no code implementations • 21 Oct 2013 • Fanyi Xiao, Ruikun Luo, Zhiding Yu

In this paper we propose a multi-task linear classifier learning problem called D-SVM (Dictionary SVM).

Transfer Learning
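The abstract does not spell out the covariance-dictionary details of D-SVM, so the following is a generic multi-task linear-SVM sketch (not the paper's method): each task's hinge loss is augmented with a coupling penalty that pulls task weights toward their shared mean. All names and hyperparameters are illustrative:

```python
import numpy as np

def multitask_svm_fit(tasks, lam=0.1, mu=0.5, lr=0.05, epochs=300):
    """Generic multi-task linear SVM via subgradient descent (NOT the
    paper's D-SVM). Objective per task t:
        hinge(X_t, y_t, w_t) + lam*||w_t||^2 + mu*||w_t - w_bar||^2
    where w_bar is the mean weight vector shared across tasks.
    `tasks` is a list of (X, y) pairs with labels in {-1, +1}."""
    d = tasks[0][0].shape[1]
    W = np.zeros((len(tasks), d))
    for _ in range(epochs):
        w_bar = W.mean(axis=0)              # shared structure across tasks
        for t, (X, y) in enumerate(tasks):
            margins = y * (X @ W[t])
            active = margins < 1            # points violating the margin
            grad = -(y[active, None] * X[active]).sum(axis=0) / len(y)
            grad += 2 * lam * W[t] + 2 * mu * (W[t] - w_bar)
            W[t] -= lr * grad
    return W

# Two related tasks: same underlying separator plus small perturbations
rng = np.random.default_rng(1)
w_true = np.array([1.0, -1.0, 0.5])
tasks = []
for _ in range(2):
    X = rng.normal(size=(60, 3))
    y = np.sign(X @ (w_true + 0.1 * rng.normal(size=3)))
    tasks.append((X, y))
W = multitask_svm_fit(tasks, np.eye(3).shape[0] * 0 + 0.1)  # lam=0.1
acc = np.mean(np.sign(tasks[0][0] @ W[0]) == tasks[0][1])
```

The mean-coupling penalty is one standard way to share structure across linear classifiers; the paper's covariance dictionary presumably plays an analogous coupling role.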
