Search Results for author: Peng Su

Found 9 papers, 2 papers with code

Improving BERT Model Using Contrastive Learning for Biomedical Relation Extraction

1 code implementation • 28 Apr 2021 • Peng Su, Yifan Peng, K. Vijay-Shanker

In this work, we explore the method of employing contrastive learning to improve the text representation from the BERT model for relation extraction.

Contrastive Learning · Data Augmentation +1
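
The snippet above only names the general approach, so here is a rough, hypothetical sketch of what a contrastive objective on BERT sentence representations can look like, using an NT-Xent/InfoNCE-style loss and the Hugging Face transformers API. The model name, [CLS] pooling, temperature, and augmented "views" are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only: a generic NT-Xent contrastive loss on BERT [CLS]
# embeddings of two "views" of each sentence (e.g. produced by data
# augmentation). Model name, pooling, and temperature are assumptions.
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

def embed(sentences):
    """Encode a list of sentences into L2-normalized [CLS] vectors."""
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    cls = encoder(**batch).last_hidden_state[:, 0]        # [CLS] token
    return F.normalize(cls, dim=-1)

def nt_xent_loss(z1, z2, temperature=0.1):
    """Contrastive loss pulling matching rows of z1/z2 together and
    pushing all other pairs in the batch apart."""
    logits = z1 @ z2.t() / temperature                    # cosine similarities
    targets = torch.arange(z1.size(0))
    return F.cross_entropy(logits, targets)

# The two views would normally come from a data-augmentation step;
# these paraphrased sentence pairs are placeholders for illustration.
views_a = ["<GENE1> interacts with <GENE2>.", "MTOR phosphorylates AKT1."]
views_b = ["<GENE1> binds <GENE2>.", "AKT1 is phosphorylated by MTOR."]
loss = nt_xent_loss(embed(views_a), embed(views_b))
loss.backward()  # gradients flow back into the BERT encoder
```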

Gradient Regularized Contrastive Learning for Continual Domain Adaptation

no code implementations • 23 Mar 2021 • Shixiang Tang, Peng Su, Dapeng Chen, Wanli Ouyang

To better understand this issue, we study the problem of continual domain adaptation, where the model is presented with a labelled source domain and a sequence of unlabelled target domains.

Contrastive Learning · Domain Adaptation

Modal Uncertainty Estimation via Discrete Latent Representations

no code implementations • 1 Jan 2021 • Di Qiu, Zhanghan Ke, Peng Su, Lok Ming Lui

Many important problems in the real world don't have unique solutions.

Investigation of BERT Model on Biomedical Relation Extraction Based on Revised Fine-tuning Mechanism

no code implementations • 1 Nov 2020 • Peng Su, K. Vijay-Shanker

In this paper, we will investigate the method of utilizing the entire layer in the fine-tuning process of the BERT model.

Relation Classification

Contrastive Visual-Linguistic Pretraining

no code implementations • 26 Jul 2020 • Lei Shi, Kai Shuang, Shijie Geng, Peng Su, Zhengkai Jiang, Peng Gao, Zuohui Fu, Gerard de Melo, Sen Su

We evaluate CVLP on several downstream tasks, including VQA, GQA and NLVR2, to validate the superiority of contrastive learning on multi-modality representation learning.

Contrastive Learning · Representation Learning +1

Gradient Regularized Contrastive Learning for Continual Domain Adaptation

no code implementations • 25 Jul 2020 • Peng Su, Shixiang Tang, Peng Gao, Di Qiu, Ni Zhao, Xiaogang Wang

At the core of our method, gradient regularization plays two key roles: (1) it enforces that the gradient of the contrastive loss does not increase the supervised training loss on the source domain, which maintains the discriminative power of the learned features; (2) it regularizes the gradient update on the new domain so that it does not increase the classification loss on the old target domains, which enables the model to adapt to an incoming target domain while preserving its performance on previously observed domains.

Contrastive Learning · Domain Adaptation
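
As a loose illustration of the two gradient constraints described in the abstract (not the authors' implementation), the sketch below projects a candidate gradient so that it no longer conflicts with a reference gradient, i.e. so that the update does not increase the reference loss to first order.

```python
# Minimal sketch of the gradient-constraint idea, stated as an assumption:
# a candidate update direction g is projected so that it does not conflict
# with a reference gradient g_ref computed from a loss we want to preserve
# (the source supervised loss, or the losses on old target domains).
import torch

def project_gradient(g, g_ref):
    """If g points against g_ref (dot product < 0), remove the conflicting
    component so stepping along g no longer increases the reference loss
    to first order; otherwise return g unchanged."""
    dot = torch.dot(g, g_ref)
    if dot < 0:
        g = g - dot / (g_ref.norm() ** 2 + 1e-12) * g_ref
    return g

# Toy usage with flattened parameter gradients.
g_contrastive = torch.tensor([1.0, -2.0, 0.5])  # gradient of contrastive loss
g_source = torch.tensor([0.5, 1.0, 0.0])        # gradient of source supervised loss
safe_update = project_gradient(g_contrastive, g_source)
print(torch.dot(safe_update, g_source))         # ~0: no longer conflicts
```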

Adversarial Learning for Supervised and Semi-supervised Relation Extraction in Biomedical Literature

no code implementations • 8 May 2020 • Peng Su, K. Vijay-Shanker

Adversarial training is a technique for improving model performance by involving adversarial examples in the training process.

Relation Extraction
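
The abstract only states the general idea of adversarial training; the sketch below shows one common generic variant, an FGSM-style perturbation of the inputs added to the training objective. The toy model, dimensions, and epsilon are assumptions, not the paper's formulation.

```python
# Generic sketch of adversarial training on input embeddings (in the spirit
# of FGSM-style perturbations); the paper's exact scheme is not reproduced.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
loss_fn = nn.CrossEntropyLoss()

def adversarial_step(x, y, epsilon=0.01):
    """One training step on the clean batch plus an adversarial perturbation
    of the inputs along the gradient of the loss."""
    x = x.clone().requires_grad_(True)
    clean_loss = loss_fn(model(x), y)
    grad, = torch.autograd.grad(clean_loss, x, retain_graph=True)
    x_adv = x.detach() + epsilon * grad.sign()   # worst-case direction
    adv_loss = loss_fn(model(x_adv), y)
    return clean_loss + adv_loss                 # combined objective

x = torch.randn(8, 16)          # stand-in for sentence embeddings
y = torch.randint(0, 2, (8,))   # relation labels (binary, for illustration)
total = adversarial_step(x, y)
total.backward()
```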

Adapting Object Detectors with Conditional Domain Normalization

no code implementations • ECCV 2020 • Peng Su, Kun Wang, Xingyu Zeng, Shixiang Tang, Dapeng Chen, Di Qiu, Xiaogang Wang

This domain vector is then used to encode the features from another domain through conditional normalization, resulting in different domains' features carrying the same domain attribute.

3D Object Detection · Unsupervised Domain Adaptation
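
As a rough sketch of the conditional-normalization idea in the abstract (not the paper's exact module), the code below normalizes features and then re-scales and shifts them with parameters predicted from a domain vector, FiLM/AdaIN-style; the layer sizes are illustrative assumptions.

```python
# Hypothetical conditional normalization: inject a domain attribute into
# features by predicting scale/shift from a domain embedding vector.
import torch
import torch.nn as nn

class ConditionalDomainNorm(nn.Module):
    def __init__(self, num_channels, domain_dim):
        super().__init__()
        self.norm = nn.InstanceNorm2d(num_channels, affine=False)
        self.to_gamma = nn.Linear(domain_dim, num_channels)
        self.to_beta = nn.Linear(domain_dim, num_channels)

    def forward(self, feats, domain_vec):
        """feats: (N, C, H, W) features; domain_vec: (N, D) domain embedding."""
        gamma = self.to_gamma(domain_vec).unsqueeze(-1).unsqueeze(-1)
        beta = self.to_beta(domain_vec).unsqueeze(-1).unsqueeze(-1)
        return self.norm(feats) * (1 + gamma) + beta

cdn = ConditionalDomainNorm(num_channels=64, domain_dim=16)
feats = torch.randn(2, 64, 32, 32)   # features from one domain
domain_vec = torch.randn(2, 16)      # vector encoding another domain's attribute
out = cdn(feats, domain_vec)         # features now carry the injected attribute
```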

Long-term Blood Pressure Prediction with Deep Recurrent Neural Networks

1 code implementation • 12 May 2017 • Peng Su, Xiao-Rong Ding, Yuan-Ting Zhang, Jing Liu, Fen Miao, Ni Zhao

Existing methods for arterial blood pressure (BP) estimation directly map the input physiological signals to output BP values without explicitly modeling the underlying temporal dependencies in BP dynamics.

Blood pressure estimation · Electrocardiography (ECG)
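
To make the contrast with direct signal-to-BP mapping concrete, here is a minimal, hypothetical recurrent estimator that models the temporal dependencies explicitly; the feature dimensions, layer sizes, and two-output head (systolic/diastolic) are assumptions, not the paper's architecture.

```python
# Illustrative sketch only: a small recurrent model mapping a sequence of
# physiological-signal features to blood-pressure values at each time step.
import torch
import torch.nn as nn

class RecurrentBPEstimator(nn.Module):
    def __init__(self, feature_dim=8, hidden_dim=64, num_layers=2):
        super().__init__()
        self.rnn = nn.LSTM(feature_dim, hidden_dim, num_layers, batch_first=True)
        self.head = nn.Linear(hidden_dim, 2)   # systolic and diastolic BP

    def forward(self, x):
        """x: (batch, time, feature_dim) features extracted from ECG/PPG."""
        hidden_states, _ = self.rnn(x)
        return self.head(hidden_states)        # (batch, time, 2) BP estimates

model = RecurrentBPEstimator()
signals = torch.randn(4, 50, 8)                # 4 recordings, 50 time steps
bp_pred = model(signals)
loss = nn.functional.mse_loss(bp_pred, torch.randn(4, 50, 2))
loss.backward()
```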
