Search Results for author: Guangyi Lv

Found 7 papers, 1 paper with code

Description-Enhanced Label Embedding Contrastive Learning for Text Classification

1 code implementation • 15 Jun 2023 • Kun Zhang, Le Wu, Guangyi Lv, Enhong Chen, Shulan Ruan, Jing Liu, Zhiqiang Zhang, Jun Zhou, Meng Wang

Then, we propose a novel Relation of Relation Learning Network (R2-Net) for text classification, in which text classification and R2 classification are treated as optimization targets.
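The snippet describes treating text classification and R2 (relation-of-relation) classification as joint optimization targets. A minimal sketch of such a joint loss, assuming a simple weighted sum of two cross-entropy terms (the mixing weight `lam` is a hypothetical parameter, not taken from the paper):

```python
import numpy as np

def cross_entropy(logits, label):
    # Numerically stable softmax cross-entropy for a single example.
    z = logits - logits.max()
    log_probs = z - np.log(np.exp(z).sum())
    return -log_probs[label]

def joint_loss(cls_logits, cls_label, r2_logits, r2_label, lam=1.0):
    # Optimize both targets jointly: the text-classification loss plus
    # the R2-classification loss, mixed by the hypothetical weight `lam`.
    return cross_entropy(cls_logits, cls_label) + lam * cross_entropy(r2_logits, r2_label)
```

The paper's actual network shares an encoder between the two heads; this sketch only shows how the two objectives combine into one training signal.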

Tasks: Contrastive Learning, Relation (+3)

Skeptical Deep Learning with Distribution Correction

no code implementations • 9 Nov 2018 • Mingxiao An, Yongzhou Chen, Qi Liu, Chuanren Liu, Guangyi Lv, Fangzhao Wu, Jianhui Ma

Recently, deep neural networks have been used successfully for various classification tasks, especially for problems with massive, perfectly labeled training data.

Tasks: Classification, General Classification

Deep Technology Tracing for High-tech Companies

no code implementations • 2 Jan 2020 • Han Wu, Kun Zhang, Guangyi Lv, Qi Liu, Runlong Yu, Weihao Zhao, Enhong Chen, Jianhui Ma

Technological change and innovation are vitally important, especially for high-tech companies.

Tasks: Vocal Bursts Intensity Prediction

R$^2$-Net: Relation of Relation Learning Network for Sentence Semantic Matching

no code implementations • 16 Dec 2020 • Kun Zhang, Le Wu, Guangyi Lv, Meng Wang, Enhong Chen, Shulan Ruan

Sentence semantic matching is one of the fundamental tasks in natural language processing, which requires an agent to determine the semantic relation among input sentences.

Tasks: Relation, Relation Classification (+1)

DGA-Net: Dynamic Gaussian Attention Network for Sentence Semantic Matching

no code implementations • 9 Jun 2021 • Kun Zhang, Guangyi Lv, Meng Wang, Enhong Chen

Then, we develop a Dynamic Gaussian Attention (DGA) to dynamically capture the important parts and corresponding local contexts from a detailed perspective.
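One common way to make attention focus on "important parts and corresponding local contexts" is to bias the attention scores with a Gaussian prior centered on a focus position. A minimal sketch under that assumption (the `center` and `sigma` parameters are illustrative; the paper's exact parameterization of DGA may differ):

```python
import numpy as np

def gaussian_attention(scores, center, sigma):
    # Add a log-Gaussian prior centered at `center` to the raw attention
    # scores, so positions near the focus (the local context) receive
    # most of the attention mass after the softmax.
    positions = np.arange(len(scores))
    prior = -((positions - center) ** 2) / (2.0 * sigma ** 2)
    z = scores + prior
    z -= z.max()          # numerical stability
    w = np.exp(z)
    return w / w.sum()    # normalized attention weights
```

With equal raw scores, the resulting weights peak at the chosen center and decay with distance, which is the "detailed, local" behavior the snippet describes.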

Tasks: Language Modelling, Relation (+2)

LadRa-Net: Locally-Aware Dynamic Re-read Attention Net for Sentence Semantic Matching

no code implementations • 6 Aug 2021 • Kun Zhang, Guangyi Lv, Le Wu, Enhong Chen, Qi Liu, Meng Wang

To overcome this problem and boost the performance of the attention mechanism, we propose a novel dynamic re-read attention, which can pay close attention to one small region of a sentence at each step and re-read the important parts for better sentence representations.
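The re-read idea — attend sharply to one small region per step, then refine the sentence representation and repeat — can be sketched as an iterative attention loop. This is an illustrative simplification, assuming a dot-product scoring function and a low softmax temperature to get the sharp, region-focused attention the snippet describes (neither detail is claimed to match the paper):

```python
import numpy as np

def softmax(x):
    z = x - x.max()
    e = np.exp(z)
    return e / e.sum()

def dynamic_reread(token_vecs, steps=3, temperature=0.1):
    # token_vecs: (num_tokens, dim) token representations.
    # At each step, score tokens against the current summary state,
    # focus sharply on the highest-scoring region (low temperature),
    # and re-read it to update the sentence representation.
    state = token_vecs.mean(axis=0)
    for _ in range(steps):
        scores = token_vecs @ state
        attn = softmax(scores / temperature)   # sharp focus on one region
        state = 0.5 * state + 0.5 * (attn @ token_vecs)
    return state
```

Each iteration moves the summary toward the currently most relevant tokens, so successive steps can "re-read" different important parts of the sentence.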

Tasks: Language Modelling, Natural Language Inference (+2)
