Search Results for author: Haiqin Yang

Found 22 papers, 1 paper with code

Vision-and-Language Pretrained Models: A Survey

no code implementations · 15 Apr 2022 · Siqu Long, Feiqi Cao, Soyeon Caren Han, Haiqin Yang

Pretrained models have produced great success in both Computer Vision (CV) and Natural Language Processing (NLP).

Computer Vision · Natural Language Processing

RefBERT: Compressing BERT by Referencing to Pre-computed Representations

no code implementations · 11 Jun 2021 · Xinyi Wang, Haiqin Yang, Liang Zhao, Yang Mo, Jianping Shen

In contrast, in this paper we propose RefBERT to leverage the knowledge learned from the teacher, i.e., exploiting the pre-computed BERT representation of a reference sample while compressing BERT into a smaller student model.

Knowledge Distillation · Natural Language Processing
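RefBERT's exact objective is not reproduced here; as a minimal sketch of the teacher-student compression idea it builds on, a standard knowledge-distillation loss (Hinton-style, with a hypothetical temperature `T`) matches the student's softened output distribution to the teacher's:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a higher T softens the distribution.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 as in classic knowledge distillation.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q)) * T * T

# When the student exactly matches the teacher, the loss is zero.
loss = distillation_loss([1.0, 2.0], [1.0, 2.0])
print(loss)  # 0.0
```

In the actual system the logits would come from the BERT teacher and the smaller student model; the vectors above are toy stand-ins.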

Progressive Open-Domain Response Generation with Multiple Controllable Attributes

no code implementations · 7 Jun 2021 · Haiqin Yang, Xiaoyuan Yao, Yiqun Duan, Jianping Shen, Jie Zhong, Kun Zhang

More specifically, PHED deploys a Conditional Variational AutoEncoder (CVAE) on top of a Transformer to include one aspect of the attributes at each stage.

Response Generation
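PHED's full architecture is specific to the paper; the CVAE sampling step it relies on can be sketched generically with the reparameterization trick, where `mu` and `log_var` stand in for hypothetical encoder outputs:

```python
import math
import random

def reparameterize(mu, log_var, rng=random):
    # z = mu + sigma * eps with eps ~ N(0, 1): the CVAE sampling step
    # that keeps the draw differentiable w.r.t. mu and log_var.
    return [m + math.exp(0.5 * lv) * rng.gauss(0.0, 1.0)
            for m, lv in zip(mu, log_var)]

random.seed(0)
z = reparameterize([0.0, 0.0], [0.0, 0.0])  # unit-variance draw around 0
print(len(z))  # 2
```

As `log_var` goes to negative infinity the variance vanishes and the sample collapses to `mu`, which is how the latent code is made deterministic at one extreme.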

Sattiy at SemEval-2021 Task 9: An Ensemble Solution for Statement Verification and Evidence Finding with Tables

no code implementations · SemEval 2021 · Xiaoyi Ruan, Meizhi Jin, Jian Ma, Haiqin Yang, Lianxin Jiang, Yang Mo, Mengyuan Zhou

Question answering from semi-structured tables can be seen as a semantic parsing task and is significant and practical for pushing the boundary of natural language understanding.

Natural Language Understanding · Question Answering +1

PALI at SemEval-2021 Task 2: Fine-Tune XLM-RoBERTa for Word in Context Disambiguation

no code implementations · SemEval 2021 · Shuyi Xie, Jian Ma, Haiqin Yang, Lianxin Jiang, Yang Mo, Jianping Shen

Second, we construct a new vector from the fine-tuned XLM-RoBERTa embeddings and feed it to a fully-connected network that outputs the probability that the target word carries the same meaning in both contexts.

Data Augmentation · TAG
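As a toy illustration of the described head (paired contextual embeddings, one fully-connected layer, a same-meaning probability), the sketch below uses made-up 2-d vectors and weights in place of fine-tuned XLM-RoBERTa embeddings:

```python
import math

def fc_probability(vec_a, vec_b, weights, bias):
    # Minimal fully-connected head: concatenate the two contextual
    # embeddings of the target word, apply one linear layer + sigmoid.
    x = vec_a + vec_b                        # concatenation
    z = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1.0 / (1.0 + math.exp(-z))        # P("same meaning")

# Hypothetical 2-d embeddings; real inputs would be XLM-RoBERTa vectors.
p = fc_probability([0.3, -0.1], [0.2, 0.0],
                   weights=[1.0, 1.0, 1.0, 1.0], bias=0.0)
print(round(p, 3))
```

In practice the weights are learned end-to-end during fine-tuning; here they are fixed only so the sketch is self-contained.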

Emotion Dynamics Modeling via BERT

no code implementations · 15 Apr 2021 · Haiqin Yang, Jianping Shen

Emotion dynamics modeling is a significant task in emotion recognition in conversation.

Emotion Recognition in Conversation · Representation Learning

Automatic Intent-Slot Induction for Dialogue Systems

no code implementations · 16 Mar 2021 · Zengfeng Zeng, Dan Ma, Haiqin Yang, Zhen Gou, Jianping Shen

Automatically and accurately identifying user intents and filling the associated slots from their spoken language are critical to the success of dialogue systems.

Intent Detection · Slot Filling

Making Online Sketching Hashing Even Faster

no code implementations · 10 Oct 2020 · Xixian Chen, Haiqin Yang, Shenglin Zhao, Michael R. Lyu, Irwin King

Data-dependent hashing methods have demonstrated good performance in various machine learning applications by learning a low-dimensional representation from the original data.

Block-term Tensor Neural Networks

no code implementations · 10 Oct 2020 · Jinmian Ye, Guangxi Li, Di Chen, Haiqin Yang, Shandian Zhe, Zenglin Xu

Deep neural networks (DNNs) have achieved outstanding performance in a wide range of applications, e.g., image classification, natural language processing, etc.

Image Classification · Natural Language Processing

Effective Data-aware Covariance Estimator from Compressed Data

no code implementations · 10 Oct 2020 · Xixian Chen, Haiqin Yang, Shenglin Zhao, Michael R. Lyu, Irwin King

Estimating covariance matrix from massive high-dimensional and distributed data is significant for various real-world applications.

Hierarchical Context Enhanced Multi-Domain Dialogue System for Multi-domain Task Completion

no code implementations · 3 Mar 2020 · Jingyuan Yang, Guang Liu, Yuzhao Mao, Zhiwei Zhao, Weiguo Gao, Xuan Li, Haiqin Yang, Jianping Shen

Task 1 of the DSTC8-track1 challenge aims to develop an end-to-end multi-domain dialogue system that accomplishes users' complex goals in a tourist information desk setting.

BERT Meets Chinese Word Segmentation

no code implementations · 20 Sep 2019 · Haiqin Yang

Chinese word segmentation (CWS) is a fundamental task for Chinese language understanding.

Chinese Word Segmentation

HiGRU: Hierarchical Gated Recurrent Units for Utterance-level Emotion Recognition

1 code implementation · NAACL 2019 · Wenxiang Jiao, Haiqin Yang, Irwin King, Michael R. Lyu

In this paper, we address three challenges in utterance-level emotion recognition in dialogue systems: (1) the same word can deliver different emotions in different contexts; (2) some emotions are rarely seen in general dialogues; (3) long-range contextual information is hard to capture effectively.

Emotion Recognition

BT-Nets: Simplifying Deep Neural Networks via Block Term Decomposition

no code implementations · 15 Dec 2017 · Guangxi Li, Jinmian Ye, Haiqin Yang, Di Chen, Shuicheng Yan, Zenglin Xu

Recently, deep neural networks (DNNs) have been regarded as the state-of-the-art classification methods in a wide range of applications, especially in image classification.

General Classification · Image Classification

A deep learning approach for predicting the quality of online health expert question-answering services

no code implementations · 21 Dec 2016 · Ze Hu, Zhan Zhang, Qing Chen, Haiqin Yang, Decheng Zuo

Finally, a deep belief network (DBN)-based HQA answer quality prediction framework is proposed to predict the quality of answers by learning the high-level hidden semantic representation from the physicians' answers.

Question Answering

Efficient Non-oblivious Randomized Reduction for Risk Minimization with Improved Excess Risk Guarantee

no code implementations · 6 Dec 2016 · Yi Xu, Haiqin Yang, Lijun Zhang, Tianbao Yang

Previously, oblivious random-projection approaches, which project high-dimensional features onto a random subspace, have been used in practice to tackle the high-dimensionality challenge in machine learning.
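The oblivious baseline referred to here can be sketched as a Gaussian random projection in the Johnson-Lindenstrauss style; the paper's non-oblivious, data-dependent reduction is not shown:

```python
import random

def random_projection(x, k, rng):
    # Project a d-dimensional feature vector onto a random k-dimensional
    # subspace via a Gaussian matrix scaled by 1/sqrt(k), which
    # approximately preserves inner products (Johnson-Lindenstrauss).
    scale = k ** -0.5
    return [scale * sum(rng.gauss(0.0, 1.0) * xi for xi in x)
            for _ in range(k)]

random.seed(1)
y = random_projection([1.0, 0.0, 2.0, -1.0, 0.5], k=2, rng=random)
print(len(y))  # 2
```

The projection matrix here is data-oblivious (drawn independently of `x`), which is exactly the property the paper's non-oblivious reduction is designed to improve on.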
