no code implementations • 13 Aug 2024 • Haohan Yuan, Siu Cheung Hui, Haopeng Zhang
In this paper, we propose an event structure-aware generative model named GenBEE, which can capture complex event structures in biomedical text for biomedical event extraction.
no code implementations • 16 Mar 2023 • Haimiao Mo, Shuai Ding, Siu Cheung Hui
Multimodal data can provide more objective evidence for anxiety screening to improve the accuracy of models.
no code implementations • 16 Feb 2023 • Jing Xu, Dandan Song, Chong Liu, Siu Cheung Hui, Fei Li, Qiang Ju, Xiaonan He, Jian Xie
In this paper, we propose a Dialogue State Distillation Network (DSDN) to utilize relevant information from previous dialogue states and bridge the utilization gap between training and testing.
no code implementations • 1 Nov 2022 • Anran Hao, Siu Cheung Hui, Jian Su
Event Detection, which aims to identify and classify mentions of event instances from unstructured articles, is an important task in Natural Language Processing (NLP).
no code implementations • 15 Jan 2022 • Tan Khang Le, Siu Cheung Hui
In this paper, we use different deep learning approaches to address the problems of sentiment analysis, automatic review tag generation, and retrieval of food reviews.
no code implementations • ACL 2021 • Fei Li, Zheng Wang, Siu Cheung Hui, Lejian Liao, Dandan Song, Jing Xu, Guoxiu He, Meihuizi Jia
Although the existing Named Entity Recognition (NER) models have achieved promising performance, they suffer from certain drawbacks.
3 code implementations • 17 Feb 2021 • Aston Zhang, Yi Tay, Shuai Zhang, Alvin Chan, Anh Tuan Luu, Siu Cheung Hui, Jie Fu
Recent works have demonstrated reasonable success of representation learning in hypercomplex space.
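The core operation behind quaternion (4-dimensional hypercomplex) layers is the Hamilton product, which shares components across the four output coordinates instead of using an unconstrained real-valued matrix. A minimal NumPy sketch of that product (illustrative only, not the paper's parameterized hypercomplex layer):

```python
import numpy as np

def hamilton_product(p, q):
    """Hamilton product of two quaternions given as (r, x, y, z).

    Quaternion neural layers reuse this component-sharing structure,
    which is what allows them to cut parameters relative to a full
    real-valued matrix multiply.
    """
    r1, x1, y1, z1 = p
    r2, x2, y2, z2 = q
    return np.array([
        r1 * r2 - x1 * x2 - y1 * y2 - z1 * z2,
        r1 * x2 + x1 * r2 + y1 * z2 - z1 * y2,
        r1 * y2 - x1 * z2 + y1 * r2 + z1 * x2,
        r1 * z2 + x1 * y2 - y1 * x2 + z1 * r2,
    ])
```

For example, multiplying the unit quaternions i and j yields k, matching the usual quaternion algebra.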
no code implementations • NeurIPS 2019 • Yi Tay, Anh Tuan Luu, Aston Zhang, Shuohang Wang, Siu Cheung Hui
Attentional models are distinctly characterized by their ability to learn relative importance, i.e., assigning a different weight to each input value.
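A minimal sketch of this weighting, using standard scaled dot-product scores (an illustrative textbook choice, not necessarily the formulation studied in the paper):

```python
import numpy as np

def attention_weights(query, keys):
    """Softmax over scaled dot products: the 'relative importance'
    an attentional model assigns to each input value."""
    d = query.shape[-1]
    scores = keys @ query / np.sqrt(d)  # one score per key
    scores -= scores.max()              # shift for numerical stability
    w = np.exp(scores)
    return w / w.sum()                  # normalized weights, sum to 1
```

Inputs that align more closely with the query receive larger weights, which is exactly the relative-importance behavior described above.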
no code implementations • 25 Sep 2019 • Yi Tay, Aston Zhang, Shuai Zhang, Alvin Chan, Luu Anh Tuan, Siu Cheung Hui
We propose R2D2 layers, a new neural block for training efficient NLP models.
1 code implementation • AAAI 2019 • Yi Tay, Shuai Zhang, Anh Tuan Luu, Siu Cheung Hui, Lina Yao, Tran Dang Quang Vinh
Factorization Machines (FMs) are a class of popular algorithms that have been widely adopted for collaborative filtering and recommendation tasks.
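The second-order FM score underlying these models has a well-known closed form that avoids the explicit pairwise sum. A minimal NumPy sketch of standard FM scoring (parameter names `w0`, `w`, `V` are illustrative; this is the classic FM, not the paper's variant):

```python
import numpy as np

def fm_predict(x, w0, w, V):
    """Second-order Factorization Machine score.

    Uses the O(k*n) identity:
      sum_{i<j} <v_i, v_j> x_i x_j
        = 0.5 * sum_f [ (sum_i V[i,f] x_i)^2 - sum_i V[i,f]^2 x_i^2 ]
    """
    linear = w0 + w @ x
    s = V.T @ x                    # (k,): per-factor weighted sums
    s2 = (V ** 2).T @ (x ** 2)     # (k,): per-factor squared sums
    pairwise = 0.5 * np.sum(s ** 2 - s2)
    return linear + pairwise
```

The closed form matches the brute-force sum over all feature pairs while scaling linearly in the number of features.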
no code implementations • 28 Jun 2019 • Ning Wang, Xiaokui Xiao, Yin Yang, Jun Zhao, Siu Cheung Hui, Hyejin Shin, Junbum Shin, Ge Yu
Motivated by this, we first propose novel LDP mechanisms for collecting a numeric attribute; their accuracy is no worse (and usually better) than that of existing solutions in terms of worst-case noise variance.
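For context, a classical one-bit baseline for releasing a single value in [-1, 1] under ε-local differential privacy (in the style of Duchi et al.) can be sketched as below. This is a reference point the paper improves on in worst-case variance, not the mechanism the paper proposes:

```python
import math
import random

def duchi_perturb(t, eps, rng=random):
    """One-bit epsilon-LDP perturbation of t in [-1, 1].

    Reports one of two values +/- c, chosen with a t-dependent
    probability so that the output is an unbiased estimate of t.
    """
    assert -1.0 <= t <= 1.0
    e = math.exp(eps)
    c = (e + 1) / (e - 1)                       # output magnitude
    p = 0.5 + t * (e - 1) / (2 * (e + 1))       # P(report +c)
    return c if rng.random() < p else -c
```

Averaging many perturbed reports recovers the true mean, at the cost of per-report noise that grows as ε shrinks.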
1 code implementation • ACL 2019 • Yi Tay, Aston Zhang, Luu Anh Tuan, Jinfeng Rao, Shuai Zhang, Shuohang Wang, Jie Fu, Siu Cheung Hui
Many state-of-the-art neural models for NLP are heavily parameterized and thus memory inefficient.
no code implementations • ACL 2019 • Yi Tay, Shuohang Wang, Luu Anh Tuan, Jie Fu, Minh C. Phan, Xingdi Yuan, Jinfeng Rao, Siu Cheung Hui, Aston Zhang
This paper tackles the problem of reading comprehension over long narratives, where documents easily span thousands of tokens.
1 code implementation • NeurIPS 2018 • Yi Tay, Luu Anh Tuan, Siu Cheung Hui
Recurrent neural networks (RNNs) such as long short-term memory and gated recurrent units are pivotal building blocks across a broad spectrum of sequence modeling problems.
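For reference, the gated recurrent unit mentioned above can be sketched in a few lines of NumPy; weight shapes are illustrative and biases are omitted for brevity:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, params):
    """One step of a gated recurrent unit.

    params holds six weight matrices (Wz, Uz, Wr, Ur, Wh, Uh):
    update gate z decides how much of the state to overwrite,
    reset gate r decides how much history feeds the candidate.
    """
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(Wz @ x + Uz @ h)                 # update gate
    r = sigmoid(Wr @ x + Ur @ h)                 # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))     # candidate state
    return (1 - z) * h + z * h_tilde             # interpolated new state
```

Because the candidate passes through tanh, the state stays bounded, one of the properties that makes such gated units stable building blocks for long sequences.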
2 code implementations • NeurIPS 2018 • Yi Tay, Luu Anh Tuan, Siu Cheung Hui, Jian Su
Secondly, the dense connectors in our network are learned via attention instead of standard residual skip-connectors.
Ranked #2 on Question Answering on NewsQA
no code implementations • EMNLP 2018 • Yi Tay, Luu Anh Tuan, Siu Cheung Hui
This task enables many potential applications such as question answering and paraphrase identification.
no code implementations • EMNLP 2018 • Yi Tay, Anh Tuan Luu, Siu Cheung Hui
Sequence encoders are crucial components in many neural architectures for learning to read and comprehend.
Ranked #7 on Question Answering on NarrativeQA
no code implementations • EMNLP 2018 • Yi Tay, Anh Tuan Luu, Siu Cheung Hui, Jian Su
This paper proposes a new neural architecture that exploits readily available sentiment lexicon resources.
no code implementations • 17 Jun 2018 • Yi Tay, Shuai Zhang, Luu Anh Tuan, Siu Cheung Hui
This paper has been withdrawn as we discovered a bug in our tensorflow implementation that involved accidental mixing of vectors across batches.
no code implementations • 3 Jun 2018 • Yi Tay, Luu Anh Tuan, Siu Cheung Hui
Attention is typically used to select informative sub-phrases that are used for prediction.
no code implementations • 29 May 2018 • Yi Tay, Anh Tuan Luu, Siu Cheung Hui
Our approach, CoupleNet, is an end-to-end deep learning-based estimator that analyzes the social profiles of two users and then performs a similarity match between them.
no code implementations • ACL 2018 • Yi Tay, Luu Anh Tuan, Siu Cheung Hui, Jian Su
Sarcasm is a sophisticated speech act which commonly manifests on social communities such as Twitter and Reddit.
no code implementations • 24 Mar 2018 • Yi Tay, Luu Anh Tuan, Siu Cheung Hui
Similarly, we achieve competitive performance relative to AMANDA on the SearchQA benchmark and BiDAF on the NarrativeQA benchmark without using any LSTM/GRU layers.
Ranked #5 on Question Answering on RACE
2 code implementations • 28 Jan 2018 • Yi Tay, Luu Anh Tuan, Siu Cheung Hui
Our model operates on a multi-hierarchical paradigm and is based on the intuition that not all reviews are created equal, i.e., only a select few are important.
no code implementations • EMNLP 2018 • Yi Tay, Luu Anh Tuan, Siu Cheung Hui
Firstly, we introduce a new architecture where alignment pairs are compared, compressed and then propagated to upper layers for enhanced representation learning.
Ranked #7 on Natural Language Inference on SciTail
no code implementations • 14 Dec 2017 • Yi Tay, Anh Tuan Luu, Siu Cheung Hui
Our novel model, Aspect Fusion LSTM (AF-LSTM), learns to attend based on associative relationships between sentence words and the aspect, which allows it to adaptively focus on the correct words given an aspect term.
Aspect-Based Sentiment Analysis (ABSA)
1 code implementation • 21 Nov 2017 • Yi Tay, Luu Anh Tuan, Siu Cheung Hui
This paper explores the idea of learning temporal gates for sequence pairs (question and answer), jointly influencing the learned representations in a pairwise manner.
1 code implementation • 14 Nov 2017 • Yi Tay, Minh C. Phan, Luu Anh Tuan, Siu Cheung Hui
We propose a new SkipFlow mechanism that models relationships between snapshots of the hidden representations of a long short-term memory (LSTM) network as it reads.
Ranked #5 on Automated Essay Scoring on ASAP-AES
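As a rough illustration of the snapshot idea, one can compare hidden states taken a fixed number of steps apart. This is a hypothetical simplification using cosine similarity; the paper itself learns the relation between snapshots with a neural tensor layer:

```python
import numpy as np

def snapshot_similarities(hidden_states, delta):
    """Cosine similarity between LSTM hidden-state 'snapshots'
    taken delta steps apart (simplified stand-in for SkipFlow's
    learned neural tensor comparison)."""
    sims = []
    for i in range(0, len(hidden_states) - delta, delta):
        a, b = hidden_states[i], hidden_states[i + delta]
        denom = np.linalg.norm(a) * np.linalg.norm(b) + 1e-8
        sims.append(float(a @ b / denom))
    return sims
```

The resulting similarity features capture how the network's reading state evolves over a long essay, which is the intuition SkipFlow exploits.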
no code implementations • 24 Aug 2017 • Thông T. Nguyên, Siu Cheung Hui
However, for sensitive survival data such as medical records, there are serious concerns about the privacy of the individuals in the data set when the data is used to fit regression models.
no code implementations • 16 Aug 2017 • Yi Tay, Luu Anh Tuan, Minh C. Phan, Siu Cheung Hui
Unfortunately, many state-of-the-art relational learning models ignore this information because of the challenge of handling non-discrete data types in knowledge graphs, which are inherently binary in nature.
1 code implementation • 25 Jul 2017 • Yi Tay, Luu Anh Tuan, Siu Cheung Hui
The dominant neural architectures in question answer retrieval are based on recurrent or convolutional encoders configured with complex word matching layers.
Ranked #1 on Question Answering on SemEvalCQA
1 code implementation • 17 Jul 2017 • Yi Tay, Anh Tuan Luu, Siu Cheung Hui
Our model, LRML (Latent Relational Metric Learning), is a novel metric learning approach for recommendation.
Ranked #1 on Recommendation Systems on Netflix (nDCG@10 metric)
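The translation-style metric score at the heart of LRML can be sketched as below. Here the relation vector `r` is passed in directly for illustration, whereas the paper induces it via attention over a learned memory module:

```python
import numpy as np

def lrml_score(p, q, r):
    """Relational metric score in the spirit of LRML: a user
    embedding p and item embedding q match well when the latent
    relation vector r translates p close to q in the metric space."""
    return -np.sum((p + r - q) ** 2)
```

The score is maximal (zero) exactly when q = p + r, so ranking items by this score favors those the latent relation points toward.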
no code implementations • 16 Jun 2016 • Thông T. Nguyên, Xiaokui Xiao, Yin Yang, Siu Cheung Hui, Hyejin Shin, Junbum Shin
Organizations with a large user base, such as Samsung and Google, can potentially benefit from collecting and mining users' data.
Databases
no code implementations • TACL 2016 • Luu Anh Tuan, Siu Cheung Hui, See Kiong Ng
Taxonomies play an important role in many applications by organizing domain knowledge into a hierarchy of 'is-a' relations between terms.