Search Results for author: Ruolin Su

Found 7 papers, 4 papers with code

A Label-Aware BERT Attention Network for Zero-Shot Multi-Intent Detection in Spoken Language Understanding

1 code implementation EMNLP 2021 Ting-Wei Wu, Ruolin Su, Biing Juang

We show that it successfully extends to the few-/zero-shot setting, where some intent labels are unseen in the training data, by also taking into account the semantics of these unseen intent labels.

Intent Detection Spoken Language Understanding

Schema Graph-Guided Prompt for Multi-Domain Dialogue State Tracking

no code implementations 10 Nov 2023 Ruolin Su, Ting-Wei Wu, Biing-Hwang Juang

Tracking dialogue states is an essential task in task-oriented dialogue systems, and involves filling in the necessary information in pre-defined slots corresponding to a schema.

Dialogue State Tracking Language Modelling +4
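The schema-slot formulation above can be sketched as a minimal toy example (the schema, function name, and slot values here are hypothetical illustrations, not the paper's implementation):

```python
# Toy dialogue-state update under a pre-defined schema (illustrative only).
schema = {"restaurant": ["food", "area", "pricerange"]}  # hypothetical schema

def update_state(state, domain, slot, value):
    """Fill a slot only if the schema defines it for that domain."""
    if slot in schema.get(domain, []):
        state.setdefault(domain, {})[slot] = value
    return state

state = {}
update_state(state, "restaurant", "food", "italian")
update_state(state, "restaurant", "area", "centre")
update_state(state, "restaurant", "smoking", "no")  # ignored: not in the schema
```

The schema acts as a filter: only slots it defines for a domain can enter the dialogue state, which is what makes the state representation portable across domains that share the schema format.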

CLICKER: Attention-Based Cross-Lingual Commonsense Knowledge Transfer

no code implementations 26 Feb 2023 Ruolin Su, Zhongkai Sun, Sixing Lu, Chengyuan Ma, Chenlei Guo

Recent advances in cross-lingual commonsense reasoning (CSR) are facilitated by the development of multilingual pre-trained models (mPTMs).

Question Answering Transfer Learning

Choice Fusion as Knowledge for Zero-Shot Dialogue State Tracking

1 code implementation 25 Feb 2023 Ruolin Su, Jingfeng Yang, Ting-Wei Wu, Biing-Hwang Juang

With the growing demand for deploying dialogue systems in new domains at lower cost, zero-shot dialogue state tracking (DST), which tracks users' requirements in task-oriented dialogues without training on the target domains, is drawing increasing attention.

Dialogue State Tracking Language Modelling +2

Act-Aware Slot-Value Predicting in Multi-Domain Dialogue State Tracking

1 code implementation 4 Aug 2022 Ruolin Su, Ting-Wei Wu, Biing-Hwang Juang

As an essential component in task-oriented dialogue systems, dialogue state tracking (DST) aims to track human-machine interactions and generate state representations for managing the dialogue.

Dialogue State Tracking Machine Reading Comprehension +2

Why patient data cannot be easily forgotten?

no code implementations 29 Jun 2022 Ruolin Su, Xiao Liu, Sotirios A. Tsaftaris

With the advent of AI models learned from data, one can imagine that such rights extend to requests for forgetting knowledge of patients' data within AI models.

A Context-Aware Hierarchical BERT Fusion Network for Multi-turn Dialog Act Detection

1 code implementation 3 Sep 2021 Ting-Wei Wu, Ruolin Su, Biing-Hwang Juang

The success of interactive dialog systems is usually associated with the quality of the spoken language understanding (SLU) task, which mainly identifies the corresponding dialog acts and slot values in each turn.

slot-filling Slot Filling +1
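The per-turn SLU output described above (a dialog act plus slot values) can be mocked with a naive keyword matcher; the act and slot names here are hypothetical, and real systems such as the paper's BERT fusion network learn this mapping rather than hard-coding it:

```python
# Naive rule-based SLU for a single dialogue turn (illustrative only).
def naive_slu(utterance):
    slots = {}
    # Hypothetical value-to-slot lookup table for the toy example.
    for value, slot in [("centre", "area"), ("italian", "food"), ("cheap", "pricerange")]:
        if value in utterance:
            slots[slot] = value
    act = "inform" if slots else "other"  # hypothetical dialog-act label
    return {"dialog_act": act, "slots": slots}

result = naive_slu("book a cheap italian place in the centre")
```

Each turn thus maps to one dialog act and a set of filled slots, which downstream dialogue management consumes.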
