Search Results for author: Yung-Sung Chuang

Found 17 papers, 10 papers with code

Meta-learning for downstream aware and agnostic pretraining

no code implementations • 6 Jun 2021 • Hongyin Luo, Shuyan Dong, Yung-Sung Chuang, Shang-Wen Li

Neural network pretraining is gaining attention due to its outstanding performance in natural language processing applications.

Meta-Learning

Investigating the Reordering Capability in CTC-based Non-Autoregressive End-to-End Speech Translation

1 code implementation • Findings (ACL) 2021 • Shun-Po Chuang, Yung-Sung Chuang, Chih-Chiang Chang, Hung-Yi Lee

We study the possibility of building a non-autoregressive speech-to-text translation model using connectionist temporal classification (CTC), and use CTC-based automatic speech recognition as an auxiliary task to improve performance.

Automatic Speech Recognition • Speech-to-Text Translation +1
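The abstract above only names the technique; as an illustration, a minimal PyTorch sketch of a multi-task objective combining a CTC translation loss with a CTC-based ASR auxiliary loss might look like the following. The architecture, head names, and the weighting factor `aux_weight` are assumptions for the sketch, not the authors' implementation.

```python
import torch
import torch.nn as nn

class CTCSpeechTranslator(nn.Module):
    """Hypothetical non-autoregressive ST model trained with a CTC
    translation loss plus a CTC ASR auxiliary loss (a sketch only)."""

    def __init__(self, feat_dim=80, hidden=256, src_vocab=500, tgt_vocab=8000):
        super().__init__()
        self.encoder = nn.LSTM(feat_dim, hidden, num_layers=3,
                               batch_first=True, bidirectional=True)
        self.asr_head = nn.Linear(2 * hidden, src_vocab)   # source transcripts
        self.st_head = nn.Linear(2 * hidden, tgt_vocab)    # target translations
        self.ctc = nn.CTCLoss(blank=0, zero_infinity=True)

    def forward(self, feats, feat_lens, src_tokens, src_lens,
                tgt_tokens, tgt_lens, aux_weight=0.3):
        enc, _ = self.encoder(feats)                       # (B, T, 2H)
        # nn.CTCLoss expects (T, B, V) log-probabilities.
        asr_logp = self.asr_head(enc).log_softmax(-1).transpose(0, 1)
        st_logp = self.st_head(enc).log_softmax(-1).transpose(0, 1)
        st_loss = self.ctc(st_logp, tgt_tokens, feat_lens, tgt_lens)
        asr_loss = self.ctc(asr_logp, src_tokens, feat_lens, src_lens)
        return st_loss + aux_weight * asr_loss
```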

Semi-Supervised Spoken Language Understanding via Self-Supervised Speech and Language Model Pretraining

1 code implementation • 26 Oct 2020 • Cheng-I Lai, Yung-Sung Chuang, Hung-Yi Lee, Shang-Wen Li, James Glass

Much recent work on Spoken Language Understanding (SLU) is limited in at least one of three ways: models were trained on oracle text input and neglected ASR errors, models were trained to predict only intents without the slot values, or models were trained on a large amount of in-house data.

Language Modelling • Spoken Language Understanding

What makes multilingual BERT multilingual?

no code implementations • 20 Oct 2020 • Chi-Liang Liu, Tsung-Yuan Hsu, Yung-Sung Chuang, Hung-Yi Lee

Recently, multilingual BERT has been shown to work remarkably well on cross-lingual transfer tasks, outperforming static non-contextualized word embeddings.

Cross-Lingual Transfer • Word Embeddings

Dual Inference for Improving Language Understanding and Generation

1 code implementation • Findings of the Association for Computational Linguistics 2020 • Shang-Yu Su, Yung-Sung Chuang, Yun-Nung Chen

Natural language understanding (NLU) and Natural language generation (NLG) tasks hold a strong dual relationship, where NLU aims at predicting semantic labels based on natural language utterances and NLG does the opposite.

Natural Language Understanding • Text Generation
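As an illustration of the dual relationship described in this entry, one simple (assumed) way to exploit it at inference time is to rescore each NLU hypothesis with an NLG model's likelihood of regenerating the utterance; the sketch below shows that idea. The callables `nlu_logprob` and `nlg_logprob` and the weight `dual_weight` are hypothetical placeholders, not the paper's exact formulation.

```python
import math

def dual_inference(utterance, candidate_labels, nlu_logprob, nlg_logprob,
                   dual_weight=0.5):
    """Pick the semantic label maximizing a combination of the NLU score
    p(label | utterance) and the dual NLG score p(utterance | label).

    `nlu_logprob` and `nlg_logprob` are assumed callables returning
    log-probabilities; this is an illustrative sketch only.
    """
    best_label, best_score = None, -math.inf
    for label in candidate_labels:
        score = (nlu_logprob(label, utterance)
                 + dual_weight * nlg_logprob(utterance, label))
        if score > best_score:
            best_label, best_score = label, score
    return best_label
```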

Lifelong Language Knowledge Distillation

1 code implementation • EMNLP 2020 • Yung-Sung Chuang, Shang-Yu Su, Yun-Nung Chen

It is challenging to perform lifelong language learning (LLL) on a stream of different tasks without any performance degradation compared to the multi-task counterparts.

Knowledge Distillation • Language Modelling +2
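This entry applies knowledge distillation to lifelong language learning; the sketch below shows only the generic soft-target distillation loss that the technique builds on, not the paper's training objective. The temperature and weighting values are arbitrary assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets,
                      temperature=2.0, alpha=0.5):
    """Generic soft-target knowledge distillation loss (a sketch of the
    standard technique, not this paper's exact objective).

    Combines cross-entropy on the hard targets with a KL term pulling the
    student's softened distribution toward the teacher's.
    """
    hard_loss = F.cross_entropy(student_logits, targets)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    soft_loss = F.kl_div(soft_student, soft_teacher,
                         reduction="batchmean") * temperature ** 2
    return alpha * hard_loss + (1.0 - alpha) * soft_loss
```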

A Study of Cross-Lingual Ability and Language-specific Information in Multilingual BERT

no code implementations • 20 Apr 2020 • Chi-Liang Liu, Tsung-Yuan Hsu, Yung-Sung Chuang, Hung-Yi Lee

Recently, multilingual BERT has been shown to work remarkably well on cross-lingual transfer tasks, outperforming static non-contextualized word embeddings.

Cross-Lingual Transfer • Translation +1

SpeechBERT: An Audio-and-text Jointly Learned Language Model for End-to-end Spoken Question Answering

no code implementations • 25 Oct 2019 • Yung-Sung Chuang, Chi-Liang Liu, Hung-Yi Lee, Lin-shan Lee

In addition to its potential for end-to-end SQA, SpeechBERT can also be applied to many other spoken language understanding tasks, just as BERT is used for many text processing tasks.

Question Answering • Speech Recognition +1

Robust Chinese Word Segmentation with Contextualized Word Representations

no code implementations • 17 Jan 2019 • Yung-Sung Chuang

In recent years, since neural-network-based methods were introduced, the accuracy of Chinese word segmentation has improved considerably.

Chinese Word Segmentation • Language Modelling
