Search Results for author: Shih-Chieh Chang

Found 12 papers, 3 papers with code

Complement Objective Training

1 code implementation ICLR 2019 Hao-Yun Chen, Pei-Hsin Wang, Chun-Hao Liu, Shih-Chieh Chang, Jia-Yu Pan, Yu-Ting Chen, Wei Wei, Da-Cheng Juan

Although widely adopted, using cross entropy as the primary objective exploits mostly the information from the ground-truth class to maximize data likelihood, and largely ignores information from the complement (incorrect) classes.

Natural Language Understanding
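
The complement objective, roughly, flattens the predicted probability mass over the incorrect classes. Below is a minimal numpy sketch of one plausible reading: the complement distribution is taken as the softmax output renormalized over the non-ground-truth classes, and its mean entropy is what training would maximize (the paper's exact formulation and the alternating training schedule may differ in detail).

```python
import numpy as np

def complement_entropy(probs, labels, eps=1e-12):
    """Mean entropy of the predictions renormalized over the complement
    (incorrect) classes; complement objective training maximizes this."""
    n, c = probs.shape
    mask = np.ones_like(probs, dtype=bool)
    mask[np.arange(n), labels] = False
    comp = probs[mask].reshape(n, c - 1)                   # drop ground-truth column
    comp = comp / (comp.sum(axis=1, keepdims=True) + eps)  # renormalize over complement
    ent = -(comp * np.log(comp + eps)).sum(axis=1)         # per-sample entropy
    return ent.mean() / np.log(c - 1)                      # scale to [0, 1]
```

A flat complement distribution (maximum entropy) means the model spreads no committed probability mass over any particular wrong class, which complements the cross-entropy objective's focus on the correct one.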

Improving Adversarial Robustness via Guided Complement Entropy

2 code implementations ICCV 2019 Hao-Yun Chen, Jhao-Hong Liang, Shih-Chieh Chang, Jia-Yu Pan, Yu-Ting Chen, Wei Wei, Da-Cheng Juan

Adversarial robustness has emerged as an important topic in deep learning, as carefully crafted attack samples can significantly degrade a model's performance.

Adversarial Defense Adversarial Robustness

AirConcierge: Generating Task-Oriented Dialogue via Efficient Large-Scale Knowledge Retrieval

1 code implementation Findings of the Association for Computational Linguistics 2020 Chieh-Yang Chen, Pei-Hsin Wang, Shih-Chieh Chang, Da-Cheng Juan, Wei Wei, Jia-Yu Pan

Despite recent success in neural task-oriented dialogue systems, developing a real-world system involves accessing large-scale knowledge bases (KBs), which cannot simply be encoded by neural approaches such as memory network mechanisms.

Retrieval Task-Oriented Dialogue Systems +1
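
The contrast with encoding the KB into network weights can be illustrated with a toy retrieval step: the system issues a query against the KB instead of memorizing it. The schema, table contents, and constraints below are hypothetical, purely for illustration of the retrieval idea, not the paper's actual interface.

```python
import sqlite3

# Hypothetical flights KB; a retrieval-based system queries it on demand
# rather than encoding every row into the neural model.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE flights (origin TEXT, dest TEXT, price INTEGER)")
conn.executemany("INSERT INTO flights VALUES (?, ?, ?)",
                 [("SFO", "JFK", 420), ("SFO", "SEA", 130), ("LAX", "JFK", 380)])

# Constraints assumed to be extracted from the dialogue state (illustrative).
query = "SELECT * FROM flights WHERE origin = ? AND dest = ? AND price < ?"
print(conn.execute(query, ("SFO", "JFK", 500)).fetchall())
```

The KB can grow arbitrarily large without changing the model, since only the query, not the table, passes through the neural component.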

Searching Toward Pareto-Optimal Device-Aware Neural Architectures

no code implementations29 Aug 2018 An-Chieh Cheng, Jin-Dong Dong, Chi-Hung Hsu, Shu-Huan Chang, Min Sun, Shih-Chieh Chang, Jia-Yu Pan, Yu-Ting Chen, Wei Wei, Da-Cheng Juan

Recent breakthroughs in Neural Architecture Search (NAS) have achieved state-of-the-art performance on many tasks such as image classification and language understanding.

Image Classification

Dynamic Early Terminating of Multiply Accumulate Operations for Saving Computation Cost in Convolutional Neural Networks

no code implementations ICLR 2019 Yu-Yi Su, Yung-Chih Chen, Xiang-Xiu Wu, Shih-Chieh Chang

We propose to set a checkpoint in the multiply-accumulate (MAC) process to determine whether a filter's computation can terminate early based on the intermediate result.
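
The checkpoint idea can be sketched as follows. The specific decision rule here (assume a partial sum sufficiently negative at the checkpoint stays negative through the ReLU) and all names are assumptions for illustration, not the paper's exact criterion.

```python
def mac_with_checkpoint(weights, inputs, checkpoint, threshold=0.0):
    """Multiply-accumulate with one early-exit check (illustrative only).

    If the partial sum at index `checkpoint` is below `threshold`, we
    assume the final pre-activation would be clipped by ReLU anyway and
    stop, saving the remaining multiplications.
    """
    acc = 0.0
    for i, (w, x) in enumerate(zip(weights, inputs)):
        acc += w * x
        if i == checkpoint and acc < threshold:
            return 0.0          # early terminate: treat output as ReLU of a negative sum
    return max(acc, 0.0)        # ReLU on the completed sum
```

The saving comes from skipping the multiplications after the checkpoint whenever the early exit fires, at the cost of occasionally zeroing an output that would have recovered.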

Learning with Hierarchical Complement Objective

no code implementations17 Nov 2019 Hao-Yun Chen, Li-Huang Tsai, Shih-Chieh Chang, Jia-Yu Pan, Yu-Ting Chen, Wei Wei, Da-Cheng Juan

Label hierarchies widely exist in many vision-related problems, ranging from the explicit label hierarchies found in image classification to the latent label hierarchies found in semantic segmentation.

General Classification Image Classification +2

Robust Processing-In-Memory Neural Networks via Noise-Aware Normalization

no code implementations7 Jul 2020 Li-Huang Tsai, Shih-Chieh Chang, Yu-Ting Chen, Jia-Yu Pan, Wei Wei, Da-Cheng Juan

In this paper, we propose a noise-agnostic method to achieve robust neural network performance under arbitrary noise settings.

Object Detection +1

Remix: Rebalanced Mixup

no code implementations8 Jul 2020 Hsin-Ping Chou, Shih-Chieh Chang, Jia-Yu Pan, Wei Wei, Da-Cheng Juan

In this work, we propose a new regularization technique, Remix, that relaxes Mixup's formulation and enables the mixing factors of features and labels to be disentangled.
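
In plain Mixup a single factor mixes both features and labels; the disentangling means the two factors can differ. A minimal sketch of that decoupling follows, with the paper's rebalancing rule for choosing the label factor omitted (here it is simply passed in as an argument).

```python
import numpy as np

def remix_pair(x1, y1, x2, y2, lam_x, lam_y):
    """Mix features and (one-hot) labels with *different* factors.
    Plain Mixup is the special case lam_y == lam_x."""
    x = lam_x * x1 + (1.0 - lam_x) * x2
    y = lam_y * y1 + (1.0 - lam_y) * y2
    return x, y
```

Decoupling the two factors is what lets a rebalancing scheme push the label assignment toward the minority class while leaving the feature interpolation unchanged.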

Context-Aware Temperature for Language Modeling

no code implementations1 Jan 2021 Pei-Hsin Wang, Sheng-Iou Hsieh, Shih-Chieh Chang, Yu-Ting Chen, Da-Cheng Juan, Jia-Yu Pan, Wei Wei

Current practices for applying temperature scaling assume either a fixed schedule or a manually crafted, dynamically changing one.

Language Modelling

Contextual Temperature for Language Modeling

no code implementations 25 Dec 2020 Pei-Hsin Wang, Sheng-Iou Hsieh, Shih-Chieh Chang, Yu-Ting Chen, Jia-Yu Pan, Wei Wei, Da-Cheng Juan

Temperature scaling has been widely used as an effective approach to control the smoothness of a distribution, which helps the model performance in various tasks.

Language Modelling
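
For context, both temperature papers above build on the same basic operation: dividing the logits by a scalar temperature before the softmax, which the papers replace a fixed or hand-crafted temperature with one that depends on context. The standard operation itself is sketched below.

```python
import numpy as np

def softmax_with_temperature(logits, temperature=1.0):
    """T > 1 flattens the output distribution; T < 1 sharpens it."""
    z = logits / temperature
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()
```

Because the temperature rescales every logit uniformly, it changes only the smoothness of the distribution, not the ranking of the classes.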
