Search Results for author: Chia-Chien Hung

Found 9 papers, 8 papers with code

DS-TOD: Efficient Domain Specialization for Task-Oriented Dialog

1 code implementation • Findings (ACL) 2022 • Chia-Chien Hung, Anne Lauscher, Simone Ponzetto, Goran Glavaš

Recent work has shown that self-supervised dialog-specific pretraining on large conversational datasets yields substantial gains over traditional language modeling (LM) pretraining in downstream task-oriented dialog (TOD).

dialog state tracking • Language Modelling • +2

Walking a Tightrope -- Evaluating Large Language Models in High-Risk Domains

no code implementations • 25 Nov 2023 • Chia-Chien Hung, Wiem Ben Rim, Lindsay Frost, Lars Bruckner, Carolin Lawrence

High-risk domains pose unique challenges that require language models to provide accurate and safe responses.

Question Answering

Linking Surface Facts to Large-Scale Knowledge Graphs

1 code implementation • 23 Oct 2023 • Gorjan Radevski, Kiril Gashteovski, Chia-Chien Hung, Carolin Lawrence, Goran Glavaš

Open Information Extraction (OIE) methods extract facts from natural language text in the form of ("subject"; "relation"; "object") triples.

Knowledge Graphs • Open Information Extraction
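To make the triple format in the entry above concrete, here is a minimal, hypothetical sketch: a surface fact as a ("subject"; "relation"; "object") triple, plus a naive exact-match linker that maps its subject and object slots to knowledge-graph entity IDs. The entity names, IDs, and the link_slot helper are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch (not the authors' code): an OIE surface fact as a
# ("subject"; "relation"; "object") triple, linked to a knowledge graph
# via a naive exact-match lookup. Entity names and IDs are made up.
from typing import NamedTuple, Optional

class Triple(NamedTuple):
    subject: str
    relation: str
    obj: str

# Hypothetical mini knowledge graph: surface form -> canonical entity ID.
KG_ENTITIES = {
    "barack obama": "Q76",
    "united states": "Q30",
}

def link_slot(surface_form: str) -> Optional[str]:
    """Return a KG entity ID for a surface form, or None if it cannot be linked."""
    return KG_ENTITIES.get(surface_form.lower().strip())

fact = Triple("Barack Obama", "was president of", "United States")
linked = (link_slot(fact.subject), fact.relation, link_slot(fact.obj))
print(linked)  # ('Q76', 'was president of', 'Q30')
```

In practice, linking would rely on learned matching rather than exact string lookup; the sketch only shows the shape of the input facts and the linking output.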

TADA: Efficient Task-Agnostic Domain Adaptation for Transformers

1 code implementation • 22 May 2023 • Chia-Chien Hung, Lukas Lange, Jannik Strötgen

Our broad evaluation on 4 downstream tasks spanning 14 domains, across single- and multi-domain setups and high- and low-resource scenarios, reveals that TADA is an effective and efficient alternative to full domain-adaptive pre-training and adapters for domain adaptation, without introducing additional parameters or complex training steps.

Domain Adaptation

Can Demographic Factors Improve Text Classification? Revisiting Demographic Adaptation in the Age of Transformers

1 code implementation • 13 Oct 2022 • Chia-Chien Hung, Anne Lauscher, Dirk Hovy, Simone Paolo Ponzetto, Goran Glavaš

Previous work showed that incorporating demographic factors can consistently improve performance for various NLP tasks with traditional NLP models.

Language Modelling • Multi-Task Learning • +2

On the Limitations of Sociodemographic Adaptation with Transformers

1 code implementation • 1 Aug 2022 • Chia-Chien Hung, Anne Lauscher, Dirk Hovy, Simone Paolo Ponzetto, Goran Glavaš

We adapt the language representations for the sociodemographic dimensions of gender and age, using continuous language modeling and dynamic multi-task learning for adaptation, where we couple language modeling with the prediction of a sociodemographic class.

Language Modelling • Multi-Task Learning
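As a rough illustration of the coupling described in the entry above, the sketch below combines a masked-LM loss with a sociodemographic classification loss in one training objective. The class JointAdaptationHeads, its argument names, and the fixed mixing coefficient alpha are assumptions for illustration only; the paper's dynamic multi-task weighting is not reproduced here.

```python
# Minimal sketch (assumptions, not the paper's implementation): coupling a
# masked-LM head with a sociodemographic classifier on top of representations
# from any pretrained transformer encoder.
import torch.nn as nn
import torch.nn.functional as F

class JointAdaptationHeads(nn.Module):
    """Joint masked-LM + sociodemographic classification heads (hypothetical names)."""

    def __init__(self, hidden_size: int, vocab_size: int, num_classes: int = 2):
        super().__init__()
        self.mlm_head = nn.Linear(hidden_size, vocab_size)
        self.cls_head = nn.Linear(hidden_size, num_classes)  # e.g. gender or age group

    def forward(self, hidden_states, mlm_labels, class_labels, alpha: float = 0.5):
        # hidden_states: (batch, seq_len, hidden) contextual token representations
        token_logits = self.mlm_head(hidden_states)
        class_logits = self.cls_head(hidden_states[:, 0])  # pooled first token
        mlm_loss = F.cross_entropy(
            token_logits.view(-1, token_logits.size(-1)),
            mlm_labels.view(-1),
            ignore_index=-100,  # ignore unmasked positions
        )
        cls_loss = F.cross_entropy(class_logits, class_labels)
        # Dynamic task weighting is abstracted into a single mixing coefficient here.
        return alpha * mlm_loss + (1.0 - alpha) * cls_loss
```

The key idea being illustrated is simply that the adaptation objective sums a language-modeling term and a sociodemographic prediction term over the same encoder outputs.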

Multi2WOZ: A Robust Multilingual Dataset and Conversational Pretraining for Task-Oriented Dialog

1 code implementation • NAACL 2022 • Chia-Chien Hung, Anne Lauscher, Ivan Vulić, Simone Paolo Ponzetto, Goran Glavaš

We then introduce a new framework for multilingual conversational specialization of pretrained language models (PrLMs) that aims to facilitate cross-lingual transfer for arbitrary downstream TOD tasks.

Cross-Lingual Transfer • dialog state tracking • +1

DS-TOD: Efficient Domain Specialization for Task-Oriented Dialog

1 code implementation • 15 Oct 2021 • Chia-Chien Hung, Anne Lauscher, Simone Paolo Ponzetto, Goran Glavaš

Recent work has shown that self-supervised dialog-specific pretraining on large conversational datasets yields substantial gains over traditional language modeling (LM) pretraining in downstream task-oriented dialog (TOD).

dialog state tracking • Language Modelling • +2
