Search Results for author: Junghyun Min

Found 4 papers, 2 papers with code

Structured Language Generation Model for Robust Structure Prediction

no code implementations · 14 Feb 2024 · Minho Lee, Junghyun Min, Woochul Lee, Yeonsoo Lee

Previous work in structured prediction (e.g. NER, information extraction) using a single model makes use of explicit dataset information, which helps boost in-distribution performance but is orthogonal to robust generalization in real-world situations.

NER Structured Prediction +1

Punctuation Restoration Improves Structure Understanding without Supervision

no code implementations · 13 Feb 2024 · Junghyun Min, Minho Lee, Woochul Lee, Yeonsoo Lee

Unsupervised learning objectives such as language modeling and de-noising play a significant role in producing pre-trained models that perform well on downstream applications ranging from natural language understanding to conversational tasks.

Chunking Language Modelling +7
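The paper's title frames punctuation restoration as an unsupervised objective: training pairs can be built from raw text alone by stripping punctuation and asking the model to recover it. A minimal sketch of such pair construction (the function name and punctuation set are illustrative assumptions, not the authors' pipeline):

```python
import re

def make_punct_restoration_pair(text):
    """Build an unsupervised (input, target) pair from raw text.

    Input: the text with sentence-internal punctuation stripped.
    Target: the original text. No labels are needed, which is what
    makes the objective unsupervised. (Illustrative sketch only.)
    """
    stripped = re.sub(r"[.,;:!?]", "", text)       # drop punctuation marks
    stripped = re.sub(r"\s+", " ", stripped).strip()  # normalize spacing
    return stripped, text

src, tgt = make_punct_restoration_pair("Hello, world! How are you?")
# src == "Hello world How are you", tgt keeps the punctuation
```

A sequence-to-sequence model trained on such pairs must learn clause and sentence boundaries to restore the punctuation, which is the structural signal the paper argues transfers to downstream tasks.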

Syntactic Data Augmentation Increases Robustness to Inference Heuristics

1 code implementation · ACL 2020 · Junghyun Min, R. Thomas McCoy, Dipanjan Das, Emily Pitler, Tal Linzen

Pretrained neural models such as BERT, when fine-tuned to perform natural language inference (NLI), often show high accuracy on standard datasets, but display a surprising lack of sensitivity to word order on controlled challenge sets.

Data Augmentation Natural Language Inference
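The word-order insensitivity described in the abstract can be targeted by syntactically transformed training examples, e.g. swapping subject and object to produce a hypothesis that is no longer entailed. A minimal sketch of one such transformation (the function and labels are illustrative assumptions, not the paper's exact augmentation code):

```python
def subject_object_inversion(subject, verb, obj):
    """Generate an NLI augmentation example by inverting subject and object.

    The premise uses the original order; the hypothesis swaps the two
    noun phrases, so a word-order-sensitive model should predict
    non-entailment. (Illustrative sketch only.)
    """
    premise = f"The {subject} {verb} the {obj}."
    hypothesis = f"The {obj} {verb} the {subject}."
    return {"premise": premise, "hypothesis": hypothesis,
            "label": "non-entailment"}

ex = subject_object_inversion("lawyer", "saw", "doctor")
# premise: "The lawyer saw the doctor."  hypothesis: "The doctor saw the lawyer."
```

Adding a small number of such examples to the fine-tuning set pushes the model to attend to word order rather than lexical overlap.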
