no code implementations • EMNLP (ACL) 2021 • San-Hee Park, Kang-Min Kim, Seonhee Cho, Jun-Hyung Park, Hyuntae Park, Hyuna Kim, Seongwon Chung, SangKeun Lee
Warning: This manuscript contains offensive expressions.
1 code implementation • Findings (ACL) 2022 • Yong-Ho Jung, Jun-Hyung Park, Joon-Young Choi, Mingyu Lee, Junho Kim, Kang-Min Kim, SangKeun Lee
Commonsense inference poses a unique challenge: reasoning about and generating the physical, social, and causal conditions of a given event.
1 code implementation • 17 Mar 2023 • Jun-Hyung Park, Yeachan Kim, Junho Kim, Joon-Young Choi, SangKeun Lee
In this work, we introduce a novel structure pruning method, termed dynamic structure pruning, that identifies optimal pruning granularities for intra-channel pruning.
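For context on what "intra-channel pruning" means here, below is a minimal PyTorch sketch of a fixed-granularity baseline: each convolutional filter's weights are split into contiguous groups, and the lowest-norm groups inside each filter are zeroed. The `group_size` and `ratio` values are illustrative assumptions; the paper's method learns the granularity dynamically, which this sketch does not attempt.

```python
import torch
import torch.nn as nn

def intra_channel_prune(conv: nn.Conv2d, group_size: int = 4, ratio: float = 0.5) -> None:
    """Zero the lowest-norm weight groups inside each output filter.

    Fixed-granularity baseline: each filter is split into contiguous
    groups of `group_size` weights, and the `ratio` fraction of groups
    with the smallest L2 norm is set to zero in place.
    """
    w = conv.weight.data                       # (out_ch, in_ch, kH, kW)
    out_ch = w.size(0)
    assert w[0].numel() % group_size == 0, "group_size must divide the filter size"
    groups = w.view(out_ch, -1, group_size)    # view: edits write back to w
    norms = groups.norm(dim=2)                 # (out_ch, n_groups)
    k = int(norms.size(1) * ratio)
    idx = norms.topk(k, dim=1, largest=False).indices
    mask = torch.ones_like(norms)
    mask.scatter_(1, idx, 0.0)                 # 0 marks pruned groups
    groups.mul_(mask.unsqueeze(-1))            # zero pruned groups in place

# Example: prune half of the 4-weight groups in each 3x3 filter.
conv = nn.Conv2d(16, 32, kernel_size=3)       # filter size 16*3*3 = 144
intra_channel_prune(conv, group_size=4, ratio=0.5)
```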
1 code implementation • 15 Dec 2022 • Mingyu Lee, Jun-Hyung Park, Junho Kim, Kang-Min Kim, SangKeun Lee
Masked language modeling (MLM) has been widely used for pre-training effective bidirectional representations, but incurs substantial training costs.
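For reference, the MLM objective mentioned here corrupts a fraction of input tokens and trains the model to reconstruct them. Below is a minimal sketch of the standard BERT-style masking recipe (15% of positions selected; of those, 80% replaced with the mask token, 10% with a random token, 10% left unchanged), assuming PyTorch and omitting special-token handling. It illustrates the objective the entry refers to, not the paper's cost-reduction technique.

```python
import torch

def mask_tokens(input_ids, mask_token_id, vocab_size, mlm_prob=0.15):
    """BERT-style MLM corruption: returns (corrupted inputs, labels).

    Non-selected positions get label -100 so PyTorch cross-entropy
    ignores them; selected positions follow the 80/10/10 rule.
    """
    input_ids = input_ids.clone()
    labels = input_ids.clone()
    # Select ~15% of positions as prediction targets.
    selected = torch.bernoulli(torch.full(labels.shape, mlm_prob)).bool()
    labels[~selected] = -100

    # 80% of selected positions -> mask token.
    masked = torch.bernoulli(torch.full(labels.shape, 0.8)).bool() & selected
    input_ids[masked] = mask_token_id

    # 10% -> random token (half of the remaining 20%).
    randomized = torch.bernoulli(torch.full(labels.shape, 0.5)).bool() & selected & ~masked
    input_ids[randomized] = torch.randint(vocab_size, labels.shape)[randomized]

    # The final 10% keep their original token.
    return input_ids, labels
```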
no code implementations • Findings (EMNLP) 2020 • Kang-Min Kim, Bumsu Hyeon, Yeachan Kim, Jun-Hyung Park, SangKeun Lee
In addition, we propose a weakly supervised pretraining scheme in which labels for text classification are obtained automatically from an existing approach.
no code implementations • NeurIPS 2020 • Jun-Hyung Park, Krikamol Muandet
We present an operator-free, measure-theoretic approach to the conditional mean embedding (CME) as a random variable taking values in a reproducing kernel Hilbert space.
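For context, the contrast drawn here can be sketched in standard CME notation (kernels k on the input space and ℓ on the output space); this is the textbook formulation, not a reproduction of the paper's construction:

```latex
% Classical operator-based CME (requires inverting the covariance
% operator C_XX, an assumption the operator-free view avoids):
\[
  \mu_{Y \mid X = x} \;=\; \mathcal{C}_{YX}\,\mathcal{C}_{XX}^{-1}\, k(x, \cdot)
\]
% Measure-theoretic view: the CME as an RKHS-valued random variable,
% the conditional expectation of the feature map of Y given X:
\[
  \mu_{Y \mid X} \;=\; \mathbb{E}\bigl[\,\ell(Y, \cdot) \mid X\,\bigr]
  \;\in\; \mathcal{H}_{\ell}
\]
```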
no code implementations • NAACL 2019 • Byung-Ju Choi, Jun-Hyung Park, SangKeun Lee
We demonstrate the efficacy of our approach on existing CNNs through performance evaluation.