no code implementations • 20 Feb 2024 • Dongjin Kang, Sunghwan Kim, Taeyoon Kwon, Seungjun Moon, Hyunsouk Cho, Youngjae Yu, Dongha Lee, Jinyoung Yeo
Motivated by these observations, we explore the impact of the inherent preference in LLMs on providing emotional support, and consequently, we observe that a high preference for specific strategies hinders effective emotional support, degrading the robustness of appropriate strategy prediction.
1 code implementation • 24 Sep 2023 • Sekeun Kim, Kyungsang Kim, Jiang Hu, Cheng Chen, Zhiliang Lyu, Ren Hui, Sunghwan Kim, Zhengliang Liu, Aoxiao Zhong, Xiang Li, Tianming Liu, Quanzheng Li
The Segment Anything Model (SAM) has gained significant attention for its robust generalization capabilities across diverse downstream tasks.
no code implementations • 16 Aug 2023 • Seong-Joon Park, Hee-Youl Kwak, Sang-Hyo Kim, Sunghwan Kim, Yongjune Kim, Jong-Seon No
In communication and storage systems, error correction codes (ECCs) are pivotal in ensuring data reliability.
1 code implementation • ICCV 2023 • Sunghwan Kim, Dae-hwan Kim, Hoseong Kim
TLDR includes two novel losses to effectively enhance texture learning in DGSS: (1) a texture regularization loss to prevent overfitting to source domain textures by using texture features from an ImageNet pre-trained model and (2) a texture generalization loss that utilizes random style images to learn diverse texture representations in a self-supervised manner.
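The two losses above both operate on texture statistics of feature maps. A minimal NumPy sketch of how such Gram-matrix-based texture terms are commonly formed is below; the function names, the frozen ImageNet-extractor interface, and the exact loss formulation are assumptions for illustration, not the paper's code.

```python
import numpy as np

def gram_matrix(feat):
    # feat: (C, H, W) feature map; the Gram matrix of channel activations
    # is a standard summary of texture statistics.
    c, h, w = feat.shape
    f = feat.reshape(c, h * w)
    return f @ f.T / (c * h * w)

def texture_regularization_loss(model_feat, pretrained_feat):
    # Hypothetical regularizer: keep the model's texture statistics close
    # to those of a frozen ImageNet pre-trained extractor, discouraging
    # overfitting to source-domain textures.
    diff = gram_matrix(model_feat) - gram_matrix(pretrained_feat)
    return float(np.mean(diff ** 2))

def texture_generalization_loss(stylized_feat, style_feat):
    # Hypothetical self-supervised term: pull texture statistics of a
    # randomly stylized input toward those of the random style image,
    # encouraging diverse texture representations.
    diff = gram_matrix(stylized_feat) - gram_matrix(style_feat)
    return float(np.mean(diff ** 2))
```

Both terms are zero when the two feature maps share identical texture statistics, which is the sense in which they act as matching losses rather than pixel losses.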
no code implementations • IEEE Access 2020 • Zaiwar Ali, Sadia Khaf, Ziaul Haq Abbas, Ghulam Abbas, Fazal Muhammad, Sunghwan Kim
We compare our proposed algorithms with other resource allocation approaches and show that our approach handles dynamic load conditions better.
no code implementations • 17 Jan 2019 • Jay H. Park, Sunghwan Kim, Jinwon Lee, Myeongjae Jeon, Sam H. Noh
Through an analysis of CNN characteristics, we find that layer placement can be performed effectively.
Distributed, Parallel, and Cluster Computing