no code implementations • 15 Sep 2024 • Jeonglyul Oh, Sungzoon Cho
Recency bias in a sequential recommendation system refers to the excessive emphasis placed on recent items within a user session.
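To make the notion concrete, here is a toy sketch (not taken from the paper) showing how an exponentially decaying position weighting lets the most recent interactions dominate a session representation; the function name and decay value are purely illustrative:

```python
import numpy as np

# Toy illustration (not the paper's model): exponentially decaying position
# weights make the newest items in a session dominate the session representation.
def recency_weighted_session(item_embeddings: np.ndarray, decay: float = 0.5) -> np.ndarray:
    """item_embeddings: (session_length, dim), ordered oldest to newest."""
    n = item_embeddings.shape[0]
    weights = decay ** np.arange(n - 1, -1, -1)   # oldest item gets the smallest weight
    weights = weights / weights.sum()
    return (weights[:, None] * item_embeddings).sum(axis=0)

session = np.random.randn(5, 8)                   # 5 interactions, 8-dim item embeddings
print(recency_weighted_session(session).shape)    # (8,)
```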
1 code implementation • 9 Jul 2024 • Jinseok Kim, Jaewon Jung, Sangyeop Kim, Sohyung Park, Sungzoon Cho
This paper investigates the potential of sentence encoders to distinguish safe from unsafe prompts, and their ability to classify various unsafe prompts according to a safety taxonomy.
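A minimal sketch of the general recipe the snippet describes: embed prompts with an off-the-shelf sentence encoder and fit a lightweight classifier on top. The model name, example prompts, and labels below are placeholders, not the paper's actual setup:

```python
# Minimal sketch (not the paper's exact setup): embed prompts with an
# off-the-shelf sentence encoder, then train a simple classifier on top.
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression

encoder = SentenceTransformer("all-MiniLM-L6-v2")    # placeholder model choice

prompts = ["How do I bake bread?", "Explain photosynthesis.",
           "How can I pick a lock to break into a house?"]
labels = [0, 0, 1]                                    # 0 = safe, 1 = unsafe (toy labels)

X = encoder.encode(prompts)                           # (n_prompts, embedding_dim)
clf = LogisticRegression(max_iter=1000).fit(X, labels)

print(clf.predict(encoder.encode(["Tell me a joke."])))
```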
1 code implementation • 14 Feb 2024 • Yeongjae Cho, Taehee Kim, Heejun Shin, Sungzoon Cho, Dongmyung Shin
The model is developed in a step-by-step manner: it is first pretrained on natural images and texts, and then trained on longitudinal chest X-ray data.
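A hedged sketch of this kind of stage-wise recipe, assuming a torchvision backbone pretrained on natural images (ImageNet) as the stage-one starting point and a dummy batch standing in for the longitudinal chest X-ray data; the actual model and objective in the paper differ:

```python
# Hedged sketch of the two-stage recipe: start from weights pretrained on
# natural images, then continue training on domain data (here a placeholder
# batch standing in for longitudinal chest X-rays).
import torch
import torch.nn as nn
from torchvision import models

# Stage 1 result: a backbone pretrained on natural images (ImageNet here).
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = nn.Linear(backbone.fc.in_features, 2)    # placeholder task head

# Stage 2: continue training on the target domain.
optimizer = torch.optim.AdamW(backbone.parameters(), lr=1e-5)
criterion = nn.CrossEntropyLoss()

xray_batch = torch.randn(4, 3, 224, 224)               # dummy stand-in images
labels = torch.randint(0, 2, (4,))
optimizer.zero_grad()
loss = criterion(backbone(xray_batch), labels)
loss.backward()
optimizer.step()
```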
1 code implementation • NeurIPS 2023 • Junho Song, Keonwoo Kim, Jeonglyul Oh, Sungzoon Cho
It is designed to incorporate a novel memory module that can learn the degree to which each memory item should be updated in response to the input data.
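A minimal sketch, assuming a simple gated design, of how a memory module can learn how strongly to update each memory item given the input; this illustrates the idea rather than the paper's architecture:

```python
# Minimal sketch (not the paper's module): a memory bank whose items are
# updated by a learned, per-item gate conditioned on the incoming input.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedMemory(nn.Module):
    def __init__(self, num_items: int, dim: int):
        super().__init__()
        self.memory = nn.Parameter(torch.randn(num_items, dim))
        self.gate = nn.Linear(2 * dim, 1)    # scores how much each item should change

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (dim,) encoded input
        attn = F.softmax(self.memory @ x, dim=0)                  # relevance per item
        paired = torch.cat([self.memory, x.expand_as(self.memory)], dim=-1)
        update_amount = torch.sigmoid(self.gate(paired))          # learned update degree
        new_memory = (1 - update_amount) * self.memory + update_amount * x
        # read-out: attention-weighted combination of the updated items
        return (attn.unsqueeze(-1) * new_memory).sum(dim=0)

mem = GatedMemory(num_items=10, dim=16)
print(mem(torch.randn(16)).shape)    # torch.Size([16])
```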
no code implementations • 14 Jun 2021 • Sung Hwan Jeon, Sungzoon Cho
More precise named entity normalization in text mining will benefit downstream text analytics applications.
1 code implementation • 10 Jun 2021 • JiHye Park, Hye Jin Lee, Sungzoon Cho
Increasing attention has been drawn to the sentiment analysis of financial documents.
no code implementations • 1 Jan 2021 • Jinwon An, Misuk Kim, Sungzoon Cho, Junseong Bang
We propose a model for multi-domain dialogue state tracking that effectively models the relationship among domain-slot pairs using a pre-trained language encoder.
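A hedged sketch of the general idea: encode the dialogue with a pretrained language encoder and let domain-slot pair representations attend to one another so that related slots share information. The model name, slot list, and layer sizes are illustrative assumptions, not the paper's configuration:

```python
# Hedged sketch: a pretrained encoder summarizes the dialogue, and one query
# per domain-slot pair attends over the other pairs so related slots interact.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")   # placeholder encoder
encoder = AutoModel.from_pretrained("bert-base-uncased")

domain_slots = ["hotel-area", "hotel-price", "restaurant-area", "restaurant-food"]
slot_emb = nn.Embedding(len(domain_slots), encoder.config.hidden_size)
slot_attention = nn.MultiheadAttention(encoder.config.hidden_size, num_heads=4,
                                        batch_first=True)

dialogue = "I need a cheap hotel in the north and a thai restaurant nearby."
ctx = encoder(**tokenizer(dialogue, return_tensors="pt")).last_hidden_state[:, 0]  # [CLS]

# One query per domain-slot pair, conditioned on the dialogue context.
queries = slot_emb.weight.unsqueeze(0) + ctx.unsqueeze(1)        # (1, num_slots, hidden)
slot_states, _ = slot_attention(queries, queries, queries)       # slots attend to each other
print(slot_states.shape)    # torch.Size([1, 4, 768])
```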
no code implementations • 1 Nov 2018 • Jinwon An, Sungwon Lyu, Sungzoon Cho
This paper proposes an attention-augmented relational network, SARN (Sequential Attention Relational Network), that carries out relational reasoning by extracting reference objects and pairing objects efficiently.
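A minimal sketch of that idea (not the paper's architecture): use attention over the objects to form a reference representation, then run a relation function over (reference, object) pairs instead of all O(n^2) pairs:

```python
# Minimal sketch: attention picks a reference representation; the relation
# function g then scores each (object, reference) pair and aggregates them.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PairwiseAttentionRelation(nn.Module):
    def __init__(self, obj_dim: int, hidden: int = 64):
        super().__init__()
        self.scorer = nn.Linear(obj_dim, 1)                  # attention over objects
        self.g = nn.Sequential(nn.Linear(2 * obj_dim, hidden), nn.ReLU(),
                               nn.Linear(hidden, hidden))

    def forward(self, objects: torch.Tensor) -> torch.Tensor:
        # objects: (num_objects, obj_dim)
        attn = F.softmax(self.scorer(objects), dim=0)        # (num_objects, 1)
        reference = (attn * objects).sum(dim=0)              # attention-selected reference
        pairs = torch.cat([objects, reference.expand_as(objects)], dim=-1)
        return self.g(pairs).sum(dim=0)                      # aggregate pairwise relations

model = PairwiseAttentionRelation(obj_dim=32)
print(model(torch.randn(9, 32)).shape)    # torch.Size([64])
```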
no code implementations • 28 Apr 2018 • Minki Chung, Sungzoon Cho
To overcome the poor scalability of convolutional neural networks, the recurrent attention model (RAM) selectively chooses what and where to look in an image.
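An illustrative sketch of the "where to look" idea: crop a small glimpse patch around a chosen location instead of processing the full image at once. In RAM the location comes from a learned policy; here it is simply given:

```python
# Illustrative glimpse extraction: process only a small patch of the image,
# centered on a location that a learned policy would normally provide.
import torch

def extract_glimpse(image: torch.Tensor, center: tuple, size: int = 8) -> torch.Tensor:
    """image: (channels, height, width); center: (row, col) of the glimpse."""
    c, h, w = image.shape
    half = size // 2
    row = min(max(center[0], half), h - half)    # keep the patch inside the image
    col = min(max(center[1], half), w - half)
    return image[:, row - half: row + half, col - half: col + half]

img = torch.randn(1, 28, 28)                     # e.g. an MNIST-sized image
patch = extract_glimpse(img, center=(10, 14))
print(patch.shape)                               # torch.Size([1, 8, 8])
```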
no code implementations • ICLR 2018 • Jihyung Moon, Hyochang Yang, Sungzoon Cho
To solve text-based question-answering tasks that require relational reasoning, it is necessary to memorize a large amount of information and retrieve the question-relevant information from memory, as sketched below.
Ranked #4 on Question Answering on bAbi (Mean Error Rate metric)
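The kind of attention-based memory read referred to above can be sketched as follows; the key/value interface is a standard formulation used here as an assumption, not the paper's specific memory design:

```python
# Hedged sketch of memory-based retrieval: score each memory slot against the
# question embedding and read out a weighted sum (a standard key-value
# attention read, not the paper's memory architecture).
import torch
import torch.nn.functional as F

def memory_read(question: torch.Tensor, memory_keys: torch.Tensor,
                memory_values: torch.Tensor) -> torch.Tensor:
    """question: (dim,); memory_keys/values: (num_slots, dim)."""
    scores = memory_keys @ question           # relevance of each stored fact
    weights = F.softmax(scores, dim=0)        # soft selection over memory slots
    return weights @ memory_values            # question-relevant summary

q = torch.randn(32)
keys, values = torch.randn(100, 32), torch.randn(100, 32)
print(memory_read(q, keys, values).shape)     # torch.Size([32])
```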
no code implementations • 18 Dec 2017 • Dohyung Kim, Hyochang Yang, Minki Chung, Sungzoon Cho
In this paper, we propose Squeezed Convolutional Variational AutoEncoder (SCVAE) for anomaly detection in time series data for Edge Computing in Industrial Internet of Things (IIoT).
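A minimal sketch, with illustrative layer sizes rather than the SCVAE configuration, of a small convolutional VAE over fixed-length windows of multivariate time series, the kind of lightweight model suited to edge devices:

```python
# Minimal sketch (layer sizes are illustrative, not the SCVAE configuration):
# a small convolutional VAE over fixed-length windows of multivariate time series.
import torch
import torch.nn as nn

class ConvVAE(nn.Module):
    def __init__(self, channels: int = 4, window: int = 32, latent: int = 8):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv1d(channels, 16, 3, stride=2, padding=1), nn.ReLU(),
                                 nn.Flatten())
        enc_out = 16 * (window // 2)
        self.mu, self.logvar = nn.Linear(enc_out, latent), nn.Linear(enc_out, latent)
        self.dec = nn.Sequential(nn.Linear(latent, enc_out), nn.ReLU(),
                                 nn.Unflatten(1, (16, window // 2)),
                                 nn.ConvTranspose1d(16, channels, 4, stride=2, padding=1))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # reparameterization
        return self.dec(z), mu, logvar

vae = ConvVAE()
recon, mu, logvar = vae(torch.randn(2, 4, 32))   # 2 windows, 4 sensors, 32 time steps
print(recon.shape)                               # torch.Size([2, 4, 32])
```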
no code implementations • 6 Dec 2017 • Jinbae Im, Sungzoon Cho
Motivated by the Transformer, the Directional Self Attention Network (Shen et al., 2017), a fully attention-based sentence encoder, was proposed.
Ranked #54 on Natural Language Inference on SNLI
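For context, a minimal sketch of directional (forward-masked) self-attention, where each token attends only to itself and earlier tokens; the scaling and masking follow common practice rather than the exact formulation of either cited paper:

```python
# Illustrative directional self-attention: mask out the backward direction so
# each token attends only to itself and earlier tokens.
import torch
import torch.nn.functional as F

def directional_self_attention(x: torch.Tensor) -> torch.Tensor:
    """x: (seq_len, dim) token representations."""
    seq_len, dim = x.shape
    scores = (x @ x.T) / dim ** 0.5                        # pairwise attention scores
    mask = torch.triu(torch.ones(seq_len, seq_len), diagonal=1).bool()
    scores = scores.masked_fill(mask, float("-inf"))       # block the backward direction
    return F.softmax(scores, dim=-1) @ x

tokens = torch.randn(6, 16)
print(directional_self_attention(tokens).shape)            # torch.Size([6, 16])
```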
2 code implementations • Special Lecture on IE, SNU Data Mining Center 2015 • Jinwon An, Sungzoon Cho
We propose an anomaly detection method using the reconstruction probability from the variational autoencoder.
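A hedged sketch of the reconstruction-probability computation: sample the latent code several times from the encoder's approximate posterior, score the input under the decoder's output distribution, and treat inputs with low average likelihood as anomalies. The encoder/decoder interfaces and the toy usage below are assumptions for illustration:

```python
# Hedged sketch: Monte Carlo estimate of the reconstruction probability.
# Low values indicate inputs the VAE reconstructs poorly, i.e. likely anomalies.
import torch

def reconstruction_probability(x, encoder, decoder, n_samples: int = 10) -> torch.Tensor:
    """x: (batch, dim). encoder(x) -> (mu_z, logvar_z); decoder(z) -> (mu_x, logvar_x)."""
    mu_z, logvar_z = encoder(x)
    log_probs = []
    for _ in range(n_samples):
        z = mu_z + torch.randn_like(mu_z) * torch.exp(0.5 * logvar_z)   # posterior sample
        mu_x, logvar_x = decoder(z)
        dist = torch.distributions.Normal(mu_x, torch.exp(0.5 * logvar_x))
        log_probs.append(dist.log_prob(x).sum(dim=-1))                  # log p(x | z)
    return torch.stack(log_probs).mean(dim=0)        # Monte Carlo average per example

# Toy usage with stand-in encoder/decoder (identity-like mappings).
enc = lambda x: (x, torch.zeros_like(x))
dec = lambda z: (z, torch.zeros_like(z))
print(reconstruction_probability(torch.randn(5, 3), enc, dec).shape)    # torch.Size([5])
```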