Search Results for author: Minjin Choi

Found 8 papers, 7 papers with code

GLEN: Generative Retrieval via Lexical Index Learning

1 code implementation • 6 Nov 2023 • Sunkyung Lee, Minjin Choi, Jongwuk Lee

For training, GLEN effectively exploits a dynamic lexical identifier using a two-phase index learning strategy, enabling it to learn meaningful lexical identifiers and relevance signals between queries and documents.

Learning-To-Rank • Retrieval +1

ConQueR: Contextualized Query Reduction using Search Logs

1 code implementation • 22 May 2023 • Hye-Young Kim, Minjin Choi, Sunkyung Lee, Eunseong Choi, Young-In Song, Jongwuk Lee

One extracts core terms from an original query at the term level, and the other determines whether a sub-query is a suitable reduction for the original query at the sequence level.

Language Modelling • Retrieval +1

SpaDE: Improving Sparse Representations using a Dual Document Encoder for First-stage Retrieval

2 code implementations • 13 Sep 2022 • Eunseong Choi, Sunkyung Lee, Minjin Choi, Hyeseon Ko, Young-In Song, Jongwuk Lee

Sparse document representations have been widely used to retrieve relevant documents via exact lexical matching.

Retrieval
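As background for the snippet above, exact lexical matching scores a document by the overlap of weighted terms with the query. A minimal sketch (raw term counts stand in for learned weights; this is a generic illustration, not SpaDE's actual dual encoder):

```python
from collections import Counter

def sparse_score(query, doc):
    """Score a document by the summed weight of terms it shares with the query.

    Both inputs are whitespace-tokenized strings; raw counts serve as term
    weights here (real systems use learned or TF-IDF weights).
    """
    q, d = Counter(query.lower().split()), Counter(doc.lower().split())
    return sum(q[t] * d[t] for t in q.keys() & d.keys())

docs = ["sparse document representations for retrieval", "dense neural encoders"]
scores = [sparse_score("sparse retrieval", doc) for doc in docs]
# rank documents by lexical overlap with the query
ranking = sorted(range(len(docs)), key=lambda i: -scores[i])
```

Because scores depend only on shared terms, such representations pair naturally with an inverted index for first-stage retrieval.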

S-Walk: Accurate and Scalable Session-based Recommendation with Random Walks

1 code implementation • 4 Jan 2022 • Minjin Choi, Jinhong Kim, Joonseok Lee, Hyunjung Shim, Jongwuk Lee

Session-based recommendation (SR) predicts the next items from a sequence of previous items consumed by an anonymous user.

Computational Efficiency • Session-Based Recommendations
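To illustrate the task setup (a toy first-order transition counter, not S-Walk's random-walk model), next-item prediction from anonymous session logs can be sketched as:

```python
from collections import defaultdict

def next_item_counts(sessions):
    """Count item-to-next-item transitions across all sessions (first order)."""
    counts = defaultdict(lambda: defaultdict(int))
    for session in sessions:
        for prev, nxt in zip(session, session[1:]):
            counts[prev][nxt] += 1
    return counts

def predict_next(counts, last_item, k=1):
    """Return the k most frequent successors of the session's last item."""
    ranked = sorted(counts[last_item].items(), key=lambda kv: -kv[1])
    return [item for item, _ in ranked[:k]]

sessions = [["a", "b", "c"], ["a", "b", "d"], ["b", "c", "d"]]
counts = next_item_counts(sessions)
prediction = predict_next(counts, "b")  # "c" follows "b" more often than "d"
```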

Session-aware Linear Item-Item Models for Session-based Recommendation

3 code implementations • 30 Mar 2021 • Minjin Choi, Jinhong Kim, Joonseok Lee, Hyunjung Shim, Jongwuk Lee

Session-based recommendation aims at predicting the next item given a sequence of previous items consumed in the session, e.g., on e-commerce or multimedia streaming services.

Session-Based Recommendations
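A linear item-item model scores each candidate as a weighted sum over the items already in the session. A minimal sketch with a hand-fixed weight matrix (real models, including this paper's, learn B from data):

```python
def score_items(session_items, B, n_items):
    """Score each candidate item as the sum of weights from consumed items."""
    scores = [0.0] * n_items
    for i in session_items:
        for j in range(n_items):
            scores[j] += B[i][j]
    for i in session_items:
        scores[i] = float("-inf")  # mask items already consumed in the session
    return scores

# hypothetical 3-item weight matrix; entry B[i][j] is item i's vote for item j
B = [[0.0, 0.8, 0.1],
     [0.2, 0.0, 0.9],
     [0.3, 0.4, 0.0]]
best = max(range(3), key=lambda j: score_items([0], B, 3)[j])
```

Inference is a single matrix-vector product, which is what makes linear item-item models attractive for low-latency serving.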

Local Collaborative Autoencoders

2 code implementations • 30 Mar 2021 • Minjin Choi, Yoonki Jeong, Joonseok Lee, Jongwuk Lee

Top-N recommendation is a challenging problem because complex and sparse user-item interactions should be adequately addressed to achieve high-quality recommendation results.

Collaborative Distillation for Top-N Recommendation

no code implementations • 13 Nov 2019 • Jae-woong Lee, Minjin Choi, Jongwuk Lee, Hyunjung Shim

Knowledge distillation (KD) is a well-known method to reduce inference latency by compressing a cumbersome teacher model to a small student model.

Collaborative Filtering • Knowledge Distillation
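As a reference point for the abstract above, the standard distillation objective matches the student's temperature-softened output distribution to the teacher's. A minimal sketch of the softened KL term (hypothetical logits; a generic KD loss, not this paper's ranking-specific formulation):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher temperature gives softer targets."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distill_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) on softened distributions, scaled by T^2
    as in standard knowledge-distillation formulations."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

loss = distill_loss([2.0, 0.5, -1.0], [1.0, 1.0, 0.0])  # positive when outputs differ
```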
