Search Results for author: Hyuk-Jae Lee

Found 8 papers, 3 papers with code

HEAM : Hashed Embedding Acceleration using Processing-In-Memory

no code implementations · 6 Feb 2024 · Youngsuk Kim, Hyuk-Jae Lee, Chae Eun Rhee

This paper introduces HEAM, a heterogeneous memory architecture that integrates 3D-stacked DRAM with DIMM to accelerate recommendation systems that use compositional embedding, a technique aimed at reducing the size of embedding tables.

Recommendation Systems
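
As a rough illustration of the compositional embedding idea that HEAM accelerates, the sketch below builds a quotient-remainder embedding in PyTorch. The table sizes, bucket count, and element-wise combine operation are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of compositional ("quotient-remainder") embeddings: two small
# tables replace one large table of size num_categories x dim.
import torch
import torch.nn as nn

class QREmbedding(nn.Module):
    def __init__(self, num_categories: int, num_buckets: int, dim: int):
        super().__init__()
        self.quotient = nn.Embedding((num_categories + num_buckets - 1) // num_buckets, dim)
        self.remainder = nn.Embedding(num_buckets, dim)
        self.num_buckets = num_buckets

    def forward(self, ids: torch.Tensor) -> torch.Tensor:
        q = self.quotient(ids // self.num_buckets)
        r = self.remainder(ids % self.num_buckets)
        return q * r  # element-wise product combines the two partial embeddings

# Usage: 1M categories stored in roughly (1M / 2048 + 2048) rows instead of 1M rows.
emb = QREmbedding(num_categories=1_000_000, num_buckets=2048, dim=16)
vecs = emb(torch.tensor([3, 42, 999_999]))
print(vecs.shape)  # torch.Size([3, 16])
```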

EGformer: Equirectangular Geometry-biased Transformer for 360 Depth Estimation

no code implementations · ICCV 2023 · IlWi Yun, Chanyong Shin, Hyunku Lee, Hyuk-Jae Lee, Chae Eun Rhee

However, applying local attention successfully to equirectangular images (EIs) requires a strategy that addresses the distorted equirectangular geometry and the limited receptive field simultaneously.

Depth Estimation
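
The sketch below shows the generic pattern of adding a precomputed geometric bias to attention logits, which is one way a geometry-biased transformer can inject equirectangular priors. The bias construction is a placeholder assumption; the paper's actual EGformer attention is not reproduced here.

```python
# Generic additive-bias attention: logits = QK^T / sqrt(d) + geometric bias.
import torch
import torch.nn.functional as F

def biased_attention(q, k, v, geo_bias):
    # q, k, v: (batch, tokens, dim); geo_bias: (tokens, tokens) additive bias
    d = q.shape[-1]
    logits = q @ k.transpose(-2, -1) / d ** 0.5 + geo_bias
    return F.softmax(logits, dim=-1) @ v

B, N, D = 2, 64, 32
q = k = v = torch.randn(B, N, D)
bias = torch.zeros(N, N)  # placeholder: replace with a distortion-aware bias per token pair
out = biased_attention(q, k, v, bias)
print(out.shape)  # torch.Size([2, 64, 32])
```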

Improving 360 Monocular Depth Estimation via Non-local Dense Prediction Transformer and Joint Supervised and Self-supervised Learning

1 code implementation · 22 Sep 2021 · IlWi Yun, Hyuk-Jae Lee, Chae Eun Rhee

Due to difficulties in acquiring ground-truth depth for equirectangular (360) images, the quality and quantity of equirectangular depth data available today are insufficient to represent the variety of scenes in the world.

Monocular Depth Estimation · Self-Supervised Learning
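
A minimal sketch of a joint objective that mixes a supervised depth term with a self-supervised photometric term, in line with the title. The specific losses and the weighting factor are illustrative assumptions rather than the paper's exact formulation.

```python
# Joint supervised + self-supervised depth loss (illustrative assumption).
import torch
import torch.nn.functional as F

def joint_depth_loss(pred_depth, gt_depth, valid_mask, recon_img, target_img, lam=0.5):
    # Supervised term: L1 on pixels where ground-truth depth exists.
    sup = F.l1_loss(pred_depth[valid_mask], gt_depth[valid_mask])
    # Self-supervised term: photometric error between a view reconstructed
    # from the predicted depth and the target image.
    self_sup = F.l1_loss(recon_img, target_img)
    return sup + lam * self_sup

# Toy usage with random tensors standing in for network outputs and images.
pred, gt = torch.rand(1, 1, 64, 128), torch.rand(1, 1, 64, 128)
recon, target = torch.rand(1, 3, 64, 128), torch.rand(1, 3, 64, 128)
print(joint_depth_loss(pred, gt, gt > 0.1, recon, target))
```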

Active Learning for Deep Object Detection via Probabilistic Modeling

1 code implementation · ICCV 2021 · Jiwoong Choi, Ismail Elezi, Hyuk-Jae Lee, Clement Farabet, Jose M. Alvarez

Most of these methods are based on multiple models or are straightforward extensions of classification methods, and hence they estimate an image's informativeness using only the classification head.

Active Learning · Classification · +5

GradPIM: A Practical Processing-in-DRAM Architecture for Gradient Descent

no code implementations · 15 Feb 2021 · Heesu Kim, Hanmin Park, Taehyun Kim, Kwanheum Cho, Eojin Lee, Soojung Ryu, Hyuk-Jae Lee, Kiyoung Choi, Jinho Lee

In this paper, we present GradPIM, a processing-in-memory architecture that accelerates the parameter updates of deep neural network training.
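
For context, the sketch below shows the kind of memory-bound optimizer update that a processing-in-DRAM design targets: a plain SGD-with-momentum step in NumPy. The optimizer choice and hyperparameters are illustrative assumptions, not GradPIM's hardware datapath or its supported operations.

```python
# Memory-bound parameter update: every element is read, updated, and written back,
# which makes the step a natural candidate for near-memory execution.
import numpy as np

def sgd_momentum_update(param, grad, velocity, lr=0.01, momentum=0.9):
    velocity[:] = momentum * velocity - lr * grad  # in-place velocity update
    param[:] += velocity                           # in-place parameter update
    return param, velocity

param = np.random.randn(1024).astype(np.float32)
grad = np.random.randn(1024).astype(np.float32)
vel = np.zeros_like(param)
param, vel = sgd_momentum_update(param, grad, vel)
```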

Deep Active Learning for Object Detection with Mixture Density Networks

no code implementations · 1 Jan 2021 · Jiwoong Choi, Ismail Elezi, Hyuk-Jae Lee, Clement Farabet, Jose M. Alvarez

For active learning, we propose a scoring function that aggregates uncertainties from both the classification and the localization outputs of the network.

Active Learning · Informativeness · +3
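
A minimal sketch of an image-level acquisition score that aggregates uncertainty from both the classification and the localization outputs, as the abstract describes. The entropy and standard-deviation measures and the max-over-boxes aggregation are illustrative assumptions, not the paper's exact scoring function.

```python
# Aggregate per-box classification entropy and localization spread into one image score.
import torch

def image_score(cls_probs, loc_sigma):
    # cls_probs: (num_boxes, num_classes) softmax outputs
    # loc_sigma: (num_boxes, 4) predicted std-dev of box coordinates (e.g. from an MDN head)
    cls_entropy = -(cls_probs * cls_probs.clamp_min(1e-12).log()).sum(dim=-1)
    loc_uncertainty = loc_sigma.mean(dim=-1)
    per_box = cls_entropy + loc_uncertainty
    return per_box.max()  # the most uncertain detection represents the image

# Images with the highest scores would be selected for labeling.
probs = torch.softmax(torch.randn(5, 20), dim=-1)
sigma = torch.rand(5, 4)
print(image_score(probs, sigma))
```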
