Search Results for author: Seijoon Kim

Found 8 papers, 0 papers with code

Gradient-based Bit Encoding Optimization for Noise-Robust Binary Memristive Crossbar

no code implementations • 5 Jan 2022 • Youngeun Kim, Hyunsoo Kim, Seijoon Kim, Sang Joon Kim, Priyadarshini Panda

In addition, we propose Gradient-based Bit Encoding Optimization (GBO), which optimizes a different number of pulses at each layer, based on our in-depth analysis showing that each layer has a different level of noise sensitivity.
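
The sketch below illustrates the layer-wise pulse-count idea in general terms, assuming the per-layer count is relaxed to a continuous learnable parameter and that crossbar read noise shrinks as more pulses are used; the class and parameter names (NoisyBinaryLinear, pulse_logit, base_noise) are illustrative, not taken from the paper.

import torch
import torch.nn as nn

class NoisyBinaryLinear(nn.Module):
    def __init__(self, in_features, out_features, base_noise=0.1, max_pulses=8):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.01)
        # Continuous relaxation of this layer's pulse count (hypothetical name).
        self.pulse_logit = nn.Parameter(torch.zeros(1))
        self.base_noise = base_noise
        self.max_pulses = max_pulses

    def forward(self, x):
        # More pulses -> smaller effective read noise on the binary crossbar.
        pulses = 1.0 + torch.sigmoid(self.pulse_logit) * (self.max_pulses - 1)
        w_bin = torch.sign(self.weight)  # binary weights (training would normally use a straight-through estimator)
        noise = torch.randn_like(w_bin) * self.base_noise / pulses.sqrt()
        return x @ (w_bin + noise).t()

def pulse_energy_penalty(model, lam=1e-3):
    # Regularizer trading accuracy for fewer pulses; gradients then decide
    # which layers are noise-sensitive enough to keep a higher pulse count.
    return lam * sum(torch.sigmoid(m.pulse_logit).sum()
                     for m in model.modules()
                     if isinstance(m, NoisyBinaryLinear))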

T2FSNN: Deep Spiking Neural Networks with Time-to-first-spike Coding

no code implementations • 26 Mar 2020 • Seongsik Park, Seijoon Kim, Byunggook Na, Sungroh Yoon

Spiking neural networks (SNNs) have gained considerable interest due to their energy-efficient characteristics, yet the lack of a scalable training algorithm has restricted their applicability to practical machine learning problems.
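
For context on the coding scheme named in the title, here is a minimal sketch of time-to-first-spike encoding under the common convention that larger activations fire earlier; the function names and the t_max parameter are illustrative and not taken from the paper.

import numpy as np

def ttfs_encode(activations, t_max=100):
    """Map activations in [0, 1] to first-spike times in [0, t_max]."""
    a = np.clip(activations, 0.0, 1.0)
    return np.round((1.0 - a) * t_max).astype(int)   # larger value -> earlier spike

def ttfs_decode(spike_times, t_max=100):
    """Invert the encoding back to approximate activation values."""
    return 1.0 - spike_times / t_max

print(ttfs_encode(np.array([0.9, 0.2, 0.5])))   # e.g. [10 80 50]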

Spiking-YOLO: Spiking Neural Network for Energy-Efficient Object Detection

no code implementations • 12 Mar 2019 • Seijoon Kim, Seongsik Park, Byunggook Na, Sungroh Yoon

Over the past decade, deep neural networks (DNNs) have demonstrated remarkable performance in a variety of applications.

Image Classification • Object Detection +1

Fast and Efficient Information Transmission with Burst Spikes in Deep Spiking Neural Networks

no code implementations • 10 Sep 2018 • Seongsik Park, Seijoon Kim, Hyeokjun Choe, Sungroh Yoon

Spiking neural networks (SNNs) are considered one of the most promising types of artificial neural networks due to their energy-efficient computing capability.

Image Classification

Quantized Memory-Augmented Neural Networks

no code implementations • 10 Nov 2017 • Seongsik Park, Seijoon Kim, Seil Lee, Ho Bae, Sungroh Yoon

In this paper, we identify memory addressing (specifically, content-based addressing) as the main reason for the performance degradation and propose a robust quantization method for MANNs to address the challenge.
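
As a rough illustration of why content-based addressing is the fragile step, the sketch below pairs cosine-similarity addressing (as used in typical memory-augmented networks) with a naive uniform quantizer; the paper's actual robust quantization method is not reproduced here, and all names are illustrative.

import numpy as np

def content_address(memory, key, beta=5.0):
    """Cosine-similarity addressing followed by a sharpened softmax."""
    sim = memory @ key / (np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8)
    w = np.exp(beta * sim)
    return w / w.sum()

def quantize(x, bits=4):
    """Uniform fixed-point quantization (hypothetical baseline, not the paper's method)."""
    scale = 2 ** (bits - 1) - 1
    return np.round(np.clip(x, -1, 1) * scale) / scale

memory = np.random.randn(8, 16)
key = memory[3] + 0.05 * np.random.randn(16)
print(content_address(memory, key).argmax())                       # full precision: row 3
print(content_address(quantize(memory), quantize(key)).argmax())   # may drift at very low bit widths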

Quantization

Near-Data Processing for Differentiable Machine Learning Models

no code implementations • 6 Oct 2016 • Hyeokjun Choe, Seil Lee, Hyunha Nam, Seongsik Park, Seijoon Kim, Eui-Young Chung, Sungroh Yoon

The second is the popularity of NAND flash-based solid-state drives (SSDs) containing multicore processors that can accommodate extra computation for data processing.
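
The sketch below illustrates the general near-data-processing pattern this snippet alludes to: partial gradients are computed where each data shard resides (plain functions stand in for the SSD-side processors) and the host only aggregates; the names and the logistic-regression example are illustrative, not the paper's system.

import numpy as np

def shard_gradient(X, y, w):
    """Logistic-regression gradient computed 'near' one data shard."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (p - y) / len(y)

rng = np.random.default_rng(0)
w = np.zeros(5)
shards = [(rng.normal(size=(64, 5)), rng.integers(0, 2, 64)) for _ in range(4)]

for _ in range(100):
    grads = [shard_gradient(X, y, w) for X, y in shards]   # done in-storage in the NDP setting
    w -= 0.1 * np.mean(grads, axis=0)                      # host only aggregates and updates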

BIG-bench Machine Learning
