Search Results for author: Hyeji Kim

Found 23 papers, 13 papers with code

LIGHTCODE: Light Analytical and Neural Codes for Channels with Feedback

no code implementations • 16 Mar 2024 • Sravan Kumar Ankireddy, Krishna Narayanan, Hyeji Kim

First, we demonstrate that POWERBLAST, an analytical coding scheme inspired by the Schalkwijk-Kailath (SK) and Gallager-Nakiboglu (GN) schemes, achieves notable reliability improvements over both SK and GN, outperforming neural codes in the high signal-to-noise ratio (SNR) region.
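
For context on the SK baseline, here is a minimal NumPy sketch of a Schalkwijk-Kailath-style iteration over the AWGN channel with noiseless feedback: the transmitter repeatedly sends the receiver's current estimation error, scaled to the power budget, so the error variance shrinks geometrically per channel use. All parameter values are illustrative, and POWERBLAST itself adds refinements not shown here.

```python
import numpy as np

rng = np.random.default_rng(0)
P, sigma2, n_rounds = 1.0, 0.1, 10     # power, noise variance, channel uses (assumed)
theta = rng.uniform(-0.5, 0.5)         # message point

# round 1: send the message point itself, scaled to the power budget
y = np.sqrt(P) * theta + rng.normal(0.0, np.sqrt(sigma2))
est, var = y / np.sqrt(P), sigma2 / P  # receiver estimate and its error variance

for _ in range(n_rounds - 1):
    err = est - theta                  # known to the Tx via noiseless feedback
    x = np.sqrt(P / var) * err         # send the scaled estimation error
    y = x + rng.normal(0.0, np.sqrt(sigma2))
    est -= np.sqrt(P * var) / (P + sigma2) * y   # MMSE correction of the estimate
    var *= sigma2 / (P + sigma2)       # error variance shrinks geometrically

print(f"final error {est - theta:+.2e}, predicted std {np.sqrt(var):.2e}")
```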

DeepPolar: Inventing Nonlinear Large-Kernel Polar Codes via Deep Learning

no code implementations • 14 Feb 2024 • S Ashwin Hebbar, Sravan Kumar Ankireddy, Hyeji Kim, Sewoong Oh, Pramod Viswanath

Polar codes, developed on the foundation of Arikan's polarization kernel, represent a breakthrough in coding theory and have emerged as the state-of-the-art error-correcting code in the short-to-medium block length regime.
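
For reference, the polarization kernel in question is the 2x2 binary matrix F = [[1, 0], [1, 1]], and a length-N polar codeword is u·F^(⊗log2 N) over GF(2) (bit-reversal permutation omitted here). A minimal NumPy sketch, with an assumed frozen-bit pattern for illustration:

```python
import numpy as np

F = np.array([[1, 0], [1, 1]], dtype=np.uint8)   # Arikan's 2x2 kernel

def polar_generator(n):
    """n-fold Kronecker power of F: the length-2^n polar transform."""
    G = F
    for _ in range(n - 1):
        G = np.kron(G, F)
    return G

N = 8
G = polar_generator(3)
u = np.zeros(N, dtype=np.uint8)
u[[3, 5, 6, 7]] = [1, 0, 1, 1]   # info bits on the (assumed) reliable positions
x = u.dot(G) % 2                 # polar codeword; frozen positions stay zero
```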

Attention with Markov: A Framework for Principled Analysis of Transformers via Markov Chains

1 code implementation • 6 Feb 2024 • Ashok Vardhan Makkuva, Marco Bondaschi, Adway Girish, Alliot Nagle, Martin Jaggi, Hyeji Kim, Michael Gastpar

Inspired by the Markovianity of natural languages, we model the data as a Markovian source and utilize this framework to systematically study the interplay between the data-distributional properties, the transformer architecture, the learnt distribution, and the final model performance.
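
As a concrete instance of such a source, the sketch below samples binary sequences from a first-order Markov chain with assumed switching probabilities p and q; under this model the optimal next-token predictor depends only on the previous symbol, which is what makes the framework analytically tractable.

```python
import numpy as np

rng = np.random.default_rng(0)
p, q = 0.2, 0.3   # assumed switching probabilities P(1|0) and P(0|1)

def sample_chain(T):
    """Draw one binary sequence from a first-order Markov source."""
    x = [int(rng.integers(2))]
    for _ in range(T - 1):
        flip = p if x[-1] == 0 else q
        x.append(x[-1] ^ int(rng.random() < flip))
    return np.array(x)

batch = np.stack([sample_chain(64) for _ in range(32)])   # one training batch
```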

TinyTurbo: Efficient Turbo Decoders on Edge

1 code implementation30 Sep 2022 S Ashwin Hebbar, Rajesh K Mishra, Sravan Kumar Ankireddy, Ashok V Makkuva, Hyeji Kim, Pramod Viswanath

In this paper, we introduce a neural-augmented decoder for Turbo codes called TINYTURBO.
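
A rough sketch of the neural-augmentation idea (not the paper's exact parameterization): keep the classical component decoders and learn only a few scalar weights on the extrinsic log-likelihood ratios they exchange. Here `bcjr`, `pi`, and `pi_inv` are assumed callables for a max-log-MAP component decoder, the interleaver, and its inverse.

```python
import torch
import torch.nn as nn

class WeightedTurboIter(nn.Module):
    """One neural-augmented Turbo iteration (illustrative parameterization).
    Only the extrinsic scales are learned; the decoders stay classical."""
    def __init__(self):
        super().__init__()
        self.w = nn.Parameter(torch.ones(2))   # one weight per half-iteration

    def forward(self, llr_sys, llr_par1, llr_par2, prior, bcjr, pi, pi_inv):
        post1 = bcjr(llr_sys, llr_par1, prior)
        ext1 = self.w[0] * (post1 - prior - llr_sys)        # weighted extrinsic
        post2 = bcjr(pi(llr_sys), llr_par2, pi(ext1))
        ext2 = self.w[1] * (post2 - pi(ext1) - pi(llr_sys))
        return pi_inv(ext2), pi_inv(post2)                  # next prior, bit LLRs
```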

Interpreting Neural Min-Sum Decoders

1 code implementation • 21 May 2022 • Sravan Kumar Ankireddy, Hyeji Kim

In the decoding of linear block codes, it has been shown that noticeable reliability gains can be achieved by introducing learnable parameters into the Belief Propagation (BP) decoder.
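
The core of such decoders is an otherwise-classical min-sum update whose outgoing messages are scaled by learnable weights. A minimal PyTorch sketch of one weighted (normalized) check-node update, assuming nonzero incoming LLRs:

```python
import torch

def check_node_update(v2c, w):
    """One weighted min-sum check-node update.
    v2c: (dc,) incoming variable-to-check LLRs, assumed nonzero
    w:   learnable scalar (e.g. an nn.Parameter) scaling the output"""
    sgn = torch.sign(v2c)
    mag = v2c.abs()
    m1, i1 = mag.min(dim=0)                # smallest magnitude and its edge
    mag2 = mag.clone()
    mag2[i1] = float('inf')
    m2 = mag2.min()                        # second-smallest magnitude
    ext_min = torch.where(torch.arange(v2c.numel()) == i1, m2, m1)
    ext_sign = sgn.prod() / sgn            # product of the *other* edges' signs
    return w * ext_sign * ext_min          # (dc,) check-to-variable messages
```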

DeepIC: Coding for Interference Channels via Deep Learning

no code implementations • 13 Aug 2021 • Karl Chahine, Nanyang Ye, Hyeji Kim

Interestingly, it is shown that there exists an asymptotic scheme, called the Han-Kobayashi scheme, that performs better than time division (TD) and treating interference as noise (TIN).
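
For intuition, the per-user rates of the two baselines on a symmetric Gaussian interference channel y_i = x_i + a·x_j + z_i (unit noise; power and interference gain are assumed values) can be computed directly; the Han-Kobayashi scheme improves on both by splitting each message into common and private parts.

```python
import numpy as np

P, a2 = 10.0, 0.25   # transmit power and interference gain |a|^2 (assumed)

# treat interference as noise: interference simply raises the noise floor
R_tin = 0.5 * np.log2(1 + P / (1 + a2 * P))

# time division: each user is active half the time at double power
R_td = 0.25 * np.log2(1 + 2 * P)

print(f"TIN: {R_tin:.3f} bits/use,  TD: {R_td:.3f} bits/use")
```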

A Channel Coding Benchmark for Meta-Learning

1 code implementation15 Jul 2021 Rui Li, Ondrej Bohdal, Rajesh Mishra, Hyeji Kim, Da Li, Nicholas Lane, Timothy Hospedales

We use our MetaCC benchmark to study several aspects of meta-learning, including the impact of task distribution breadth and shift, which can be controlled in the coding problem.
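
A hypothetical sketch of what such a controllable task distribution can look like: each meta-learning task is a channel with its own SNR, so widening the sampling range broadens the task distribution, and disjoint train/test ranges induce distribution shift. The sampler below is illustrative, not the benchmark's actual API.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_task(snr_db_range=(-2.0, 8.0)):
    """Hypothetical MetaCC-style task: an AWGN channel with its own SNR."""
    snr_db = rng.uniform(*snr_db_range)
    sigma = 10.0 ** (-snr_db / 20.0)
    return lambda x: x + sigma * rng.normal(size=x.shape)

channel = sample_task()
y = channel(np.ones(16))   # received symbols for one adaptation batch
```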

Tasks: Meta-Learning

Neural Distributed Source Coding

no code implementations • 5 Jun 2021 • Jay Whang, Alliot Nagle, Anish Acharya, Hyeji Kim, Alexandros G. Dimakis

Distributed source coding (DSC) is the task of encoding an input in the absence of correlated side information that is only available to the decoder.
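
The classical (non-neural) version of this setup is syndrome-based binning: the encoder sends only a code syndrome, and the decoder combines it with its correlated side information. A toy sketch with the (7,4) Hamming code, assuming x and y disagree in at most one bit; the paper's neural method generalizes this idea, which is not shown here.

```python
import numpy as np

# parity-check matrix of the (7,4) Hamming code; column j is j in binary
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=np.uint8)

def dsc_encode(x):
    """Send only the 3-bit syndrome instead of the 7-bit source block."""
    return H.dot(x) % 2

def dsc_decode(syndrome, y):
    """Side info y (differs from x in at most one position) plus the
    syndrome pins down x: H(x ^ y) is the index of the disagreement."""
    e_syn = (syndrome + H.dot(y)) % 2
    pos = int(e_syn[0]) + 2 * int(e_syn[1]) + 4 * int(e_syn[2])
    x_hat = y.copy()
    if pos:
        x_hat[pos - 1] ^= 1
    return x_hat

x = np.array([1, 0, 1, 1, 0, 0, 1], dtype=np.uint8)
y = x.copy(); y[4] ^= 1                       # correlated side info
assert np.array_equal(dsc_decode(dsc_encode(x), y), x)
```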

Deepcode and Modulo-SK are Designed for Different Settings

no code implementations • 18 Aug 2020 • Hyeji Kim, Yihan Jiang, Sreeram Kannan, Sewoong Oh, Pramod Viswanath

Deepcode is designed and evaluated for the AWGN channel with (potentially delayed) uncoded output feedback.

HAPI: Hardware-Aware Progressive Inference

no code implementations • 10 Aug 2020 • Stefanos Laskaridis, Stylianos I. Venieris, Hyeji Kim, Nicholas D. Lane

Convolutional neural networks (CNNs) have recently become the state-of-the-art across a wide range of AI tasks.
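
A toy sketch of the progressive-inference pattern behind this line of work: attach an exit head to an intermediate stage and run the rest of the network only when the early prediction is unsure. HAPI additionally *searches* exit placement under hardware-aware latency objectives, none of which is shown here; the architecture below is purely illustrative and processes one sample at a time.

```python
import torch
import torch.nn as nn

class EarlyExitNet(nn.Module):
    """Toy early-exit CNN with a single intermediate classifier head."""
    def __init__(self, n_classes=10):
        super().__init__()
        self.stage1 = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1),
                                    nn.ReLU(), nn.AdaptiveAvgPool2d(4))
        self.exit1 = nn.Linear(16 * 4 * 4, n_classes)
        self.stage2 = nn.Sequential(nn.Conv2d(16, 32, 3, padding=1),
                                    nn.ReLU(), nn.AdaptiveAvgPool2d(1))
        self.exit2 = nn.Linear(32, n_classes)

    def forward(self, x, threshold=0.9):   # x: (1, 3, H, W), one sample
        h = self.stage1(x)
        p1 = self.exit1(h.flatten(1)).softmax(-1)
        if p1.max() >= threshold:          # confident enough: exit early
            return p1
        return self.exit2(self.stage2(h).flatten(1)).softmax(-1)
```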

BRP-NAS: Prediction-based NAS using GCNs

2 code implementations • NeurIPS 2020 • Łukasz Dudziak, Thomas Chau, Mohamed S. Abdelfattah, Royson Lee, Hyeji Kim, Nicholas D. Lane

Moreover, we investigate prediction quality on different metrics and show that the sample efficiency of predictor-based NAS can be improved by considering binary relations of models and an iterative data selection strategy.
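
A minimal sketch of the binary-relation idea: train the predictor on *pairs* of models with a ranking loss, so it only has to order architectures rather than regress their exact accuracy or latency. The paper's predictor is a GCN over the cell graph; the flat 32-dimensional encoding and MLP below are simplifying assumptions.

```python
import torch
import torch.nn as nn

ranker = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 1))
loss_fn = nn.MarginRankingLoss(margin=0.1)

def pairwise_step(enc_a, enc_b, a_better):
    """enc_*: (B, 32) encodings of two models; a_better: (B,) in {+1, -1}.
    Learning relations needs far fewer labeled samples than regression."""
    s_a = ranker(enc_a).squeeze(-1)
    s_b = ranker(enc_b).squeeze(-1)
    return loss_fn(s_a, s_b, a_better)   # push score(a) above score(b) when +1
```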

Tasks: Neural Architecture Search

Journey Towards Tiny Perceptual Super-Resolution

2 code implementations • ECCV 2020 • Royson Lee, Łukasz Dudziak, Mohamed Abdelfattah, Stylianos I. Venieris, Hyeji Kim, Hongkai Wen, Nicholas D. Lane

Recent works in single-image perceptual super-resolution (SR) have demonstrated unprecedented performance in generating realistic textures by means of deep convolutional networks.

Tasks: Neural Architecture Search, Super-Resolution

Best of Both Worlds: AutoML Codesign of a CNN and its Hardware Accelerator

no code implementations • 11 Feb 2020 • Mohamed S. Abdelfattah, Łukasz Dudziak, Thomas Chau, Royson Lee, Hyeji Kim, Nicholas D. Lane

We automate HW-CNN codesign using NAS by including parameters from both the CNN model and the HW accelerator, and we jointly search for the best model-accelerator pair that boosts accuracy and efficiency.
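
A hypothetical sketch of what "including parameters from both sides" can look like: one candidate fixes both the CNN hyperparameters and the accelerator configuration, so a single search scores the pair on accuracy and hardware efficiency together. All knob names and values below are assumptions for illustration.

```python
import random

SPACE = {
    "depth":      [8, 12, 16],        # CNN side
    "width_mult": [0.5, 0.75, 1.0],
    "pe_rows":    [4, 8, 16],         # accelerator side: PE array shape
    "buffer_kb":  [64, 128, 256],
}

def sample_candidate():
    """Draw one joint model-accelerator configuration."""
    return {k: random.choice(v) for k, v in SPACE.items()}

cand = sample_candidate()   # scored on accuracy *and* hardware efficiency
```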

Tasks: General Classification, Image Classification, +2

Turbo Autoencoder: Deep learning based channel codes for point-to-point communication channels

1 code implementation • NeurIPS 2019 • Yihan Jiang, Hyeji Kim, Himanshu Asnani, Sreeram Kannan, Sewoong Oh, Pramod Viswanath

Designing codes that combat the noise in a communication medium has remained a significant area of research in information theory as well as wireless communications.

DeepTurbo: Deep Turbo Decoder

1 code implementation • 6 Mar 2019 • Yihan Jiang, Hyeji Kim, Himanshu Asnani, Sreeram Kannan, Sewoong Oh, Pramod Viswanath

We focus on Turbo codes and propose DeepTurbo, a novel deep learning based architecture for Turbo decoding.

LEARN Codes: Inventing Low-latency Codes via Recurrent Neural Networks

1 code implementation30 Nov 2018 Yihan Jiang, Hyeji Kim, Himanshu Asnani, Sreeram Kannan, Sewoong Oh, Pramod Viswanath

Designing channel codes under low-latency constraints is one of the most demanding requirements in 5G standards.

Efficient Neural Network Compression

1 code implementation • CVPR 2019 • Hyeji Kim, Muhammad Umar Karim Khan, Chong-Min Kyung

The favorable accuracy-complexity trade-off, together with the extremely fast speed of our method, makes it suitable for neural network compression.

Tasks: Efficient Neural Network, Neural Network Compression

Deepcode: Feedback Codes via Deep Learning

1 code implementation • NeurIPS 2018 • Hyeji Kim, Yihan Jiang, Sreeram Kannan, Sewoong Oh, Pramod Viswanath

The design of codes for communicating reliably over a statistically well-defined channel is an important endeavor involving deep mathematical research and wide-ranging practical applications.

Automatic Rank Selection for High-Speed Convolutional Neural Network

no code implementations • 28 Jun 2018 • Hyeji Kim, Chong-Min Kyung

In this paper, we define rank selection as a combinatorial optimization problem and propose a methodology to minimize network complexity while maintaining the desired accuracy.
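
The object being selected is the rank of a low-rank factorization applied per layer. Below is a minimal sketch of the factorization step itself (truncated SVD of one weight matrix); the combinatorial problem the paper addresses is choosing `rank` for every layer jointly under an accuracy constraint, which is not shown here.

```python
import numpy as np

def low_rank_factorize(W, rank):
    """Replace one m x n layer with an m x r and an r x n layer."""
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * S[:rank]       # m x r, singular values folded in
    B = Vt[:rank]                    # r x n
    return A, B                      # W ≈ A @ B

W = np.random.randn(256, 512)
A, B = low_rank_factorize(W, rank=32)
# parameters: 256*512 = 131072  ->  256*32 + 32*512 = 24576
```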

Tasks: Combinatorial Optimization, Vocal Bursts Intensity Prediction

Communication Algorithms via Deep Learning

3 code implementations • ICLR 2018 • Hyeji Kim, Yihan Jiang, Ranvir Rana, Sreeram Kannan, Sewoong Oh, Pramod Viswanath

We show that creatively designed and trained RNN architectures can decode well-known sequential codes, such as convolutional and turbo codes, with close to optimal performance on the additive white Gaussian noise (AWGN) channel, which itself is achieved by breakthrough algorithms of our times (the Viterbi and BCJR decoders, representing dynamic programming and forward-backward algorithms).
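
For context, the classical baseline these RNN decoders are trained to match can be written compactly. Below is a toy hard-decision Viterbi decoder for the rate-1/2, constraint-length-3 convolutional code with generators (7,5) in octal; it is an illustrative sketch, not the paper's decoder.

```python
import numpy as np

def conv_encode(bits):
    """Rate-1/2 convolutional encoder, generators (7,5) octal."""
    s, out = 0, []                       # s = two previous input bits
    for b in bits:
        reg = (b << 2) | s
        out += [bin(reg & 0b111).count('1') % 2,   # g1 = 111
                bin(reg & 0b101).count('1') % 2]   # g2 = 101
        s = reg >> 1
    return out

def viterbi(rx):
    """Hard-decision Viterbi decoding with Hamming branch metrics."""
    INF = 10**9
    pm, paths = [0, INF, INF, INF], [[], [], [], []]
    for t in range(0, len(rx), 2):
        new_pm, new_paths = [INF] * 4, [None] * 4
        for s in range(4):
            if pm[s] >= INF:
                continue
            for b in (0, 1):
                reg = (b << 2) | s
                o1 = bin(reg & 0b111).count('1') % 2
                o2 = bin(reg & 0b101).count('1') % 2
                m = pm[s] + (o1 != rx[t]) + (o2 != rx[t + 1])
                ns = reg >> 1            # next state
                if m < new_pm[ns]:
                    new_pm[ns], new_paths[ns] = m, paths[s] + [b]
        pm, paths = new_pm, new_paths
    return paths[int(np.argmin(pm))]     # survivor with the best metric

msg = [1, 0, 1, 1, 0]
assert viterbi(conv_encode(msg)) == msg  # noiseless sanity check
```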
