no code implementations • 7 Feb 2025 • Heasung Kim, Taekyun Lee, Hyeji Kim, Gustavo de Veciana
Importance sampling, which involves sampling from a probability density function (PDF) proportional to the product of an importance weight function and a base PDF, is a powerful technique with applications in variance reduction, biased or customized sampling, data augmentation, and beyond.
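The product-density idea can be made concrete with a toy sketch: draw from a base PDF, weight each draw, then resample proportionally to the weights (sampling-importance-resampling). The weight function and base PDF below are illustrative choices, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Base PDF p: standard normal. Importance weight w(x) = exp(x),
# so the target density is proportional to w(x) * p(x).
def w(x):
    return np.exp(x)

# Sampling-importance-resampling: draw from the base PDF, then
# resample with probability proportional to the weights.
n = 100_000
x = rng.standard_normal(n)
probs = w(x) / w(x).sum()
resampled = rng.choice(x, size=n, p=probs)

# For this choice, w(x) * p(x) ∝ exp(-(x - 1)^2 / 2), i.e. N(1, 1),
# so the resampled mean should be close to 1.
print(resampled.mean())
```

Since exp(x) · exp(-x²/2) ∝ exp(-(x-1)²/2), the resampled points approximate draws from a unit-variance Gaussian centered at 1.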
1 code implementation • 23 Oct 2024 • Karl Chahine, Hyeji Kim
In steganography, selecting an optimal cover image, referred to as cover selection, is pivotal for effective message concealment.
no code implementations • 5 Sep 2024 • Taekyun Lee, Juseong Park, Hyeji Kim, Jeffrey G. Andrews
Deep neural network (DNN)-based algorithms are emerging as an important tool for many physical and MAC layer functions in future wireless communication systems, including for large multi-antenna channels.
no code implementations • 22 Jul 2024 • Alliot Nagle, Adway Girish, Marco Bondaschi, Michael Gastpar, Ashok Vardhan Makkuva, Hyeji Kim
We extend our experiments to a small natural language dataset to further confirm our findings on our synthetic dataset.
no code implementations • 21 Jul 2024 • Rajesh Mishra, Syed Jafar, Sriram Vishwanath, Hyeji Kim
We propose a novel deep learning-based approach to design the encoder and decoder functions that aim to maximize the sum rate of the interference channel for discrete constellations.

1 code implementation • 5 Jun 2024 • Ashok Vardhan Makkuva, Marco Bondaschi, Chanakya Ekbote, Adway Girish, Alliot Nagle, Hyeji Kim, Michael Gastpar
In this paper, we address this by focusing on first-order Markov chains and single-layer transformers, providing a comprehensive characterization of the learning dynamics in this context.
1 code implementation • 16 Mar 2024 • Sravan Kumar Ankireddy, Krishna Narayanan, Hyeji Kim
The design of reliable and efficient codes for channels with feedback remains a longstanding challenge in communication theory.
1 code implementation • 14 Feb 2024 • S Ashwin Hebbar, Sravan Kumar Ankireddy, Hyeji Kim, Sewoong Oh, Pramod Viswanath
Progress in designing channel codes has been driven by human ingenuity and, fittingly, has been sporadic.
1 code implementation • 6 Feb 2024 • Ashok Vardhan Makkuva, Marco Bondaschi, Adway Girish, Alliot Nagle, Martin Jaggi, Hyeji Kim, Michael Gastpar
Inspired by the Markovianity of natural languages, we model the data as a Markovian source and utilize this framework to systematically study the interplay between the data-distributional properties, the transformer architecture, the learnt distribution, and the final model performance.
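The Markovian-source framing can be illustrated with a toy binary first-order source; the transition matrix below is a hypothetical choice for illustration, and the snippet checks that the empirical transition frequencies recover it.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical binary first-order Markov source: row s gives
# P(next symbol | current symbol = s).
P = np.array([[0.9, 0.1],   # from state 0
              [0.2, 0.8]])  # from state 1

def sample_sequence(length, init=0):
    seq = [init]
    for _ in range(length - 1):
        seq.append(rng.choice(2, p=P[seq[-1]]))
    return np.array(seq)

seq = sample_sequence(10_000)

# Empirical transition estimate should be close to P.
est = np.array([[np.mean(seq[1:][seq[:-1] == s] == t) for t in (0, 1)]
                for s in (0, 1)])
print(est.round(2))
```

A learner that captures the source well should, in effect, recover these conditional probabilities from data.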
no code implementations • 19 Oct 2023 • Ashok Vardhan Makkuva, Marco Bondaschi, Thijs Vogels, Martin Jaggi, Hyeji Kim, Michael C. Gastpar
On the latter, we obtain a 50-64% improvement in perplexity over our baselines for noisy channels.
1 code implementation • NeurIPS 2023 • Po-han Li, Sravan Kumar Ankireddy, Ruihan Zhao, Hossein Nourkhiz Mahjoub, Ehsan Moradi-Pari, Ufuk Topcu, Sandeep Chinchali, Hyeji Kim
A decoder at the central node decompresses and passes the data to a pre-trained machine learning-based task to generate the final output.
1 code implementation • 30 Sep 2022 • S Ashwin Hebbar, Rajesh K Mishra, Sravan Kumar Ankireddy, Ashok V Makkuva, Hyeji Kim, Pramod Viswanath
In this paper, we introduce a neural-augmented decoder for Turbo codes called TinyTurbo.
1 code implementation • 21 May 2022 • Sravan Kumar Ankireddy, Hyeji Kim
In decoding linear block codes, it was shown that noticeable reliability gains can be achieved by introducing learnable parameters to the Belief Propagation (BP) decoder.
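One simple instance of this idea is the normalized min-sum check-node update, where a scalar weight scales the outgoing messages; in learned decoders such weights are trained rather than fixed. This is a minimal sketch with a hand-picked constant `alpha`, not the paper's exact parameterization.

```python
import numpy as np

def check_node_update(msgs, alpha=0.8):
    """Normalized min-sum check-node update.

    For each edge i, the outgoing message is alpha times the product
    of the signs of all *other* incoming messages times their minimum
    magnitude. alpha stands in for a learnable scaling parameter.
    """
    msgs = np.asarray(msgs, dtype=float)
    out = np.empty_like(msgs)
    for i in range(len(msgs)):
        others = np.delete(msgs, i)
        out[i] = alpha * np.prod(np.sign(others)) * np.abs(others).min()
    return out

print(check_node_update([2.0, -1.0, 3.0], alpha=1.0))
```

With `alpha = 1.0` this reduces to plain min-sum; making `alpha` (or per-edge weights) trainable is the kind of augmentation the line above refers to.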
no code implementations • 13 Aug 2021 • Karl Chahine, Nanyang Ye, Hyeji Kim
Interestingly, it is shown that there exists an asymptotic scheme, the Han-Kobayashi scheme, that performs better than TD and TIN.
1 code implementation • 15 Jul 2021 • Rui Li, Ondrej Bohdal, Rajesh Mishra, Hyeji Kim, Da Li, Nicholas Lane, Timothy Hospedales
We use our MetaCC benchmark to study several aspects of meta-learning, including the impact of task distribution breadth and shift, which can be controlled in the coding problem.
1 code implementation • 5 Jun 2021 • Jay Whang, Alliot Nagle, Anish Acharya, Hyeji Kim, Alexandros G. Dimakis
Distributed source coding (DSC) is the task of encoding an input in the absence of correlated side information that is only available to the decoder.
no code implementations • 18 Aug 2020 • Hyeji Kim, Yihan Jiang, Sreeram Kannan, Sewoong Oh, Pramod Viswanath
DeepCode is designed and evaluated for the AWGN channel with (potentially delayed) uncoded output feedback.
no code implementations • 10 Aug 2020 • Stefanos Laskaridis, Stylianos I. Venieris, Hyeji Kim, Nicholas D. Lane
Convolutional neural networks (CNNs) have recently become the state-of-the-art in a diversity of AI tasks.
2 code implementations • NeurIPS 2020 • Łukasz Dudziak, Thomas Chau, Mohamed S. Abdelfattah, Royson Lee, Hyeji Kim, Nicholas D. Lane
Moreover, we investigate prediction quality on different metrics and show that the sample efficiency of predictor-based NAS can be improved by considering binary relations of models and an iterative data selection strategy.
2 code implementations • ECCV 2020 • Royson Lee, Łukasz Dudziak, Mohamed Abdelfattah, Stylianos I. Venieris, Hyeji Kim, Hongkai Wen, Nicholas D. Lane
Recent works in single-image perceptual super-resolution (SR) have demonstrated unprecedented performance in generating realistic textures by means of deep convolutional networks.
1 code implementation • 20 Apr 2020 • Jung-Woo Ha, Kihyun Nam, Jingu Kang, Sang-Woo Lee, Sohee Yang, Hyunhoon Jung, Eunmi Kim, Hyeji Kim, Soojin Kim, Hyun Ah Kim, Kyoungtae Doh, Chan Kyu Lee, Nako Sung, Sunghun Kim
Automatic speech recognition (ASR) via call is essential for various applications, including AI for contact center (AICC) services.
no code implementations • 11 Feb 2020 • Mohamed S. Abdelfattah, Łukasz Dudziak, Thomas Chau, Royson Lee, Hyeji Kim, Nicholas D. Lane
We automate HW-CNN codesign using NAS by including parameters from both the CNN model and the HW accelerator, and we jointly search for the best model-accelerator pair that boosts accuracy and efficiency.
1 code implementation • NeurIPS 2019 • Yihan Jiang, Hyeji Kim, Himanshu Asnani, Sreeram Kannan, Sewoong Oh, Pramod Viswanath
Designing codes that combat the noise in a communication medium has remained a significant area of research in information theory as well as wireless communications.
1 code implementation • 6 Mar 2019 • Yihan Jiang, Hyeji Kim, Himanshu Asnani, Sreeram Kannan, Sewoong Oh, Pramod Viswanath
We focus on Turbo codes and propose DeepTurbo, a novel deep learning based architecture for Turbo decoding.
1 code implementation • 30 Nov 2018 • Yihan Jiang, Hyeji Kim, Himanshu Asnani, Sreeram Kannan, Sewoong Oh, Pramod Viswanath
Designing channel codes under low-latency constraints is one of the most demanding requirements in 5G standards.
1 code implementation • CVPR 2019 • Hyeji Kim, Muhammad Umar Karim Khan, Chong-Min Kyung
The better accuracy-complexity trade-off, as well as the extremely fast speed of our method, makes it suitable for neural network compression.
1 code implementation • NeurIPS 2018 • Hyeji Kim, Yihan Jiang, Sreeram Kannan, Sewoong Oh, Pramod Viswanath
The design of codes for communicating reliably over a statistically well defined channel is an important endeavor involving deep mathematical research and wide-ranging practical applications.
no code implementations • 28 Jun 2018 • Hyeji Kim, Chong-Min Kyung
In this paper, we define rank selection as a combinatorial optimization problem and propose a methodology to minimize network complexity while maintaining the desired accuracy.
3 code implementations • ICLR 2018 • Hyeji Kim, Yihan Jiang, Ranvir Rana, Sreeram Kannan, Sewoong Oh, Pramod Viswanath
We show that creatively designed and trained RNN architectures can decode well known sequential codes such as the convolutional and turbo codes with close to optimal performance on the additive white Gaussian noise (AWGN) channel, which itself is achieved by breakthrough algorithms of our times (Viterbi and BCJR decoders, representing dynamic programming and forward-backward algorithms).
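The classical baseline mentioned above, Viterbi decoding, is a dynamic program over the code trellis. Below is a minimal hard-decision sketch for the standard rate-1/2, constraint-length-3 convolutional code with octal generators (7, 5); the specific code is a common textbook example, not necessarily the one used in the paper.

```python
# Rate-1/2 convolutional code with octal generators (7, 5),
# constraint length 3: the state is the two most recent input bits.
G = [0b111, 0b101]

def conv_output(reg):
    """Two output bits (parities of the tapped register) for one step."""
    return [bin(reg & g).count("1") % 2 for g in G]

def encode(bits):
    state, out = 0, []
    for b in bits:
        reg = (b << 2) | state      # newest input bit at the MSB
        out += conv_output(reg)
        state = reg >> 1            # shift the register
    return out

def viterbi_decode(received, n_bits):
    # Survivor per state: (Hamming path metric, decoded bits so far).
    survivors = {0: (0, [])}
    for i in range(n_bits):
        r = received[2 * i: 2 * i + 2]
        new = {}
        for state, (cost, bits) in survivors.items():
            for b in (0, 1):
                reg = (b << 2) | state
                branch = sum(o != x for o, x in zip(conv_output(reg), r))
                cand = (cost + branch, bits + [b])
                ns = reg >> 1
                if ns not in new or cand[0] < new[ns][0]:
                    new[ns] = cand
        survivors = new
    return min(survivors.values(), key=lambda t: t[0])[1]
```

Because this code's free distance is 5, flipping a single received bit in the interior of the block still yields the original message after decoding.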
no code implementations • NeurIPS 2017 • Hyeji Kim, Weihao Gao, Sreeram Kannan, Sewoong Oh, Pramod Viswanath
Discovering a correlation from one variable to another variable is of fundamental scientific and practical interest.