Search Results for author: Kyuyeon Hwang

Found 12 papers, 2 papers with code

Quantized neural network design under weight capacity constraint

no code implementations · 19 Nov 2016 · Sungho Shin, Kyuyeon Hwang, Wonyong Sung

The complexity of deep neural network algorithms for hardware implementation can be lowered either by scaling down the number of units or by reducing the word-length of the weights.

Quantization
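The abstract above mentions reducing the word-length of weights. A minimal sketch of uniform fixed-point weight quantization, assuming a symmetric signed grid sized from the largest weight magnitude (a generic scheme for illustration, not necessarily the paper's exact method):

```python
import numpy as np

def quantize_weights(w, bits=4, step=None):
    """Uniformly quantize weights to a signed fixed-point grid.

    bits : total word-length (1 sign bit + bits-1 magnitude bits)
    step : quantization step; defaults to covering max |w|
    """
    levels = 2 ** (bits - 1) - 1              # symmetric signed range
    if step is None:
        step = np.max(np.abs(w)) / levels
    q = np.clip(np.round(w / step), -levels, levels)
    return q * step

w = np.array([0.8, -0.33, 0.05, -0.71])
wq = quantize_weights(w, bits=4)              # weights snapped to 15 levels
```

Small weights (here 0.05) collapse to zero at short word-lengths, which is why retraining after quantization, studied in the papers below, matters.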

Character-Level Language Modeling with Hierarchical Recurrent Neural Networks

no code implementations · 13 Sep 2016 · Kyuyeon Hwang, Wonyong Sung

Recurrent neural network (RNN) based character-level language models (CLMs) are, by nature, well suited to modeling out-of-vocabulary words.

Language Modelling · Speech Recognition +1

Generative Knowledge Transfer for Neural Language Models

no code implementations · 14 Aug 2016 · Sungho Shin, Kyuyeon Hwang, Wonyong Sung

In this paper, we propose a generative knowledge transfer technique that trains an RNN based language model (student network) using text and output probabilities generated from a previously trained RNN (teacher network).

Language Modelling · Text Generation +1
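The transfer described above trains the student on output probabilities produced by the teacher rather than on hard labels. A generic soft-target cross-entropy sketch of that idea (the temperature value is an illustrative assumption, not taken from the paper):

```python
import numpy as np

def soft_target_loss(student_logits, teacher_probs, temperature=2.0):
    """Cross-entropy between teacher soft targets and the student's
    temperature-softened softmax, averaged over the batch."""
    z = student_logits / temperature
    z = z - z.max(axis=1, keepdims=True)          # numerical stability
    log_p = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -(teacher_probs * log_p).sum(axis=1).mean()

teacher = np.array([[0.7, 0.2, 0.1]])             # teacher output probabilities
student = np.array([[2.0, 0.5, 0.1]])             # student logits
loss = soft_target_loss(student, teacher)
```

Because the targets are full distributions, the student also learns from the probability mass the teacher assigns to non-argmax symbols.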

Character-Level Incremental Speech Recognition with Recurrent Neural Networks

1 code implementation · 25 Jan 2016 · Kyuyeon Hwang, Wonyong Sung

The output values of the CTC-trained RNN are character-level probabilities, which are processed by beam search decoding.

Language Modelling · Speech Recognition +1
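The abstract above describes decoding character-level CTC output probabilities. A minimal best-path (greedy) CTC decode, shown instead of the paper's full beam search for brevity; the toy alphabet and frame probabilities are illustrative:

```python
import numpy as np

def ctc_greedy_decode(probs, alphabet, blank=0):
    """Best-path CTC decoding: take the argmax symbol per frame,
    collapse consecutive repeats, then drop blanks."""
    best = probs.argmax(axis=1)
    out, prev = [], blank
    for idx in best:
        if idx != prev and idx != blank:
            out.append(alphabet[idx])
        prev = idx
    return "".join(out)

# frames x symbols, symbol 0 is the CTC blank
alphabet = ["-", "c", "a", "t"]
probs = np.array([
    [0.1, 0.8, 0.05, 0.05],   # c
    [0.1, 0.7, 0.1, 0.1],     # c (repeat, collapsed)
    [0.9, 0.05, 0.03, 0.02],  # blank
    [0.1, 0.1, 0.7, 0.1],     # a
    [0.1, 0.1, 0.1, 0.7],     # t
])
print(ctc_greedy_decode(probs, alphabet))  # cat
```

Beam search improves on this by keeping multiple label prefixes per frame and, as in the paper, rescoring them with a language model.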

Online Keyword Spotting with a Character-Level Recurrent Neural Network

no code implementations · 30 Dec 2015 · Kyuyeon Hwang, Minjae Lee, Wonyong Sung

In this paper, we propose a context-aware keyword spotting model employing a character-level recurrent neural network (RNN) for spoken term detection in continuous speech.

General Classification · Keyword Spotting

Structured Pruning of Deep Convolutional Neural Networks

1 code implementation · 29 Dec 2015 · Sajid Anwar, Kyuyeon Hwang, Wonyong Sung

To decide the importance of network connections and paths, the proposed method uses a particle filtering approach.

Network Pruning
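The abstract above scores the importance of connections and paths with particle filtering. As a simpler stand-in for illustration, the sketch below prunes whole convolutional filters by L1 norm, which is a common structured-pruning proxy and not the paper's particle-filter criterion:

```python
import numpy as np

def prune_filters(conv_w, keep_ratio=0.5):
    """Structured pruning: drop whole output filters with the smallest
    L1 norms (a simple importance proxy; the paper instead scores
    pruning candidates with a particle filtering approach)."""
    norms = np.abs(conv_w).sum(axis=(1, 2, 3))      # one score per filter
    n_keep = max(1, int(len(norms) * keep_ratio))
    keep = np.sort(np.argsort(norms)[-n_keep:])     # surviving filter indices
    return conv_w[keep], keep

# 4 output filters, each 3 input channels x 3x3 kernel
rng = np.random.default_rng(0)
w = rng.standard_normal((4, 3, 3, 3))
pruned, kept = prune_filters(w, keep_ratio=0.5)
```

Removing entire filters, rather than individual weights, keeps the remaining tensors dense, which is what makes structured pruning hardware-friendly.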

Fixed-Point Performance Analysis of Recurrent Neural Networks

no code implementations · 4 Dec 2015 · Sungho Shin, Kyuyeon Hwang, Wonyong Sung

Recurrent neural networks have shown excellent performance in many applications; however, they require increased complexity in hardware- or software-based implementations.

Language Modelling · Quantization

Online Sequence Training of Recurrent Neural Networks with Connectionist Temporal Classification

no code implementations · 21 Nov 2015 · Kyuyeon Hwang, Wonyong Sung

Our online model achieves a 20.7% phoneme error rate (PER) on the very long input sequence generated by concatenating all 192 utterances in the TIMIT core test set.

General Classification · Rolling Shutter Correction +2

Resiliency of Deep Neural Networks under Quantization

no code implementations · 20 Nov 2015 · Wonyong Sung, Sungho Shin, Kyuyeon Hwang

In this work, the effects of retraining are analyzed for a feedforward deep neural network (FFDNN) and a convolutional neural network (CNN).

Quantization
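Retraining after quantization, as analyzed above, is commonly done by keeping a full-precision master copy of the weights, quantizing it in the forward pass, and applying gradient updates to the float copy. The single training step below is a generic straight-through-style sketch on a toy linear model, with an assumed bit-width and learning rate, not the paper's exact procedure:

```python
import numpy as np

def quantize(w, bits=3):
    """Snap weights to a symmetric signed fixed-point grid."""
    levels = 2 ** (bits - 1) - 1
    step = np.max(np.abs(w)) / levels
    return np.clip(np.round(w / step), -levels, levels) * step

# one retraining step for a linear model y = x @ w with squared loss
rng = np.random.default_rng(1)
w_float = rng.standard_normal(4) * 0.5        # full-precision master copy
x = rng.standard_normal((8, 4))
y = x @ np.array([0.5, -0.25, 0.0, 0.75])     # synthetic targets

w_q = quantize(w_float)                       # forward with quantized weights
err = x @ w_q - y
grad = x.T @ err / len(x)                     # gradient w.r.t. the weights
w_float -= 0.1 * grad                         # but update the float copy
```

Updating the float copy lets small gradient contributions accumulate across steps even when each one is below the quantization step size.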

Single stream parallelization of generalized LSTM-like RNNs on a GPU

no code implementations · 10 Mar 2015 · Kyuyeon Hwang, Wonyong Sung

Recurrent neural networks (RNNs) have shown outstanding performance on processing sequence data.
