Search Results for author: Junyoung Chung

Found 9 papers, 6 papers with code

Step-unrolled Denoising Autoencoders for Text Generation

1 code implementation • 13 Dec 2021 • Nikolay Savinov, Junyoung Chung, Mikolaj Binkowski, Erich Elsen, Aaron van den Oord

In this paper we propose a new generative model of text, Step-unrolled Denoising Autoencoder (SUNDAE), that does not rely on autoregressive models.
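The snippet above describes generating text without an autoregressive model, by iteratively denoising a corrupted sequence. The toy sketch below illustrates only the general shape of such a sampling loop, not the paper's actual model: `denoise_step` is a hypothetical stand-in for a learned denoiser, and `TARGET` is an invented placeholder for the sequence the denoiser would converge to.

```python
import random

# Toy illustration of non-autoregressive iterative denoising (NOT the
# paper's model). `denoise_step` stands in for a learned network that
# predicts a less-corrupted sequence from the current one; here it is a
# hypothetical stub that moves each position toward a fixed target.
VOCAB = ["the", "cat", "sat", "on", "mat", "<pad>"]
TARGET = ["the", "cat", "sat", "on", "mat"]  # placeholder "clean" sequence

def denoise_step(tokens, rng):
    """One unrolled denoising step: update every position independently."""
    return [TARGET[i] if rng.random() < 0.5 else t
            for i, t in enumerate(tokens)]

def sample(length=5, steps=10, seed=0):
    rng = random.Random(seed)
    tokens = [rng.choice(VOCAB) for _ in range(length)]  # start from noise
    for _ in range(steps):  # all positions refined in parallel each step
        tokens = denoise_step(tokens, rng)
    return tokens
```

Unlike left-to-right decoding, every position is updated in parallel at each step, which is what lets this family of models avoid autoregressive generation.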

Denoising • Language Modelling • +2

Hierarchical Multiscale Recurrent Neural Networks

3 code implementations • 6 Sep 2016 • Junyoung Chung, Sungjin Ahn, Yoshua Bengio

Multiscale recurrent neural networks have been considered a promising approach to resolving this issue, yet there has been little empirical evidence that this type of model can actually capture temporal dependencies by discovering the latent hierarchical structure of the sequence.

Language Modelling

A Character-Level Decoder without Explicit Segmentation for Neural Machine Translation

2 code implementations • ACL 2016 • Junyoung Chung, Kyunghyun Cho, Yoshua Bengio

Existing machine translation systems, whether phrase-based or neural, have relied almost exclusively on word-level modelling with explicit segmentation.

Machine Translation • Translation

Iterative Refinement of the Approximate Posterior for Directed Belief Networks

1 code implementation • NeurIPS 2016 • R. Devon Hjelm, Kyunghyun Cho, Junyoung Chung, Russ Salakhutdinov, Vince Calhoun, Nebojsa Jojic

Variational methods that rely on a recognition network to approximate the posterior of directed graphical models offer better inference and learning than previous methods.

Detecting Interrogative Utterances with Recurrent Neural Networks

no code implementations • 3 Nov 2015 • Junyoung Chung, Jacob Devlin, Hany Hassan Awadalla

In this paper, we explore different neural network architectures that can predict if a speaker of a given utterance is asking a question or making a statement.
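The task in this snippet, scoring an utterance as question vs. statement, can be framed as binary classification over a recurrently encoded sequence. The sketch below is a hypothetical, untrained setup assuming a vanilla RNN and a sigmoid readout; the vocabulary, sizes, and weights are all invented for illustration and do not come from the paper.

```python
import numpy as np

# Hypothetical sketch of the task setup (NOT the paper's trained model):
# a plain RNN reads an utterance token-by-token, and a sigmoid over the
# final hidden state scores "question" vs. "statement".
rng = np.random.default_rng(0)
VOCAB = {"is": 0, "this": 1, "a": 2, "question": 3, "it": 4, "works": 5}
D, H = 8, 16  # embedding and hidden sizes (arbitrary)
E = rng.normal(0, 0.1, (len(VOCAB), D))   # token embeddings
Wx = rng.normal(0, 0.1, (H, D))           # input-to-hidden weights
Wh = rng.normal(0, 0.1, (H, H))           # hidden-to-hidden weights
w = rng.normal(0, 0.1, H)                 # classifier weights

def is_question_score(tokens):
    """Return a (0, 1) score from a randomly initialised, untrained RNN."""
    h = np.zeros(H)
    for t in tokens:
        h = np.tanh(Wx @ E[VOCAB[t]] + Wh @ h)  # vanilla RNN step
    return 1 / (1 + np.exp(-w @ h))             # sigmoid readout
```

In practice the weights would be learned from labelled utterances; the point here is only the sequence-in, single-score-out shape of the problem.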

General Classification

A Recurrent Latent Variable Model for Sequential Data

5 code implementations • NeurIPS 2015 • Junyoung Chung, Kyle Kastner, Laurent Dinh, Kratarth Goel, Aaron Courville, Yoshua Bengio

In this paper, we explore the inclusion of latent random variables into the dynamic hidden state of a recurrent neural network (RNN) by combining elements of the variational autoencoder.
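The idea sketched in this snippet, a latent random variable inside the recurrent state update, can be illustrated with a single simplified step. The code below is a rough sketch under assumptions, not the paper's architecture: the latent is drawn from a prior conditioned on the previous hidden state via the VAE reparameterisation trick, and all weights are random placeholders.

```python
import numpy as np

# Rough sketch of one VRNN-style step (simplified; NOT the paper's exact
# model): a latent z_t is drawn from a prior conditioned on the previous
# hidden state h_{t-1}, then z_t and the input x_t jointly drive the RNN
# update. All weights are random placeholders, not a trained model.
rng = np.random.default_rng(1)
X, Z, H = 4, 3, 8  # input, latent, hidden sizes (arbitrary)
W_prior = rng.normal(0, 0.1, (2 * Z, H))   # h_{t-1} -> prior mean/log-var
W_h = rng.normal(0, 0.1, (H, X + Z + H))   # recurrence weights

def vrnn_step(x, h):
    """One step: sample z_t from the h-conditioned prior, then update h."""
    mu, logvar = np.split(W_prior @ h, 2)               # p(z_t | h_{t-1})
    z = mu + np.exp(0.5 * logvar) * rng.normal(size=Z)  # reparameterised draw
    h_new = np.tanh(W_h @ np.concatenate([x, z, h]))    # h_t = f(x_t, z_t, h_{t-1})
    return z, h_new
```

Because z_t is stochastic, repeated runs from the same input produce different hidden trajectories, which is what lets such a model capture the variability in sequential data.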

Gated Feedback Recurrent Neural Networks

no code implementations • 9 Feb 2015 • Junyoung Chung, Caglar Gulcehre, Kyunghyun Cho, Yoshua Bengio

In this work, we propose a novel recurrent neural network (RNN) architecture.

Language Modelling
