108 papers with code • 0 benchmarks • 5 datasets
To build a high-quality open-domain chatbot, we introduce an effective curriculum-learning-based training process for PLATO-2.
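The snippet only names curriculum learning without spelling it out. The sketch below shows the general idea of staged training in pseudo-PyTorch: stages are ordered from an easier objective to a harder one, and each stage continues from the previous stage's weights. The stage structure, `train_with_curriculum`, and its helpers are hypothetical and not taken from the PLATO-2 codebase.

```python
# Hypothetical sketch of curriculum-style training (not the PLATO-2 code).
# Each stage supplies its own data, loss, and optimizer; later stages build
# on the parameters learned in earlier, easier stages.
def train_with_curriculum(model, stages):
    for stage in stages:                       # e.g. coarse response generation first,
        for batch in stage["data"]:            # then a harder, more diverse objective
            loss = stage["loss_fn"](model, batch)
            loss.backward()                    # assumes a PyTorch-style autograd API
            stage["optimizer"].step()
            stage["optimizer"].zero_grad()
    return model
```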
Machine translation systems based on deep neural networks are expensive to train.
Recent deep learning approaches to single image super-resolution have achieved impressive results in terms of traditional error measures and perceptual quality.
Ranked #9 on Image Super-Resolution on BSD100 - 4x upscaling
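The "traditional error measures" referenced above are typically PSNR and SSIM computed against the ground-truth high-resolution image. Below is a small, self-contained PSNR computation in NumPy; it is a generic illustration, not the evaluation script behind the BSD100 leaderboard entry.

```python
import numpy as np

def psnr(sr, hr, max_val=255.0):
    """Peak signal-to-noise ratio between a super-resolved image and its
    ground-truth high-resolution counterpart (higher is better)."""
    mse = np.mean((sr.astype(np.float64) - hr.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10((max_val ** 2) / mse)

# Toy usage with random uint8 images standing in for the SR output and HR target.
hr = np.random.randint(0, 256, (128, 128, 3), dtype=np.uint8)
sr = np.clip(hr.astype(np.int16) + np.random.randint(-5, 6, hr.shape), 0, 255).astype(np.uint8)
print(f"PSNR: {psnr(sr, hr):.2f} dB")
```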
In this work, we propose CARLS, a novel framework for augmenting the capacity of existing deep learning frameworks by enabling multiple components -- model trainers, knowledge makers and knowledge banks -- to concertedly work together in an asynchronous fashion across hardware platforms.
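As a rough single-process analogue of the asynchronous setup described above, the sketch below has a "knowledge maker" thread filling a shared "knowledge bank" queue while a "model trainer" thread consumes from it. This is not the CARLS API; all names and payloads are illustrative.

```python
import queue
import threading
import time

# Conceptual illustration only: a knowledge maker produces auxiliary signals
# (e.g. neighbour embeddings) into a shared knowledge bank while the model
# trainer consumes whatever is currently available, so neither blocks the other.
knowledge_bank = queue.Queue(maxsize=100)

def knowledge_maker():
    for step in range(50):
        knowledge_bank.put({"step": step, "embedding": [0.0] * 8})  # placeholder payload
        time.sleep(0.01)  # stands in for expensive knowledge computation

def model_trainer():
    for _ in range(50):
        item = knowledge_bank.get()
        # ... use item["embedding"] as an extra training signal ...
        knowledge_bank.task_done()

maker = threading.Thread(target=knowledge_maker)
trainer = threading.Thread(target=model_trainer)
maker.start()
trainer.start()
maker.join()
trainer.join()
```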
In this paper, we propose an image super-resolution feedback network (SRFBN) to refine low-level representations with high-level information.
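A minimal PyTorch sketch of the feedback idea, assuming the high-level features produced at the previous iteration are fed back, concatenated with the low-level features, and fused before further processing. Channel sizes, layer choices, and the `FeedbackBlock` name are illustrative, not the SRFBN implementation.

```python
import torch
import torch.nn as nn

class FeedbackBlock(nn.Module):
    """Recurrent refinement block: the hidden (high-level) state from the
    previous step is concatenated with the low-level features so later
    iterations can correct earlier representations."""
    def __init__(self, channels=64):
        super().__init__()
        self.fuse = nn.Conv2d(2 * channels, channels, kernel_size=1)
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.PReLU(),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )

    def forward(self, low_level, hidden):
        x = self.fuse(torch.cat([low_level, hidden], dim=1))  # feed high-level state back in
        return self.body(x)

# Unroll a few feedback steps over the same low-level features.
block = FeedbackBlock()
low = torch.randn(1, 64, 32, 32)
hidden = torch.zeros_like(low)
for _ in range(4):
    hidden = block(low, hidden)
```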
Common nonlinear activation functions used in neural networks can cause training difficulties due to the saturation behavior of the activation function, which may hide dependencies that are not visible to vanilla SGD (using first-order gradients only).
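The saturation effect can be seen directly from the first-order gradient of a squashing nonlinearity such as tanh, which collapses toward zero for large pre-activations; the short NumPy check below only illustrates that numerical behavior.

```python
import numpy as np

# Gradient of tanh: d/dx tanh(x) = 1 - tanh(x)^2. Once the pre-activation is
# pushed into the saturated regime, the first-order gradient that vanilla SGD
# relies on is effectively zero, so the unit stops receiving a learning signal.
for x in [0.0, 2.0, 5.0, 10.0]:
    grad = 1.0 - np.tanh(x) ** 2
    print(f"x = {x:5.1f}   d tanh/dx = {grad:.2e}")
```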