no code implementations • 19 Nov 2019 • Omar U. Florez, Erik Mueller
Our proposed solution is simple: we train a controller to execute an optimal sequence of read and write operations on an external memory, with the goal of leveraging diverse activations from the past to provide accurate predictions.
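The controller described above can be sketched minimally as attention-based reads and writes over an external memory, so that past activations remain addressable at prediction time. This is an illustrative sketch only; the class and parameter names (`MemoryController`, `W_key`, `W_write`, slot count, etc.) are assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class MemoryController:
    """Hypothetical sketch: a controller that writes activations into an
    external memory and reads them back via content-based attention."""

    def __init__(self, slots, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.memory = np.zeros((slots, dim))
        # Controller parameters: project an activation to a read key
        # and to a vector to be written into memory.
        self.W_key = rng.normal(scale=0.1, size=(dim, dim))
        self.W_write = rng.normal(scale=0.1, size=(dim, dim))

    def read(self, h):
        # Content-based addressing: soft attention over memory slots.
        key = h @ self.W_key
        weights = softmax(self.memory @ key)
        return weights @ self.memory  # convex combination of slots

    def write(self, h):
        # Write a transformed activation into the most relevant slot.
        key = h @ self.W_key
        slot = int(softmax(self.memory @ key).argmax())
        self.memory[slot] = h @ self.W_write

    def step(self, h):
        # One controller step: store the current activation, then read
        # a summary of past activations to augment the prediction.
        self.write(h)
        return np.concatenate([h, self.read(h)])
```

For example, `MemoryController(slots=4, dim=8).step(h)` returns the current activation concatenated with a memory readout, which a downstream predictor could consume.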
no code implementations • 19 Nov 2019 • Omar U. Florez, Erik Mueller
By storing KB embeddings in a memory component, these models can learn meaningful representations that are grounded in external knowledge.
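Grounding a representation in stored KB embeddings can be sketched as soft attention over the memory of entity vectors. The function name `ground_to_kb`, the similarity measure, and the mixing weight are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def ground_to_kb(query, kb_embeddings):
    """Hypothetical sketch: retrieve a knowledge-grounded representation
    by cosine-similarity attention over KB embeddings held in memory."""
    sims = kb_embeddings @ query / (
        np.linalg.norm(kb_embeddings, axis=1) * np.linalg.norm(query) + 1e-9)
    weights = np.exp(sims - sims.max())
    weights /= weights.sum()
    # Mix the query with the retrieved knowledge vector (0.5 is an
    # arbitrary interpolation weight for illustration).
    return 0.5 * query + 0.5 * (weights @ kb_embeddings)
```

A query vector close to one stored entity is pulled toward that entity's embedding, which is one simple way a memory of KB embeddings can ground a learned representation.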
no code implementations • 13 Jun 2019 • Chris Larson, Tarek Lahlou, Diana Mingels, Zachary Kulis, Erik Mueller
While techniques exist for desensitizing features to common noise patterns produced by Speech-to-Text (STT) and Text-to-Speech (TTS) systems, the question remains of how best to apply state-of-the-art language models (which capture rich semantic features but are trained only on written text) to inputs containing ASR errors.