Meta-learning considers the problem of learning an efficient learning process that can leverage its past experience to accurately solve new tasks.
Fast contextual adaptation has been shown to improve Automatic Speech Recognition (ASR) of rare words, and when combined with on-device personalized training it can yield even better recognition results.
Humans can quickly associate stimuli to solve problems in novel contexts.
We meta-train a transformer model on this distribution of tasks using a recent meta-learning framework.
Therefore, LoAIR is a step toward bridging the gap between econometrics, statistics, and machine learning by improving the predictive ability of linear regression without sacrificing its interpretability.
We also develop task embeddings that can be used to predict the most transferable source tasks for a given target task, and we validate their effectiveness in experiments controlled for source and target data size.
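As a minimal sketch of how task embeddings might be used to predict transferable source tasks, the snippet below ranks candidate sources by cosine similarity to the target task's embedding. The similarity-based ranking and the embedding values are illustrative assumptions; the actual embeddings and transferability predictor are learned as described in the paper.

```python
import numpy as np

def rank_sources(target_emb, source_embs):
    """Rank candidate source tasks by cosine similarity of their task
    embeddings to the target task embedding (a simple proxy for
    transferability, used here only for illustration)."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    sims = {name: cos(target_emb, emb) for name, emb in source_embs.items()}
    return sorted(sims, key=sims.get, reverse=True)

# Hypothetical embeddings: source task "A" is closest to the target.
ranking = rank_sources(
    np.array([1.0, 0.0]),
    {"A": np.array([0.9, 0.1]), "B": np.array([0.1, 0.9])},
)
```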
We harness and extend a recently proposed machine reading comprehension (MRC) model to query for entity states, since these states are generally communicated in spans of text and MRC models perform well in extracting entity-centric spans.
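The span-extraction readout that MRC models rely on can be sketched as follows: score every candidate (start, end) pair and return the highest-scoring span as the entity state. The additive start/end scoring and the toy scores are assumptions for illustration; a real MRC head produces these scores from learned token representations.

```python
def extract_span(start_scores, end_scores, tokens, max_len=5):
    """Pick the highest-scoring (start, end) span with start <= end,
    mimicking how an MRC head reads an entity state out of text."""
    best, best_score = (0, 0), float("-inf")
    for s in range(len(tokens)):
        for e in range(s, min(s + max_len, len(tokens))):
            score = start_scores[s] + end_scores[e]
            if score > best_score:
                best, best_score = (s, e), score
    s, e = best
    return " ".join(tokens[s:e + 1])

# Toy query "where is the water?" over a procedural-text sentence;
# the scores below are hand-picked to highlight "the beaker".
tokens = ["the", "water", "is", "in", "the", "beaker"]
state = extract_span([0, 0, 0, 0, 1, 0], [0, 0, 0, 0, 0, 1], tokens)
```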
Sentence simplification aims to reduce the complexity of a sentence's content and structure, making it easier for human readers to interpret and for downstream NLP applications to process.
We describe a mechanism by which artificial neural networks can learn rapid adaptation - the ability to adapt on the fly, with little data, to new tasks - that we call conditionally shifted neurons.
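The idea of conditionally shifted neurons can be sketched as a layer whose per-neuron activation shifts are supplied by task-specific conditioning information, so the network adapts without updating its weights. The dimensions, random weights, and the fact that the shift is simply passed in are assumptions for illustration; in the paper the shifts are computed from a meta-learned memory.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(x, W, shift):
    # Conditionally shifted layer: W is fixed across tasks, while
    # `shift` carries task-conditioned information that moves each
    # neuron's pre-activation, enabling rapid adaptation.
    return np.maximum(W @ x + shift, 0.0)  # ReLU

d_in, d_out = 8, 4            # hypothetical layer sizes
W = rng.normal(size=(d_out, d_in))
x = rng.normal(size=d_in)

base = forward(x, W, np.zeros(d_out))           # unadapted output
adapted = forward(x, W, rng.normal(size=d_out)) # task-shifted output
```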
We examine the impact of test-set question difficulty to determine whether there is a relationship between difficulty and model performance.
NTI constructs a full n-ary tree by processing the input text with its node function in a bottom-up fashion.
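The bottom-up construction can be sketched as repeatedly grouping every n sibling representations and composing them with a node function until a single root remains. Averaging is a stand-in assumption here; NTI's actual node function is a learned module, and the zero-padding scheme is likewise illustrative.

```python
import numpy as np

def node_fn(children):
    # Illustrative node function: average the child vectors.
    # In NTI this is a learned composition function.
    return np.mean(children, axis=0)

def build_nary_tree(leaves, n=2):
    """Compose leaf vectors bottom-up into a full n-ary tree and
    return the root representation."""
    level = list(leaves)
    while len(level) > 1:
        # Pad with zero vectors so the level splits evenly into groups of n.
        while len(level) % n != 0:
            level.append(np.zeros_like(level[0]))
        level = [node_fn(np.stack(level[i:i + n]))
                 for i in range(0, len(level), n)]
    return level[0]

# Four toy leaf vectors composed pairwise into one root.
root = build_nary_tree([np.full(4, float(i)) for i in range(4)], n=2)
```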