1 code implementation • 6 May 2021 • Tanner Bohn, Charles X. Ling
We present HARE, a new task where reader feedback is used to optimize document summaries for personal interest during the normal flow of reading.
no code implementations • 1 May 2021 • Charles X. Ling, Tanner Bohn
Thus, while our framework remains conceptual and our experimental results are certainly not state of the art, we hope that this unified lifelong learning framework inspires new work toward large-scale experiments and a broader understanding of human learning in general.
1 code implementation • COLING 2020 • Tanner Bohn, Charles X. Ling
To advance understanding of how to engage readers, we advocate the novel task of automatic pull quote selection.
no code implementations • 21 Nov 2019 • Charles X. Ling, Tanner Bohn
Humans can learn a variety of concepts and skills incrementally over the course of their lives while exhibiting many desirable properties, such as continual learning without forgetting, forward transfer and backward transfer of knowledge, and learning a new concept or task with only a few examples.
no code implementations • 4 Oct 2019 • Tanner Bohn, Yining Hu, Charles X. Ling
We present an image preprocessing technique capable of improving the performance of few-shot classifiers on abstract visual reasoning tasks.
2 code implementations • NeurIPS 2018 • Jun Wang, Tanner Bohn, Charles Ling
In this study, we propose an efficient architecture named PeleeNet, which is built with conventional convolution instead of depthwise separable convolution.
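The design choice behind PeleeNet contrasts conventional convolution with the depthwise separable convolution used by architectures such as MobileNet. A minimal sketch of the parameter-count tradeoff between the two (function names and the example layer sizes are illustrative, not from the paper):

```python
def conv_params(k, c_in, c_out):
    """Parameter count of a conventional k x k convolution
    mapping c_in input channels to c_out output channels."""
    return k * k * c_in * c_out

def depthwise_separable_params(k, c_in, c_out):
    """Parameter count of a depthwise separable convolution:
    a k x k depthwise conv (one filter per input channel)
    followed by a 1 x 1 pointwise conv."""
    return k * k * c_in + c_in * c_out

# Hypothetical 3x3 layer mapping 128 -> 256 channels:
conventional = conv_params(3, 128, 256)              # 294,912 parameters
separable = depthwise_separable_params(3, 128, 256)  # 33,920 parameters
```

Depthwise separable convolution uses far fewer parameters, but its split structure can be less efficient on some hardware; PeleeNet's abstract motivates keeping conventional convolution for that reason.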
no code implementations • RANLP 2019 • Tanner Bohn, Yining Hu, Jinhang Zhang, Charles X. Ling
We present a novel and effective technique for performing text coherence tasks while facilitating deeper insights into the data.