Search Results for author: Qing Sun

Found 11 papers, 3 papers with code

Learning to Revise References for Faithful Summarization

1 code implementation · 13 Apr 2022 · Griffin Adams, Han-Chin Shing, Qing Sun, Christopher Winestock, Kathleen McKeown, Noémie Elhadad

We extract a small corpus from a noisy source, the Electronic Health Record (EHR), for the task of summarizing a hospital admission from multiple notes.

Contrastive Learning

Rethinking Rehearsal in Lifelong Learning: Does An Example Contribute the Plasticity or Stability?

no code implementations · 29 Sep 2021 · Qing Sun, Fan Lyu, Fanhua Shang, Wei Feng, Liang Wan

Traditionally, the primary goal of lifelong learning (LL) is to strike a trade-off between stability (remembering past tasks) and plasticity (adapting to new tasks).

Multi-Task Learning

Amortized Posterior on Latent Variables in Gaussian Process

no code implementations · 29 Sep 2021 · Qing Sun

Deep neural networks have achieved impressive performance across a variety of domains.

Neural Entity Recognition with Gazetteer based Fusion

no code implementations · Findings (ACL) 2021 · Qing Sun, Parminder Bhatia

Our gazetteer based fusion model is data efficient, achieving +1.7 micro-F1 on the i2b2 dataset using 20% of the training data and +4.7 micro-F1 on novel entity mentions never seen during training.

Named Entity Recognition (NER)

An Empirical Investigation Towards Efficient Multi-Domain Language Model Pre-training

1 code implementation · EMNLP 2020 · Kristjan Arumae, Qing Sun, Parminder Bhatia

However, in order to achieve state-of-the-art performance on out-of-domain tasks such as clinical named entity recognition and relation extraction, additional in-domain pre-training is required.

Language Modelling · Named Entity Recognition +3

Learn to Talk via Proactive Knowledge Transfer

no code implementations · 23 Aug 2020 · Qing Sun, James Cross

In this paper, we provide an in-depth analysis of KL-divergence minimization in the Forward and Backward orders, showing that the learner is reinforced via on-policy learning in the Backward order.

Knowledge Distillation · Machine Translation +2
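For orientation, the two orders of KL minimization between a teacher distribution p and a student q_θ can be written as below (the notation is mine, not necessarily the paper's); the Backward order takes its expectation under the student itself, which is where the on-policy view comes from.

```latex
% Forward order: expectation under the teacher p (the student is trained off-policy)
\mathrm{KL}(p \,\|\, q_\theta) \;=\; \mathbb{E}_{y \sim p}\big[\log p(y) - \log q_\theta(y)\big]

% Backward order: expectation under the student q_\theta (on-policy for the student)
\mathrm{KL}(q_\theta \,\|\, p) \;=\; \mathbb{E}_{y \sim q_\theta}\big[\log q_\theta(y) - \log p(y)\big]
```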

Proactive Sequence Generator via Knowledge Acquisition

no code implementations · 25 Sep 2019 · Qing Sun, James Cross, Dmitriy Genzel

Sequence-to-sequence models such as transformers, which are now used in a wide variety of NLP tasks, typically need very high capacity to perform well.

Knowledge Distillation

Bidirectional Beam Search: Forward-Backward Inference in Neural Sequence Models for Fill-in-the-Blank Image Captioning

no code implementations · CVPR 2017 · Qing Sun, Stefan Lee, Dhruv Batra

We develop the first approximate inference algorithm for 1-Best (and M-Best) decoding in bidirectional neural sequence models by extending Beam Search (BS) to reason about both forward and backward time dependencies.

Image Captioning

Diverse Beam Search: Decoding Diverse Solutions from Neural Sequence Models

20 code implementations · 7 Oct 2016 · Ashwin K. Vijayakumar, Michael Cogswell, Ramprasath R. Selvaraju, Qing Sun, Stefan Lee, David Crandall, Dhruv Batra

We observe that our method consistently outperforms beam search (BS) and previously proposed techniques for diverse decoding from neural sequence models.

Image Captioning · Machine Translation +3
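As a rough illustration of the group-wise decoding idea, here is a minimal Diverse Beam Search sketch in plain Python. The `step_log_probs` scorer, the toy vocabulary, and the Hamming-style penalty are stand-ins of my own; this is not the authors' implementation, which operates on real neural sequence models.

```python
import math
from typing import Callable, List, Tuple


def diverse_beam_search(
    step_log_probs: Callable[[List[int]], List[float]],  # hypothetical scorer: prefix -> log-probs over vocab
    vocab_size: int,
    num_groups: int = 2,
    beams_per_group: int = 2,
    max_len: int = 10,
    diversity_strength: float = 0.5,
    bos_id: int = 0,
    eos_id: int = 1,
) -> List[Tuple[List[int], float]]:
    """Group-wise beam search with a Hamming-style diversity penalty."""
    # Each group keeps its own small beam; group 0 behaves like vanilla beam search.
    groups = [[([bos_id], 0.0)] for _ in range(num_groups)]

    for _ in range(max_len):
        chosen_this_step: List[int] = []  # tokens picked by earlier groups at this time step
        for g in range(num_groups):
            candidates = []
            for seq, score in groups[g]:
                if seq[-1] == eos_id:  # carry finished hypotheses over unchanged
                    candidates.append((seq, score))
                    continue
                log_probs = step_log_probs(seq)
                for tok in range(vocab_size):
                    # Penalise tokens already chosen by groups 0..g-1 at this time step.
                    penalty = diversity_strength * chosen_this_step.count(tok)
                    candidates.append((seq + [tok], score + log_probs[tok] - penalty))
            groups[g] = sorted(candidates, key=lambda c: c[1], reverse=True)[:beams_per_group]
            chosen_this_step.extend(s[-1] for s, _ in groups[g] if s[-1] != eos_id)

    # Flatten all groups and return hypotheses sorted by total score.
    return sorted((h for grp in groups for h in grp), key=lambda h: h[1], reverse=True)


if __name__ == "__main__":
    def uniform(prefix: List[int]) -> List[float]:
        # Toy "model": a uniform distribution over a 5-token vocabulary.
        return [math.log(1.0 / 5)] * 5

    for seq, score in diverse_beam_search(uniform, vocab_size=5, max_len=4):
        print(seq, round(score, 3))
```

Group 0 decodes like ordinary beam search; each later group is pushed away from tokens already chosen at the same time step, which is what spreads the groups over different solutions.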

SubmodBoxes: Near-Optimal Search for a Set of Diverse Object Proposals

no code implementations · NeurIPS 2015 · Qing Sun, Dhruv Batra

This paper formulates the search for a set of bounding boxes (as needed in object proposal generation) as a monotone submodular maximization problem over the space of all possible bounding boxes in an image.

Object Proposal Generation
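To make the "monotone submodular maximization" framing concrete, here is a small greedy sketch. The facility-location-style coverage objective over IoU is an illustrative stand-in of my own, not the paper's actual objective; the point is the marginal-gain greedy loop, which carries the usual (1 - 1/e) guarantee for monotone submodular objectives under a cardinality constraint.

```python
from typing import List, Sequence, Tuple

Box = Tuple[float, float, float, float]  # (x1, y1, x2, y2)


def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)

    def area(r: Box) -> float:
        return (r[2] - r[0]) * (r[3] - r[1])

    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0


def coverage(selected: Sequence[int], candidates: Sequence[Box]) -> float:
    """Illustrative monotone submodular objective: how well the selected boxes
    cover the candidate pool, measured by each candidate's best IoU to the set."""
    if not selected:
        return 0.0
    return sum(max(iou(c, candidates[s]) for s in selected) for c in candidates)


def greedy_select(candidates: Sequence[Box], k: int) -> List[int]:
    """Greedy maximization under a cardinality constraint: repeatedly add the
    box with the largest marginal gain (the classic (1 - 1/e) approximation)."""
    selected: List[int] = []
    remaining = set(range(len(candidates)))
    for _ in range(min(k, len(candidates))):
        base = coverage(selected, candidates)
        best = max(remaining, key=lambda i: coverage(selected + [i], candidates) - base)
        selected.append(best)
        remaining.remove(best)
    return selected


if __name__ == "__main__":
    boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (50, 50, 60, 60), (52, 52, 62, 62)]
    print(greedy_select(boxes, k=2))  # picks one box from each spatial cluster, e.g. [0, 2]
```

Swapping in a different monotone submodular objective leaves the greedy loop itself unchanged; only the marginal-gain computation differs.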
