Search Results for author: Trapit Bansal

Found 16 papers, 5 papers with code

Diverse Distributions of Self-Supervised Tasks for Meta-Learning in NLP

no code implementations · EMNLP 2021 · Trapit Bansal, Karthick Gunasekaran, Tong Wang, Tsendsuren Munkhdalai, Andrew McCallum

Meta-learning considers the problem of learning an efficient learning process that can leverage its past experience to accurately solve new tasks.

Few-Shot Learning
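
The entry above gives only a one-line definition of meta-learning. As a minimal, hedged illustration of the idea (a generic first-order MAML-style inner/outer loop on toy sine-wave regression, not this paper's method or its self-supervised task distributions), the sketch below trains an initialisation that adapts to a new task from a handful of gradient steps.

```python
# Hedged sketch: generic first-order MAML-style meta-learning on toy regression tasks.
# This is NOT the paper's method; the task, model, and hyperparameters are illustrative only.
import math, copy
import torch
import torch.nn as nn

def sample_task():
    """A 'task' is a random sine wave; adapting to it from a few points is the inner loop."""
    amp = torch.rand(1) * 4 + 0.1
    phase = torch.rand(1) * math.pi
    def sample(n=10):
        x = torch.rand(n, 1) * 10 - 5
        return x, amp * torch.sin(x + phase)
    return sample

model = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 1))
meta_opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(1000):
    meta_opt.zero_grad()
    for _ in range(4):                                  # meta-batch of tasks
        task = sample_task()
        fast = copy.deepcopy(model)                     # task-specific copy (inner loop)
        inner_opt = torch.optim.SGD(fast.parameters(), lr=1e-2)
        x_s, y_s = task()                               # support set: adapt on these
        for _ in range(5):
            inner_opt.zero_grad()
            loss_fn(fast(x_s), y_s).backward()
            inner_opt.step()
        x_q, y_q = task()                               # query set: evaluate the adapted copy
        inner_opt.zero_grad()
        loss_fn(fast(x_q), y_q).backward()
        # First-order meta-update: fold the adapted copy's gradients back into the meta-model.
        for p, fp in zip(model.parameters(), fast.parameters()):
            p.grad = fp.grad.clone() if p.grad is None else p.grad + fp.grad
    meta_opt.step()
```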

Unsupervised Pre-training for Biomedical Question Answering

no code implementations · 27 Sep 2020 · Vaishnavi Kommaraju, Karthick Gunasekaran, Kun Li, Trapit Bansal, Andrew McCallum, Ivana Williams, Ana-Maria Istrate

We explore the suitability of unsupervised representation learning methods on biomedical text -- BioBERT, SciBERT, and BioSentVec -- for biomedical question answering.

Question Answering · Representation Learning +1
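
As a hedged illustration of what "probing a biomedical encoder for question answering" can look like in practice, the snippet below runs an extractive-QA pipeline with the Hugging Face transformers library. The checkpoint name is an assumption chosen for illustration, not necessarily one used in the paper.

```python
# Hedged sketch: extractive QA with a pretrained biomedical encoder via the
# Hugging Face `transformers` pipeline API. The checkpoint below is an assumed
# BioBERT variant fine-tuned on SQuAD, not necessarily the paper's setup.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="dmis-lab/biobert-base-cased-v1.1-squad",  # assumed checkpoint name
)

context = (
    "ACE2 is a membrane protein that serves as the entry receptor "
    "for the SARS-CoV-2 virus in human cells."
)
result = qa(question="What receptor does SARS-CoV-2 use to enter cells?", context=context)
print(result)  # a dict containing the predicted answer span, e.g. 'ACE2'
```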

Learning to Few-Shot Learn Across Diverse Natural Language Classification Tasks

2 code implementations · COLING 2020 · Trapit Bansal, Rishikesh Jha, Andrew McCallum

LEOPARD is trained with the state-of-the-art transformer architecture and shows better generalization to tasks not seen at all during training, with as few as 4 examples per label.

Entity Typing · Few-Shot Learning +6
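
To make the "as few as 4 examples per label" setting concrete, here is a hedged, generic utility for assembling a k-shot episode (support/query split) from a labelled dataset. It illustrates the evaluation protocol only and is not LEOPARD's actual training pipeline.

```python
# Hedged sketch: building a k-shot episode for an unseen classification task.
# Generic utility for illustration; not the paper's code.
import random
from collections import defaultdict

def make_episode(examples, k_support=4, k_query=16, seed=0):
    """examples: list of (text, label). Returns (support, query) with k_support items per label."""
    rng = random.Random(seed)
    by_label = defaultdict(list)
    for text, label in examples:
        by_label[label].append((text, label))
    support, query = [], []
    for label, items in by_label.items():
        rng.shuffle(items)
        support.extend(items[:k_support])                      # 4 labelled examples per class
        query.extend(items[k_support:k_support + k_query])     # held-out examples to evaluate on
    return support, query

data = [(f"example text {i}", i % 3) for i in range(120)]      # toy 3-way task
support, query = make_episode(data)
print(len(support), "support /", len(query), "query examples")
```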

A2N: Attending to Neighbors for Knowledge Graph Inference

no code implementations · ACL 2019 · Trapit Bansal, Da-Cheng Juan, Sujith Ravi, Andrew McCallum

State-of-the-art models for knowledge graph completion aim at learning a fixed embedding representation of entities in a multi-relational graph which can generalize to infer unseen entity relationships at test time.

Knowledge Graph Completion · Link Prediction
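
In contrast to a fixed embedding per entity, the "attending to neighbors" idea builds a query-dependent entity representation. The sketch below is a hedged illustration of that pattern: attention over an entity's (relation, entity) neighbour edges, weighted by relevance to the query relation. Shapes and the scoring function are illustrative assumptions, not A2N's exact formulation.

```python
# Hedged sketch: query-conditioned attention over an entity's graph neighbourhood.
# Dimensions and the dot-product score are illustrative, not the paper's exact model.
import torch
import torch.nn.functional as F

d = 32
ent = torch.nn.Embedding(100, d)    # entity embeddings
rel = torch.nn.Embedding(20, d)     # relation embeddings

def neighbour_representation(query_rel, neighbours):
    """neighbours: list of (relation_id, entity_id) edges around the source entity."""
    q = rel(torch.tensor(query_rel))                          # (d,) query relation vector
    r = rel(torch.tensor([r_id for r_id, _ in neighbours]))   # (n, d)
    e = ent(torch.tensor([e_id for _, e_id in neighbours]))   # (n, d)
    keys = r + e                                              # one key per neighbour edge
    attn = F.softmax(keys @ q, dim=0)                         # relevance to the query relation
    return (attn.unsqueeze(1) * e).sum(dim=0)                 # attention-weighted neighbour summary

print(neighbour_representation(3, [(1, 10), (4, 57), (2, 9)]).shape)  # torch.Size([32])
```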

Emergent Complexity via Multi-Agent Competition

2 code implementations · ICLR 2018 · Trapit Bansal, Jakub Pachocki, Szymon Sidor, Ilya Sutskever, Igor Mordatch

In this paper, we point out that a competitive multi-agent environment trained with self-play can produce behaviors that are far more complex than the environment itself.

Blocking
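
The core mechanism in the entry above is self-play: an agent trains against sampled past versions of itself rather than a fixed opponent. The toy sketch below illustrates that loop on rock-paper-scissors with a simple policy-gradient update; it is a hedged stand-in, not the paper's continuous-control environments or training setup.

```python
# Hedged sketch: self-play against randomly sampled past snapshots of the same policy,
# on a toy matrix game. Illustrative only; not the paper's MuJoCo experiments.
import numpy as np

rng = np.random.default_rng(0)
PAYOFF = np.array([[0, -1, 1], [1, 0, -1], [-1, 1, 0]])   # row player's reward (R/P/S)

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

logits = np.zeros(3)           # current policy parameters
snapshots = [logits.copy()]    # pool of frozen past opponents

for step in range(2000):
    opponent = snapshots[rng.integers(len(snapshots))]     # sample an old version of ourselves
    p, q = softmax(logits), softmax(opponent)
    a = rng.choice(3, p=p)
    b = rng.choice(3, p=q)
    reward = PAYOFF[a, b]
    grad = -p                                              # REINFORCE: grad log pi(a) = onehot(a) - p
    grad[a] += 1.0
    logits += 0.05 * reward * grad
    if step % 200 == 0:
        snapshots.append(logits.copy())                    # periodically freeze a new opponent

print("final policy:", np.round(softmax(logits), 2))       # drifts toward roughly uniform play
```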

Continuous Adaptation via Meta-Learning in Nonstationary and Competitive Environments

1 code implementation · ICLR 2018 · Maruan Al-Shedivat, Trapit Bansal, Yuri Burda, Ilya Sutskever, Igor Mordatch, Pieter Abbeel

The ability to continuously learn and adapt from limited experience in nonstationary environments is an important milestone on the path towards general intelligence.

Meta-Learning

Low-Rank Hidden State Embeddings for Viterbi Sequence Labeling

no code implementations · 2 Aug 2017 · Dung Thai, Shikhar Murty, Trapit Bansal, Luke Vilnis, David Belanger, Andrew McCallum

In textual information extraction and other sequence labeling tasks it is now common to use recurrent neural networks (such as LSTM) to form rich embedded representations of long-term input co-occurrence patterns.

named-entity-recognition · Named Entity Recognition +1
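
For readers unfamiliar with the decoding side of this setup, the sketch below shows plain Viterbi decoding over per-token label scores (e.g. emitted by an LSTM) and a label-transition matrix. It is a hedged illustration of the standard algorithm only; the paper's actual contribution, a low-rank factorisation of the pairwise potentials, is not reproduced here.

```python
# Hedged sketch: Viterbi decoding over emission scores (e.g. from an LSTM) and a
# label-transition matrix. Standard dynamic programming; not the paper's low-rank model.
import numpy as np

def viterbi(emissions, transitions):
    """emissions: (T, L) per-token label scores; transitions: (L, L) score of label i -> j."""
    T, L = emissions.shape
    score = emissions[0].copy()
    backptr = np.zeros((T, L), dtype=int)
    for t in range(1, T):
        cand = score[:, None] + transitions        # cand[i, j]: best path ending in i, then label j
        backptr[t] = cand.argmax(axis=0)
        score = cand.max(axis=0) + emissions[t]
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(backptr[t, path[-1]]))
    return path[::-1]

rng = np.random.default_rng(0)
print(viterbi(rng.normal(size=(6, 4)), rng.normal(size=(4, 4))))  # highest-scoring label sequence
```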

A provable SVD-based algorithm for learning topics in dominant admixture corpus

no code implementations · NeurIPS 2014 · Trapit Bansal, Chiranjib Bhattacharyya, Ravindran Kannan

Our aim is to develop a model which makes intuitive and empirically supported assumptions and to design an algorithm with natural, simple components such as SVD, which provably solves the inference problem for the model with bounded $l_1$ error.

Topic Models
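
As a hedged pointer to the kind of spectral step the paper builds on, the snippet below computes a truncated SVD of a word-by-document matrix, the "natural, simple component" mentioned in the abstract. It is not the paper's full provable algorithm; its thresholding and clustering steps for recovering topics under the dominant-admixture assumption are omitted.

```python
# Hedged sketch: truncated SVD of a word-document matrix as the basic spectral component.
# Illustrative only; the paper's complete algorithm and guarantees are not reproduced here.
import numpy as np

rng = np.random.default_rng(0)
k = 3                                                   # number of topics to keep
A = rng.poisson(1.0, size=(500, 200)).astype(float)     # toy word-by-document counts
A /= A.sum(axis=0, keepdims=True)                       # columns as empirical word distributions

U, S, Vt = np.linalg.svd(A, full_matrices=False)
A_k = (U[:, :k] * S[:k]) @ Vt[:k]                       # best rank-k approximation of A

print("leading singular values:", np.round(S[:k], 3))
print("rank-k residual (Frobenius):", round(np.linalg.norm(A - A_k), 3))
```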
