Search Results for author: Yatin Chaudhary

Found 11 papers, 8 papers with code

Federated Continual Learning for Text Classification via Selective Inter-client Transfer

1 code implementation • 12 Oct 2022 • Yatin Chaudhary, Pranav Rai, Matthias Schubert, Hinrich Schütze, Pankaj Gupta

The objective of Federated Continual Learning (FCL) is to improve deep learning models over their lifetime at each client through (relevant and efficient) knowledge transfer without sharing data.

Continual Learning • Federated Learning +3

Multi-source Neural Topic Modeling in Multi-view Embedding Spaces

1 code implementation • NAACL 2021 • Pankaj Gupta, Yatin Chaudhary, Hinrich Schütze

Though word embeddings and topics are complementary representations, several past works have used only pretrained word embeddings in (neural) topic modeling to address data sparsity in short texts or small collections of documents.

Information Retrieval • Retrieval +1

Neural Topic Modeling with Continual Lifelong Learning

1 code implementation • ICML 2020 • Pankaj Gupta, Yatin Chaudhary, Thomas Runkler, Hinrich Schütze

To address this problem, we propose a lifelong learning framework for neural topic modeling that can continuously process streams of document collections, accumulate topics, and guide future topic modeling tasks via knowledge transfer from several sources, to better deal with sparse data.

Data Augmentation • Information Retrieval +2

Explainable and Discourse Topic-aware Neural Language Understanding

1 code implementation • ICML 2020 • Yatin Chaudhary, Hinrich Schütze, Pankaj Gupta

Marrying topic models and language models exposes language understanding to a broader source of document-level context beyond sentences via topics.

Document Classification • Language Modelling +5

Lifelong Neural Topic Learning in Contextualized Autoregressive Topic Models of Language via Informative Transfers

no code implementations • 29 Sep 2019 • Yatin Chaudhary, Pankaj Gupta, Thomas Runkler

in topic modeling, (2) a novel lifelong learning mechanism in a neural topic modeling framework to demonstrate continual learning on sequential document collections while minimizing catastrophic forgetting.

Data Augmentation • Hallucination +2

Multi-source Multi-view Transfer Learning in Neural Topic Modeling with Pretrained Topic and Word Embeddings

no code implementations • 25 Sep 2019 • Pankaj Gupta, Yatin Chaudhary, Hinrich Schütze

Though word embeddings and topics are complementary representations, several past works have used only pretrained word embeddings in (neural) topic modeling to address the data sparsity problem in short texts or small collections of documents.

Information Retrieval • Retrieval +2

Multi-view and Multi-source Transfers in Neural Topic Modeling with Pretrained Topic and Word Embeddings

no code implementations • 14 Sep 2019 • Pankaj Gupta, Yatin Chaudhary, Hinrich Schütze

Though word embeddings and topics are complementary representations, several past works have used only pre-trained word embeddings in (neural) topic modeling to address the data sparsity problem in short texts or small collections of documents.

Information Retrieval • Retrieval +2

textTOvec: Deep Contextualized Neural Autoregressive Topic Models of Language with Distributed Compositional Prior

1 code implementation • ICLR 2019 • Pankaj Gupta, Yatin Chaudhary, Florian Buettner, Hinrich Schütze

We address two challenges of probabilistic topic modelling in order to better estimate the probability of a word in a given context, i.e., P(word|context): (1) No Language Structure in Context: probabilistic topic models ignore word order by summarizing a given context as a "bag-of-words", and consequently the semantics of the words in the context are lost.
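The bag-of-words limitation the abstract describes can be illustrated with a minimal sketch (plain Python, not code from the paper): two contexts with different word order collapse to the same unordered summary, so a model conditioning only on that summary cannot distinguish them when estimating P(word|context).

```python
from collections import Counter

def bag_of_words(context: str) -> Counter:
    """Summarize a context as an unordered multiset of words,
    discarding word order entirely."""
    return Counter(context.lower().split())

# The two contexts differ in word order (and in meaning), but their
# bag-of-words summaries are identical, so any estimate of
# P(word | context) based only on the bag is identical for both.
a = bag_of_words("the movie was not good")
b = bag_of_words("the movie was good not")
print(a == b)  # -> True
```

This is the gap the paper's contextualized, language-structure-aware topic model aims to close by modeling the context as an ordered sequence rather than a bag.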

Information Extraction • Information Retrieval +4

Document Informed Neural Autoregressive Topic Models with Distributional Prior

1 code implementation • 15 Sep 2018 • Pankaj Gupta, Yatin Chaudhary, Florian Buettner, Hinrich Schütze

Here, we extend a neural autoregressive topic model to exploit the full context information around words in a document in a language modeling fashion.

Language Modelling • Retrieval +1
