no code implementations • EMNLP 2021 • Simeng Sun, Wenlong Zhao, Varun Manjunatha, Rajiv Jain, Vlad Morariu, Franck Dernoncourt, Balaji Vasan Srinivasan, Mohit Iyyer
While large-scale pretrained language models have significantly improved writing assistance functionalities such as autocomplete, more complex and controllable writing assistants have yet to be explored.
no code implementations • 20 Mar 2022 • Abhilasha Sancheti, Balaji Vasan Srinivasan, Rachel Rudinger
We introduce a new task of entailment relation aware paraphrase generation, which aims at generating a paraphrase conforming to a given entailment relation (e.g., equivalent, forward entailing, or reverse entailing) with respect to a given input.
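One simple way to realize this kind of relation-conditioned generation is to prepend a control token naming the target entailment relation to the source sentence before passing it to a sequence-to-sequence model. The sketch below is a hypothetical illustration of that idea; the token names and helper function are our assumptions, not the paper's implementation.

```python
# Hypothetical sketch: conditioning a paraphrase generator on a target
# entailment relation via a prepended control token. The relation names
# mirror the three relations mentioned in the abstract; the function
# itself is illustrative, not from the paper.

RELATIONS = {"equivalent", "forward_entailing", "reverse_entailing"}

def build_model_input(source: str, relation: str) -> str:
    """Prefix the source with a control token naming the desired relation."""
    if relation not in RELATIONS:
        raise ValueError(f"unknown entailment relation: {relation}")
    return f"<{relation}> {source}"

example = build_model_input("A man is playing a guitar.", "forward_entailing")
# The conditioned string would then be tokenized and fed to any
# sequence-to-sequence paraphrase model.
```

The control-token approach keeps the generator architecture unchanged; only the input encoding carries the constraint.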
no code implementations • EMNLP 2021 • Vinay Aggarwal, Aparna Garimella, Balaji Vasan Srinivasan, Anandhavelu N, Rajiv Jain
We propose a two-staged pipeline to first predict if a specific clause type is relevant to be added in a contract, and then recommend the top clauses for the given type based on the contract context.
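The two-stage pipeline can be sketched as a relevance gate followed by a ranker. The stand-in scoring functions below are toy assumptions for illustration; the paper uses learned models for both stages.

```python
# Hypothetical two-stage clause recommendation sketch. Stage 1 decides
# whether a clause type is relevant to the contract at all; stage 2
# ranks candidate clauses of that type by the contract context. The
# relevance and scoring functions are toy stand-ins for learned models.

def recommend_clauses(contract_context, clause_type, relevance_model,
                      candidates, scorer, top_k=3, threshold=0.5):
    """Stage 1: is this clause type relevant? Stage 2: rank candidates."""
    if relevance_model(contract_context, clause_type) < threshold:
        return []  # clause type judged not relevant for this contract
    ranked = sorted(candidates,
                    key=lambda c: scorer(contract_context, c),
                    reverse=True)
    return ranked[:top_k]

# Toy stand-ins for the two learned components:
relevance = lambda ctx, t: 1.0 if t in ctx else 0.0
overlap = lambda ctx, clause: len(set(ctx.split()) & set(clause.split()))

picks = recommend_clauses("agreement with indemnity terms", "indemnity",
                          relevance,
                          ["indemnity by vendor terms", "payment schedule"],
                          overlap, top_k=1)
# picks == ["indemnity by vendor terms"]
```

Gating on clause-type relevance first keeps the ranker from ever surfacing clauses of types the contract does not need.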
no code implementations • NAACL 2021 • Hrituraj Singh, Anshul Nasery, Denil Mehta, Aishwarya Agarwal, Jatin Lamba, Balaji Vasan Srinivasan
In this paper, we propose a novel task - MIMOQA - Multimodal Input Multimodal Output Question Answering in which the output is also multimodal.
no code implementations • EACL 2021 • Hrituraj Singh, Gaurav Verma, Aparna Garimella, Balaji Vasan Srinivasan
In this paper, we propose a Director-Generator framework to rewrite content in the target author's style, specifically focusing on certain target attributes.
no code implementations • Findings of the Association for Computational Linguistics 2020 • Hrituraj Singh, Gaurav Verma, Balaji Vasan Srinivasan
While recent advances in language modeling have resulted in powerful generation models, their generation style remains implicitly dependent on the training data and cannot emulate a specific target style.
no code implementations • NAACL 2021 • Navita Goyal, Balaji Vasan Srinivasan, Anandhavelu Natarajan, Abhilasha Sancheti
Style transfer has been widely explored in natural language generation with non-parallel corpora, by directly or indirectly extracting a notion of style from the source and target domain corpora.
no code implementations • 11 May 2020 • Abhilasha Sancheti, Kundan Krishna, Balaji Vasan Srinivasan, Anandhavelu Natarajan
Style transfer deals with algorithms that transfer the stylistic properties of one piece of text onto another while ensuring that the core content is preserved.
2 code implementations • ACL 2020 • Akash Kumar Mohankumar, Preksha Nema, Sharan Narasimhan, Mitesh M. Khapra, Balaji Vasan Srinivasan, Balaraman Ravindran
To make attention mechanisms more faithful and plausible, we propose a modified LSTM cell with a diversity-driven training objective that ensures that the hidden representations learned at different time steps are diverse.
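One way to quantify the diversity of hidden representations is conicity: the mean cosine similarity of each hidden vector with their mean vector, which the training objective can then penalize so that more spread-out (lower-conicity) representations are preferred. The pure-Python sketch below illustrates the measure on toy vectors; a real model would compute it over LSTM hidden-state tensors.

```python
# Sketch of the diversity idea: penalize hidden states that all point in
# similar directions. Conicity is the mean cosine similarity of each
# hidden vector with the mean of all hidden vectors; adding
# lam * conicity to the task loss pushes the model toward more diverse
# (lower-conicity) hidden states. Pure-Python stand-in for illustration.
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def conicity(hidden_states):
    """Mean cosine similarity of each vector with the mean vector."""
    dim = len(hidden_states[0])
    mean = [sum(h[i] for h in hidden_states) / len(hidden_states)
            for i in range(dim)]
    return sum(cosine(h, mean) for h in hidden_states) / len(hidden_states)

aligned = [[1.0, 0.0], [0.9, 0.1]]   # nearly parallel states: high conicity
diverse = [[1.0, 0.0], [0.0, 1.0]]   # orthogonal states: low conicity

# A diversity-driven training loss would then take the form
#   loss = task_loss + lam * conicity(hidden_states)
```

Lower conicity means the hidden states at different time steps carry more distinct information, which is what makes the attention weights over them more meaningful.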
no code implementations • 18 Dec 2019 • Kushal Chawla, Hrituraj Singh, Arijit Pramanik, Mithlesh Kumar, Balaji Vasan Srinivasan
Recently, research efforts have gained pace on catering to varied user preferences when generating text summaries.
no code implementations • CoNLL 2019 • Kushal Chawla, Balaji Vasan Srinivasan, Niyati Chhaya
Abstractive text summarization aims at generating human-like summaries by understanding and paraphrasing the given input content.
no code implementations • 22 Sep 2019 • Bakhtiyar Syed, Gaurav Verma, Balaji Vasan Srinivasan, Anandhavelu Natarajan, Vasudeva Varma
Given the recent progress in language modeling using Transformer-based neural models and an active interest in generating stylized text, we present an approach to leverage the generalization capabilities of a language model to rewrite an input text in a target author's style.
no code implementations • 18 Sep 2019 • Gaurav Verma, Balaji Vasan Srinivasan
With a growing interest in modeling inherent subjectivity in natural language, we present a linguistically-motivated process to understand and analyze the writing style of individuals from three perspectives: lexical, syntactic, and semantic.
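On the lexical side, such an analysis typically starts from simple surface statistics of a writer's vocabulary. The particular features below (type-token ratio and average word length) are our illustrative assumption, not necessarily the paper's feature set.

```python
# Illustrative sketch of simple lexical style features of the kind a
# lexical-perspective analysis might start from. The specific features
# here are assumptions for illustration, not taken from the paper.

def lexical_profile(text: str) -> dict:
    """Compute toy lexical style features for a piece of text."""
    words = [w.lower() for w in text.split()]
    return {
        # vocabulary richness: distinct words / total words
        "type_token_ratio": len(set(words)) / len(words),
        # average length of the tokens used
        "avg_word_length": sum(len(w) for w in words) / len(words),
    }

p = lexical_profile("the cat sat on the mat")
# 6 tokens, 5 distinct types; total character count 17
```

Syntactic and semantic perspectives would add features such as parse-tree statistics and word-sense or sentiment distributions on top of a lexical profile like this.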
1 code implementation • IJCNLP 2019 • Preksha Nema, Akash Kumar Mohankumar, Mitesh M. Khapra, Balaji Vasan Srinivasan, Balaraman Ravindran
The generated question should be (i) grammatically correct, (ii) answerable from the passage, and (iii) specific to the given answer.
no code implementations • 20 Jan 2019 • Kushal Chawla, Kundan Krishna, Balaji Vasan Srinivasan
The first shortcoming is the extractive nature of the generated summaries: the network eventually learns to copy from the input article most of the time, undermining the intended abstractiveness of the summaries.
no code implementations • EMNLP 2018 • Raghuram Vadapalli, Bakhtiyar Syed, Nishant Prabhu, Balaji Vasan Srinivasan, Vasudeva Varma
We present an online interactive tool that generates blog titles and thus takes the first step toward automating science journalism.
no code implementations • COLING 2018 • Kundan Krishna, Aniket Murhekar, Saumitra Sharma, Balaji Vasan Srinivasan
Neural sequence-to-sequence models have been successfully extended for summary generation. However, existing frameworks generate a single summary for a given input and do not tune the summaries towards any additional constraints/preferences.
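To make the contrast concrete, a preference-tuned summarizer can be sketched in its simplest extractive form: score sentences by overlap with user-supplied preference keywords, so the same input yields different summaries under different preferences. This is a hypothetical illustration only; the paper works with neural sequence-to-sequence models.

```python
# Hypothetical sketch of preference-tuned extractive selection: rank
# sentences by overlap with user preference keywords, so different
# preferences produce different summaries from the same document.
# Illustrative only; the abstract concerns neural abstractive models.

def tuned_summary(sentences, preference_keywords, n=1):
    """Return the n sentences best matching the user's preferences."""
    prefs = {k.lower() for k in preference_keywords}
    scored = sorted(
        sentences,
        key=lambda s: len(prefs & {w.lower().strip(".,") for w in s.split()}),
        reverse=True,
    )
    return scored[:n]

doc = ["Revenue grew 10 percent this quarter.",
       "The new phone features a better camera."]
print(tuned_summary(doc, ["revenue", "quarter"]))  # financial preference
print(tuned_summary(doc, ["camera", "phone"]))     # product preference
```

The same selection idea generalizes to the neural setting by conditioning the decoder on the preference signal rather than filtering sentences after the fact.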
no code implementations • COLING 2018 • Balaji Vasan Srinivasan, Pranav Maneriker, Kundan Krishna, Natwar Modani
Enterprise content writers are engaged in writing textual content for various purposes.
no code implementations • NAACL 2018 • Kundan Krishna, Balaji Vasan Srinivasan
Existing summarization algorithms generate a single summary and are not capable of generating multiple summaries tuned to the interests of the readers.