no code implementations • 19 Jan 2024 • Tian Shi, Changyun Wen, Yongping Pan
This paper proposes a composite learning backstepping control (CLBC) strategy based on modular backstepping and high-order tuners that compensates for the transient process of parameter estimation and achieves closed-loop exponential stability without nonlinear damping terms or the persistent excitation (PE) condition.
no code implementations • 26 Apr 2022 • Wenlong Zhang, Bhagyashree Ingale, Hamza Shabir, Tianyi Li, Tian Shi, Ping Wang
ED Explorer consists of an interactive web application, an API, and an NLP toolkit, which help both domain experts and non-experts better understand the ED task.
no code implementations • 1 Aug 2021 • Ping Wang, Tian Shi, Khushbu Agarwal, Sutanay Choudhury, Chandan K. Reddy
On the other hand, the entity and context aspects restrict the answers to node-specific information, leading to higher precision but lower recall.
Knowledge Base Question Answering • Machine Reading Comprehension
1 code implementation • 18 Sep 2020 • Tian Shi, Liuqing Li, Ping Wang, Chandan K. Reddy
However, recent deep learning-based topic models, particularly aspect-based autoencoders, suffer from several problems, such as extracting noisy aspects and poorly mapping the discovered aspects to the aspects of interest.
2 code implementations • 18 Sep 2020 • Tian Shi, Ping Wang, Chandan K. Reddy
In addition, we propose an Attention-driven Keywords Ranking (AKR) method, which automatically discovers aspect keywords and aspect-level opinion keywords from the review corpus based on attention weights.
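The core idea of ranking keywords by attention weights can be sketched minimally as follows. This is an illustrative toy version, not the paper's actual AKR implementation; the function name, the averaging scheme, and the example data are all assumptions:

```python
def rank_keywords(tokens, attention, top_k=3):
    """Rank candidate keywords by their average attention weight
    across a corpus of tokenized reviews (illustrative sketch)."""
    scores, counts = {}, {}
    for sent_tokens, sent_attn in zip(tokens, attention):
        for tok, w in zip(sent_tokens, sent_attn):
            scores[tok] = scores.get(tok, 0.0) + w
            counts[tok] = counts.get(tok, 0) + 1
    avg = {tok: scores[tok] / counts[tok] for tok in scores}
    return sorted(avg, key=avg.get, reverse=True)[:top_k]

# Toy corpus: tokens that consistently attract high attention rank first.
reviews = [["battery", "lasts", "long"], ["great", "battery", "life"]]
attn = [[0.7, 0.1, 0.2], [0.3, 0.6, 0.1]]
top = rank_keywords(reviews, attn)
```

In this toy example, "battery" surfaces first because it receives high attention in both reviews, which mirrors the intuition that attention mass concentrated on a token across many reviews signals an aspect or opinion keyword.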
1 code implementation • 24 Apr 2020 • Tian Shi, Xuchao Zhang, Ping Wang, Chandan K. Reddy
In this paper, we propose a corpus-level explanation approach that captures causal relationships between keywords and model predictions by learning the importance of keywords for predicted labels across a training corpus based on attention weights.
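A corpus-level aggregation of this kind might look like the sketch below: attention weights are accumulated per (keyword, predicted label) pair and normalized within each label. The function name, the normalization choice, and the data are illustrative assumptions, not the paper's method:

```python
from collections import defaultdict

def corpus_keyword_importance(docs, attentions, predictions):
    """Estimate corpus-level keyword importance per predicted label by
    summing attention weights over all documents (illustrative sketch)."""
    importance = defaultdict(float)
    for tokens, attn, label in zip(docs, attentions, predictions):
        for tok, w in zip(tokens, attn):
            importance[(tok, label)] += w
    # Normalize within each label so scores are comparable across labels.
    totals = defaultdict(float)
    for (tok, label), s in importance.items():
        totals[label] += s
    return {k: s / totals[k[1]] for k, s in importance.items()}

# Toy corpus: two documents with their token-level attention and predictions.
docs = [["good", "food"], ["bad", "service"]]
attns = [[0.8, 0.2], [0.9, 0.1]]
scores = corpus_keyword_importance(docs, attns, ["pos", "neg"])
```

The normalized scores indicate, for example, how strongly "good" is associated with the positive label across the whole corpus rather than in a single prediction.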
1 code implementation • 28 Jul 2019 • Ping Wang, Tian Shi, Chandan K. Reddy
In this paper, we tackle these challenges by developing a deep learning-based TRanslate-Edit Model for Question-to-SQL (TREQS) generation, which adapts the widely used sequence-to-sequence model to directly generate the SQL query for a given question and then performs the required edits using an attentive-copying mechanism and task-specific look-up tables.
1 code implementation • NAACL 2019 • Tian Shi, Ping Wang, Chandan K. Reddy
Neural abstractive text summarization (NATS) has received considerable attention from both industry and academia in the past few years.
5 code implementations • 5 Dec 2018 • Tian Shi, Yaser Keneshloo, Naren Ramakrishnan, Chandan K. Reddy
As part of this survey, we also develop an open-source library, the Neural Abstractive Text Summarizer (NATS) toolkit, for abstractive text summarization.
3 code implementations • 24 May 2018 • Yaser Keneshloo, Tian Shi, Naren Ramakrishnan, Chandan K. Reddy
In this survey, we consider seq2seq problems from the RL point of view and provide a formulation that combines the decision-making power of RL methods with the ability of sequence-to-sequence models to capture long-term dependencies.
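The simplest instance of this RL-for-seq2seq formulation is a sequence-level REINFORCE objective: the log-probability of a sampled output sequence is scaled by a baselined reward (e.g. a ROUGE or BLEU score). The sketch below is a minimal, framework-free illustration under assumed names and numbers, not the survey's full formulation:

```python
import numpy as np

def reinforce_loss(log_probs, reward, baseline=0.0):
    """Sequence-level REINFORCE loss: the negative summed log-probability
    of the sampled sequence, scaled by the advantage (reward - baseline)."""
    advantage = reward - baseline
    return -advantage * float(np.sum(log_probs))

# Toy step: two sampled tokens, a sequence reward of 0.8 and baseline 0.3.
loss = reinforce_loss(log_probs=[-0.5, -1.0], reward=0.8, baseline=0.3)
```

Minimizing this loss increases the likelihood of sequences whose reward exceeds the baseline, which is how RL training optimizes non-differentiable sequence-level metrics that teacher forcing cannot target directly.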