Search Results for author: Ming Tan

Found 15 papers, 4 papers with code

DQ-BART: Efficient Sequence-to-Sequence Model via Joint Distillation and Quantization

no code implementations ACL 2022 Zheng Li, Zijian Wang, Ming Tan, Ramesh Nallapati, Parminder Bhatia, Andrew Arnold, Bing Xiang, Dan Roth

Empirical analyses show that, despite the challenging nature of generative tasks, we were able to achieve a 16.5x model footprint compression ratio with little performance drop relative to the full-precision counterparts on multiple summarization and QA datasets.

Quantization, Text Generation
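The entry above pairs distillation with low-bit quantization; the quantization half of that idea can be illustrated with a minimal sketch. This is generic symmetric per-tensor int8 quantization, not DQ-BART's exact scheme (the paper's 16.5x ratio also comes from distilling into a smaller student and using fewer bits); all names here are illustrative.

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor int8 quantization: map float32 weights
    onto the integer range [-127, 127] with a single scale factor."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(256, 256)).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

# int8 storage alone gives a 4x footprint reduction over float32;
# the per-element rounding error is bounded by half a quantization step.
compression = w.nbytes / q.nbytes
max_err = np.abs(w - w_hat).max()
```

Storing int8 weights plus one scale per tensor is what makes the footprint shrink; the dequantized weights stay within half a quantization step of the originals.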

New Benchmark for Household Garbage Image Recognition

no code implementations 24 Feb 2022 Zhize Wu, Huanyi Li, XiaoFeng Wang, Zijun Wu, Le Zou, Lixiang Xu, Ming Tan

Household garbage images typically involve complex backgrounds, variable illumination, diverse angles, and changeable shapes, all of which make garbage image classification difficult.

Classification, Image Classification +1

Skeleton Based Action Recognition using a Stacked Denoising Autoencoder with Constraints of Privileged Information

no code implementations 12 Mar 2020 Zhize Wu, Thomas Weise, Le Zou, Fei Sun, Ming Tan

Differing from previous studies, we propose a new method, the Denoising Autoencoder with Temporal and Categorical Constraints (DAE_CTC), to study the skeletal representation from the perspective of skeleton reconstruction.

Action Recognition, Denoising +1

Context-Aware Conversation Thread Detection in Multi-Party Chat

no code implementations IJCNLP 2019 Ming Tan, Dakuo Wang, Yupeng Gao, Haoyu Wang, Saloni Potdar, Xiaoxiao Guo, Shiyu Chang, Mo Yu

In multi-party chat, it is common for multiple conversations to occur concurrently, leading to intermingled conversation threads in chat logs.

Group Chat Ecology in Enterprise Instant Messaging: How Employees Collaborate Through Multi-User Chat Channels on Slack

no code implementations 4 Jun 2019 Dakuo Wang, Haoyu Wang, Mo Yu, Zahra Ashktorab, Ming Tan

We cross-referenced 117 project teams and their team-based Slack channels and identified 57 teams that appeared in both datasets; we then built a regression model to reveal the relationship between these group communication styles and project team performance.

FastHybrid: A Hybrid Model for Efficient Answer Selection

no code implementations COLING 2016 Lidan Wang, Ming Tan, Jiawei Han

In this paper, we propose an extremely efficient hybrid model (FastHybrid) that tackles the problem from the standpoint of both accuracy and scalability.

Answer Selection, Information Retrieval +1

Attentive Pooling Networks

3 code implementations 11 Feb 2016 Cicero dos Santos, Ming Tan, Bing Xiang, Bo-Wen Zhou

In this work, we propose Attentive Pooling (AP), a two-way attention mechanism for discriminative model training.

Answer Selection, Representation Learning
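The two-way attention mechanism described above can be sketched in a few lines: a soft alignment matrix lets the question and answer attend to each other before pooling. This follows the AP formulation (G = tanh(QᵀUA) with row/column max-pooling and softmax); the random parameters and dimensions here are illustrative, not trained values.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attentive_pooling(Q, A, U):
    """Two-way attention pooling.
    Q: (d, m) question hidden states, A: (d, n) answer hidden states,
    U: (d, d) bilinear attention parameter (random here, for illustration)."""
    G = np.tanh(Q.T @ U @ A)          # (m, n) soft alignment scores
    sigma_q = softmax(G.max(axis=1))  # importance of each question position
    sigma_a = softmax(G.max(axis=0))  # importance of each answer position
    r_q = Q @ sigma_q                 # (d,) attention-pooled question vector
    r_a = A @ sigma_a                 # (d,) attention-pooled answer vector
    return r_q, r_a

rng = np.random.default_rng(0)
d, m, n = 8, 5, 7
Q, A, U = rng.normal(size=(d, m)), rng.normal(size=(d, n)), rng.normal(size=(d, d))
r_q, r_a = attentive_pooling(Q, A, U)
# cosine similarity of the pooled vectors serves as the matching score
score = r_q @ r_a / (np.linalg.norm(r_q) * np.linalg.norm(r_a))
```

Because each side's pooling weights depend on the other side's states, the same question is summarized differently for different candidate answers, which is what makes the mechanism "two-way".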

LSTM-based Deep Learning Models for Non-factoid Answer Selection

2 code implementations 12 Nov 2015 Ming Tan, Cicero dos Santos, Bing Xiang, Bo-Wen Zhou

One direction is to define a more composite representation for questions and answers by combining a convolutional neural network with the basic framework.

Answer Selection
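The "composite representation" mentioned above layers a convolution over the recurrent states. A minimal numpy sketch of that building block: a 1-D convolution over a sequence of LSTM hidden states followed by max-pooling over time. Shapes and the random parameters are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def conv1d_maxpool(H, W, b):
    """1-D convolution over hidden states, then max-over-time pooling.
    H: (T, d) sequence of LSTM hidden states,
    W: (k, d, f) filters of width k with f feature maps, b: (f,) bias."""
    T, d = H.shape
    k, _, f = W.shape
    out = np.empty((T - k + 1, f))
    for t in range(T - k + 1):
        window = H[t:t + k]  # (k, d) slice of consecutive timesteps
        out[t] = np.tensordot(window, W, axes=([0, 1], [0, 1])) + b
    return np.tanh(out).max(axis=0)  # max over time -> fixed-size (f,) vector

rng = np.random.default_rng(0)
H = rng.normal(size=(10, 16))      # 10 timesteps of LSTM output (illustrative)
W = rng.normal(size=(3, 16, 32))   # width-3 filters, 32 feature maps
b = np.zeros(32)
rep = conv1d_maxpool(H, W, b)      # fixed-size sentence representation
```

The convolution captures local n-gram patterns over the recurrent states, and max-pooling produces a fixed-size vector regardless of sentence length, so questions and answers of different lengths become directly comparable.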

Direct 0-1 Loss Minimization and Margin Maximization with Boosting

no code implementations NeurIPS 2013 Shaodan Zhai, Tian Xia, Ming Tan, Shaojun Wang

We propose a boosting method, DirectBoost, a greedy coordinate descent algorithm that builds an ensemble classifier of weak classifiers by directly minimizing empirical classification error over labeled training examples. Once the training classification error is reduced to a local coordinatewise minimum, DirectBoost runs a greedy coordinate ascent algorithm that continuously adds weak classifiers to maximize any arbitrarily defined target margins, until reaching a local coordinatewise maximum of the margins in a certain sense.

General Classification
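The descent phase described above can be caricatured in a toy sketch: greedily add whichever weak classifier most reduces the ensemble's empirical 0-1 error, stopping at a local coordinatewise minimum. This is not the paper's DirectBoost algorithm (it omits the weight optimization along each coordinate and the margin-ascent phase); the unit-weight threshold stumps and synthetic data are illustrative assumptions.

```python
import numpy as np

def greedy_01_descent(y, stumps, rounds=10):
    """Greedy coordinate-descent caricature: repeatedly add the weak
    classifier (with unit weight) that most lowers the ensemble's
    empirical 0-1 error; stop when no candidate improves it."""
    F = np.zeros(len(y))  # ensemble score, initially zero
    for _ in range(rounds):
        errs = [np.mean(np.sign(F + h) != y) for h in stumps]
        j = int(np.argmin(errs))
        if errs[j] >= np.mean(np.sign(F) != y) and F.any():
            break  # local coordinatewise minimum of the 0-1 loss
        F = F + stumps[j]
    return F

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)  # linearly separable toy labels

# candidate weak classifiers: axis-aligned threshold stumps, both polarities
thresholds = np.linspace(-1, 1, 9)
stumps = [s * np.where(X[:, d] > t, 1, -1)
          for d in range(2) for t in thresholds for s in (1, -1)]

F = greedy_01_descent(y, stumps)
train_err = np.mean(np.sign(F) != y)  # well below chance on this toy data
```

Unlike surrogate-loss boosting (e.g. the exponential loss in AdaBoost), the selection criterion here is the raw misclassification count itself, which is the distinguishing idea the abstract describes.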
