Search Results for author: Ming Tu

Found 12 papers, 1 paper with code

Language-Universal Phonetic Representation in Multilingual Speech Pretraining for Low-Resource Speech Recognition

no code implementations • 19 May 2023 • Siyuan Feng, Ming Tu, Rui Xia, Chuanzeng Huang, Yuxuan Wang

Moreover, on 3 of the 4 languages, the approach outperforms standard HuBERT while saving up to 1.5k hours (75%) of supervised training data.

Self-Supervised Learning • Speech Recognition +1

Language-universal phonetic encoder for low-resource speech recognition

no code implementations • 19 May 2023 • Siyuan Feng, Ming Tu, Rui Xia, Chuanzeng Huang, Yuxuan Wang

Our main approach and adaptation are effective on extremely low-resource languages, even within domain- and language-mismatched scenarios.

Speech Recognition

Graph Sequential Network for Reasoning over Sequences

no code implementations • 4 Apr 2020 • Ming Tu, Jing Huang, Xiaodong He, Bo-Wen Zhou

We validate the proposed GSN on two NLP tasks: interpretable multi-hop reading comprehension on HotpotQA and graph based fact verification on FEVER.

Fact Verification • Machine Reading Comprehension +1

Speaker-invariant Affective Representation Learning via Adversarial Training

no code implementations • 4 Nov 2019 • Haoqi Li, Ming Tu, Jing Huang, Shrikanth Narayanan, Panayiotis Georgiou

In this paper, we propose a machine learning framework to obtain speech emotion representations by limiting the effect of speaker variability in the speech signals.

Emotion Classification • Representation Learning +1
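The abstract describes limiting speaker variability in learned emotion representations via adversarial training. A common mechanism for this (assumed here; the paper's exact setup is not given in the snippet) is a gradient-reversal layer: identity on the forward pass, negated and scaled gradient on the backward pass, so the encoder is updated to hurt an adversarial speaker classifier. A minimal sketch:

```python
import numpy as np

class GradReverse:
    """Minimal sketch of a gradient-reversal layer (an assumed mechanism
    for speaker-adversarial training, not the paper's confirmed design).
    Forward: identity. Backward: negate and scale the incoming gradient,
    so gradients from the speaker classifier push the encoder toward
    speaker-invariant features."""

    def __init__(self, lam=0.1):
        self.lam = lam  # trade-off weight for the adversarial branch

    def forward(self, x):
        return x  # pass features through unchanged

    def backward(self, grad_out):
        return -self.lam * grad_out  # reversed, scaled gradient

# Illustration: the reversed gradient flips sign and is scaled by lam.
layer = GradReverse(lam=0.1)
feats = np.array([1.0, 2.0])
grad_in = layer.backward(np.array([1.0, 2.0]))
print(grad_in)  # [-0.1 -0.2]
```

In a full model, this layer would sit between the shared encoder and the speaker classifier, while the emotion classifier receives the unmodified features.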

Select, Answer and Explain: Interpretable Multi-hop Reading Comprehension over Multiple Documents

1 code implementation • 1 Nov 2019 • Ming Tu, Kevin Huang, Guangtao Wang, Jing Huang, Xiaodong He, Bo-Wen Zhou

Interpretable multi-hop reading comprehension (RC) over multiple documents is a challenging problem because it demands reasoning over multiple information sources and explaining the answer prediction by providing supporting evidence.

Learning-To-Rank • Multi-Hop Reading Comprehension +2

Multiple instance learning with graph neural networks

no code implementations • 12 Jun 2019 • Ming Tu, Jing Huang, Xiaodong He, Bo-Wen Zhou

In this paper, we propose a new end-to-end graph neural network (GNN) based algorithm for MIL: we treat each bag as a graph and use a GNN to learn the bag embedding, in order to exploit the structural information among instances within a bag.

Multiple Instance Learning
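The abstract's recipe (bag → graph → GNN → bag embedding) can be sketched minimally as follows. This is an assumed instantiation, not the paper's implementation: a fully connected graph over the bag's instances, mean-aggregation message passing with a stand-in weight matrix `W`, then mean pooling into one bag-level vector.

```python
import numpy as np

def gnn_bag_embedding(instances, W, steps=2):
    """Sketch of MIL with a GNN (assumed form): treat the bag as a fully
    connected graph over its instances, run mean-aggregation message
    passing, then mean-pool the node states into one bag embedding."""
    h = instances                      # (n_instances, d) feature matrix
    n = h.shape[0]
    # Fully connected adjacency without self-loops, row-normalized
    A = (np.ones((n, n)) - np.eye(n)) / max(n - 1, 1)
    for _ in range(steps):
        h = np.tanh((A @ h) @ W)       # aggregate neighbors, then transform
    return h.mean(axis=0)              # bag-level embedding via mean pooling

# Toy usage: a bag of 5 instances with 8-dim features; W stands in for
# parameters that would be learned end-to-end with a bag-level loss.
rng = np.random.default_rng(0)
bag = rng.normal(size=(5, 8))
W = rng.normal(size=(8, 8)) * 0.1
emb = gnn_bag_embedding(bag, W)
print(emb.shape)  # (8,)
```

In training, the bag embedding would feed a classifier and gradients would flow back through the message-passing steps, which is what makes the scheme end-to-end.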

Reducing the Model Order of Deep Neural Networks Using Information Theory

no code implementations • 16 May 2016 • Ming Tu, Visar Berisha, Yu Cao, Jae-sun Seo

In this paper, we propose a method to compress deep neural networks by using the Fisher Information metric, which we estimate through a stochastic optimization method that keeps track of second-order information in the network.

General Classification • Network Pruning +2
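The abstract's idea of compressing a network via the Fisher Information metric can be illustrated with a simple (assumed) diagonal estimate: score each weight by its mean squared gradient over samples, then zero out the lowest-scoring fraction. The paper's stochastic second-order estimator is more involved; this sketch only shows the scoring-and-pruning step.

```python
import numpy as np

def fisher_prune(weights, grads, keep_frac=0.5):
    """Sketch (assumed form, not the paper's exact estimator): use an
    empirical diagonal Fisher estimate, mean of squared per-sample
    gradients, as a sensitivity score and prune low-scoring weights."""
    fisher = np.mean(grads ** 2, axis=0)          # (d,) diagonal Fisher estimate
    k = int(weights.size * keep_frac)             # number of weights to keep
    thresh = np.sort(fisher.ravel())[::-1][k - 1] # score of the k-th best weight
    mask = fisher >= thresh                       # keep high-sensitivity weights
    return weights * mask, mask

# Toy usage: 10 weights, 32 per-sample gradient vectors.
rng = np.random.default_rng(1)
w = rng.normal(size=10)
g = rng.normal(size=(32, 10))
pruned, mask = fisher_prune(w, g, keep_frac=0.5)
print(mask.sum())  # 5
```

The intuition is that weights with small Fisher information barely change the model's output distribution when perturbed, so removing them costs little accuracy.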
