no code implementations • NAACL (AutoSimTrans) 2022 • Ruiqing Zhang, Chuanqiang Zhang, Zhongjun He, Hua Wu, Haifeng Wang, Liang Huang, Qun Liu, Julia Ive, Wolfgang Macherey
This paper reports the results of the shared task we hosted at the Third Workshop on Automatic Simultaneous Translation (AutoSimTrans).
1 code implementation • 11 Dec 2024 • Wei Yu Tang, Ning Dai, Tianshuo Zhou, David H. Mathews, Liang Huang
The task of RNA design given a target structure aims to find a sequence that can fold into that structure.
no code implementations • 23 Oct 2024 • Ning Dai, Zheng Wu, Renjie Zheng, Ziyun Wei, Wenlei Shi, Xing Jin, Guanlin Liu, Chen Dun, Liang Huang, Lin Yan
Reinforcement Learning (RL) with unit test feedback has enhanced code generation in large language models (LLMs), but it relies on sparse rewards provided only after complete code evaluation, limiting learning efficiency and incremental improvements.
1 code implementation • 29 Dec 2023 • Ning Dai, Wei Yu Tang, Tianshuo Zhou, David H. Mathews, Liang Huang
We then use gradient descent-based optimization to improve the extended objective function, so that the distribution gradually shrinks towards a one-hot sequence (i.e., a single sequence).
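A minimal sketch of this relax-then-shrink idea (illustration only, assuming PyTorch; `expected_structure_loss` is a hypothetical stand-in for the paper's differentiable objective):

```python
# Sketch only: gradient descent on a relaxed (softmax) distribution over
# nucleotides; an entropy penalty is one simple way to shrink it toward a
# one-hot sequence. The paper's actual objective and schedule differ.
import torch

L = 20
logits = torch.zeros(L, 4, requires_grad=True)       # columns: A, C, G, U

def expected_structure_loss(probs):
    # Hypothetical differentiable placeholder for "distance to target structure".
    return ((probs - probs.roll(1, dims=0)) ** 2).mean()

opt = torch.optim.Adam([logits], lr=0.1)
for step in range(500):
    probs = torch.softmax(logits, dim=-1)             # per-position nucleotide distribution
    entropy = -(probs * probs.clamp_min(1e-9).log()).sum(-1).mean()
    loss = expected_structure_loss(probs) + 0.01 * entropy
    opt.zero_grad()
    loss.backward()
    opt.step()

sequence = "".join("ACGU"[int(i)] for i in probs.argmax(-1))  # near one-hot -> single sequence
```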
1 code implementation • 14 Nov 2023 • Tianshuo Zhou, Wei Yu Tang, David H. Mathews, Liang Huang
RNA design is the search for a sequence or set of sequences that will fold into predefined structures, also known as the inverse problem of RNA folding.
no code implementations • 18 Jul 2023 • Sizhen Li, Ning Dai, He Zhang, Apoorv Malik, David H. Mathews, Liang Huang
First, it takes $O(n^6)$ time for two sequences, where $n$ is the average sequence length.
2 code implementations • 7 Nov 2022 • Xiaoran Fan, Chao Pang, Tian Yuan, He Bai, Renjie Zheng, Pengfei Zhu, Shuohuan Wang, Junkun Chen, Zeyu Chen, Liang Huang, Yu Sun, Hua Wu
In this paper, we extend the pretraining method for cross-lingual multi-speaker speech synthesis tasks, including cross-lingual multi-speaker voice cloning and cross-lingual multi-speaker speech editing.
no code implementations • 26 Oct 2022 • He Zhang, Sizhen Li, Liang Zhang, David H. Mathews, Liang Huang
Vienna RNAcofold, which reduces the problem to classical single-sequence folding by concatenating the two strands, scales cubically with the combined sequence length and is slow for long sequences.
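For comparison, the concatenation-based baseline can be tried through ViennaRNA's Python bindings, assuming the ViennaRNA package is installed (the `&` marks the strand break); this is the cubic-time baseline described above, not the method of the paper:

```python
# Concatenation-based cofolding baseline (cubic in the combined length).
import RNA   # ViennaRNA Python bindings

seq_a = "GGGAAACCC"
seq_b = "GGGUUUCCC"
structure, mfe = RNA.cofold(seq_a + "&" + seq_b)   # fold the two strands jointly
print(structure, mfe)
```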
1 code implementation • 29 Jun 2022 • Apoorv Malik, Liang Zhang, Milan Gautam, Ning Dai, Sizhen Li, He Zhang, David H. Mathews, Liang Huang
Predicting the consensus structure of a set of aligned RNA homologs is a convenient method to find conserved structures in an RNA genome, which has many applications including viral diagnostics and therapeutics.
2 code implementations • NAACL (ACL) 2022 • Hui Zhang, Tian Yuan, Junkun Chen, Xintong Li, Renjie Zheng, Yuxin Huang, Xiaojie Chen, Enlei Gong, Zeyu Chen, Xiaoguang Hu, Dianhai Yu, Yanjun Ma, Liang Huang
PaddleSpeech is an open-source all-in-one speech toolkit.
Automatic Speech Recognition (ASR)
Environmental Sound Classification
no code implementations • 16 May 2022 • Liang Huang, Senjie Liang, Feiyang Ye, Nan Gao
In this paper, we propose a Fast Attention Network (FAN) for joint intent detection and slot filling tasks, guaranteeing both accuracy and latency.
no code implementations • 27 Apr 2022 • Guangxu Xun, Mingbo Ma, Yuchen Bian, Xingyu Cai, Jiaji Huang, Renjie Zheng, Junkun Chen, Jiahong Yuan, Kenneth Church, Liang Huang
In simultaneous translation (SimulMT), the most widely used strategy is the wait-k policy thanks to its simplicity and effectiveness in balancing translation quality and latency.
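The schedule itself is easy to state: read k source tokens, then alternate one write per read. A minimal sketch (illustration only; `read_source` and `translate_one_token` are hypothetical stand-ins for a real streaming encoder/decoder):

```python
# Wait-k decoding loop (sketch): wait for k source tokens, then emit one
# target token for each newly read source token; keep writing after the
# source is exhausted until end-of-sentence.
def wait_k_decode(read_source, translate_one_token, k=3):
    source, target = [], []
    finished_reading = False
    while True:
        if not finished_reading:
            tok = read_source()            # returns None when the source is done
            if tok is None:
                finished_reading = True
            else:
                source.append(tok)
        if len(source) >= k or finished_reading:
            out = translate_one_token(source, target)
            if out == "</s>":
                return target
            target.append(out)
```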
no code implementations • 14 Apr 2022 • Chen Wang, Sida Chen, Liang Huang, Lianchun Yu
In this study, we used a whole-brain model to show that heterogeneity in nodal excitability had a significant impact on seizure propagation in the networks and compromised the prediction accuracy based on structural connections.
2 code implementations • 18 Mar 2022 • He Bai, Renjie Zheng, Junkun Chen, Xintong Li, Mingbo Ma, Liang Huang
Recently, speech representation learning has improved many speech-related tasks such as speech recognition, speech classification, and speech-to-text translation.
no code implementations • 13 Sep 2021 • Xiaoyi Zhou, Liang Huang, Tong Ye, Weiqiang Sun
This paper investigates an unmanned aerial vehicle (UAV)-assisted wireless powered mobile-edge computing (MEC) system, where the UAV powers the mobile terminals by wireless power transfer (WPT) and provides computation service for them.
no code implementations • 2 Aug 2021 • Jiahong Yuan, Xingyu Cai, Dongji Gao, Renjie Zheng, Liang Huang, Kenneth Church
Much of the recent literature on automatic speech recognition (ASR) is taking an end-to-end approach.
Automatic Speech Recognition (ASR)
no code implementations • 2 Aug 2021 • Jiahong Yuan, Xingyu Cai, Renjie Zheng, Liang Huang, Kenneth Church
Models of phonemes, broad phonetic classes, and syllables all significantly outperform the utterance model, demonstrating that phonetic units are helpful and should be incorporated in speech emotion recognition.
1 code implementation • 21 Jul 2021 • Shuangli Li, Jingbo Zhou, Tong Xu, Liang Huang, Fan Wang, Haoyi Xiong, Weili Huang, Dejing Dou, Hui Xiong
To this end, we propose a structure-aware interactive graph neural network (SIGN) which consists of two components: polar-inspired graph attention layers (PGAL) and pairwise interactive pooling (PiPool).
Ranked #3 on Protein-Ligand Affinity Prediction on PDBbind
no code implementations • Findings (ACL) 2021 • Junkun Chen, Mingbo Ma, Renjie Zheng, Liang Huang
Simultaneous speech-to-text translation is widely useful in many scenarios.
no code implementations • 10 Feb 2021 • Renjie Zheng, Junkun Chen, Mingbo Ma, Liang Huang
Recently, representation learning for text and speech has successfully improved many language-related tasks.
1 code implementation • RC 2020 • Damiaan J W Reijnaers, Daniël B van de Pavert, Giguru Scheuer, Liang Huang
Furthermore, we have created our own implementation of the algorithm, incorporating additional experiments to evaluate the algorithm's relevance across different dimensionality reduction techniques and differently structured data.
1 code implementation • 17 Dec 2020 • Jingbo Zhou, Shuangli Li, Liang Huang, Haoyi Xiong, Fan Wang, Tong Xu, Hui Xiong, Dejing Dou
The hierarchical attentive aggregation can capture spatial dependencies among atoms, as well as fuse the position-enhanced information with the capability of discriminating multiple spatial relations among atoms.
1 code implementation • Findings of the Association for Computational Linguistics 2020 • Xiangci Li, Hairong Liu, Liang Huang
Existing natural language processing systems are vulnerable to noisy inputs resulting from misspellings.
no code implementations • EMNLP 2020 • Liang Huang, Colin Cherry, Mingbo Ma, Naveen Arivazhagan, Zhongjun He
Simultaneous translation, which performs translation concurrently with the source speech, is widely useful in many scenarios such as international conferences, negotiations, press releases, legal proceedings, and medicine.
no code implementations • 28 Oct 2020 • Zhuangzhi Chen, Hui Cui, Jingyang Xiang, Kunfeng Qiu, Liang Huang, Shilian Zheng, Shichuan Chen, Qi Xuan, Xiaoniu Yang
More interestingly, our proposed models perform extremely well in small-sample learning when only a small training dataset is provided.
no code implementations • 22 Oct 2020 • Junkun Chen, Mingbo Ma, Renjie Zheng, Liang Huang
End-to-end Speech-to-text Translation (E2E-ST), which directly translates source language speech to target language text, is widely useful in practice, but traditional cascaded approaches (ASR+MT) often suffer from error propagation in the pipeline.
Automatic Speech Recognition (ASR)
no code implementations • EMNLP 2021 • Junkun Chen, Renjie Zheng, Atsuhito Kita, Mingbo Ma, Liang Huang
Simultaneous translation is vastly different from full-sentence translation, in the sense that it starts translation before the source sentence ends, with a delay of only a few words.
no code implementations • Findings of the Association for Computational Linguistics 2020 • Renjie Zheng, Mingbo Ma, Baigong Zheng, Kaibo Liu, Jiahong Yuan, Kenneth Church, Liang Huang
Simultaneous speech-to-speech translation is widely useful but extremely challenging, since it needs to generate target-language speech concurrently with the source-language speech, with a delay of only a few seconds.
1 code implementation • 3 Oct 2020 • Suzhi Bi, Liang Huang, Hui Wang, Ying-Jun Angela Zhang
In particular, we aim to design an online computation offloading algorithm to maximize the network data processing capability subject to the long-term data queue stability and average power constraints.
Edge-computing
Networking and Internet Architecture
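In generic form (our notation, not necessarily the paper's), the online offloading objective in the entry above can be written as

$$\max_{\{x_t\}}\ \lim_{T\to\infty}\frac{1}{T}\sum_{t=1}^{T}\mathbb{E}\bigl[R(x_t)\bigr] \quad \text{s.t.}\quad \lim_{T\to\infty}\frac{1}{T}\sum_{t=1}^{T}\mathbb{E}\bigl[P(x_t)\bigr]\le P_{\max},\qquad \text{all data queues } Q_t \text{ remain stable},$$

where $x_t$ is the offloading decision in time slot $t$, $R(\cdot)$ the amount of data processed, and $P(\cdot)$ the power consumed.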
no code implementations • 18 May 2020 • Boxiang Liu, Liang Huang
We show that training on out-of-domain data and fine-tuning with as few as 4,000 NEJM sentence pairs improve translation quality by 25.3 (13.4) BLEU for en$\to$zh (zh$\to$en) directions.
no code implementations • 3 May 2020 • Liang Huang, You Zhang, Weijian Pan, Jinyin Chen, Li Ping Qian, Yuan Wu
Extensive numerical results show that both the CNN-based and LSTM-based classifiers extract similar radio features relating to modulation reference points.
no code implementations • ACL 2020 • Renjie Zheng, Mingbo Ma, Baigong Zheng, Kaibo Liu, Liang Huang
Simultaneous translation has many important application scenarios and has recently attracted much attention from both academia and industry.
no code implementations • ACL 2020 • Baigong Zheng, Kaibo Liu, Renjie Zheng, Mingbo Ma, Hairong Liu, Liang Huang
Adaptive policies are better than fixed policies for simultaneous translation, since they can flexibly balance the tradeoff between translation quality and latency based on the current context information.
2 code implementations • 21 Apr 2020 • He Zhang, Liang Zhang, Ang Lin, Congcong Xu, Ziyu Li, Kaibo Liu, Boxiang Liu, Xiaopin Ma, Fanfan Zhao, Weiguo Yao, Hangwen Li, David H. Mathews, Yujian Zhang, Liang Huang
Messenger RNA (mRNA) vaccines are being used for COVID-19, but still suffer from the critical issue of mRNA instability and degradation, which is a major obstacle in the storage, distribution, and efficacy of the vaccine.
no code implementations • 6 Dec 2019 • Liang Huang, Weijian Pan, You Zhang, LiPing Qian, Nan Gao, Yuan Wu
Deep learning has recently been applied to automatically classify the modulation categories of received radio signals without relying on manual expertise.
no code implementations • Findings of the Association for Computational Linguistics 2020 • Mingbo Ma, Baigong Zheng, Kaibo Liu, Renjie Zheng, Hairong Liu, Kainan Peng, Kenneth Church, Liang Huang
Text-to-speech synthesis (TTS) has witnessed rapid progress in recent years, with neural methods becoming capable of producing highly natural audio.
no code implementations • 3 Nov 2019 • Hairong Liu, Mingbo Ma, Liang Huang
Research in the machine translation community focuses on translation in the text space.
no code implementations • IJCNLP 2019 • Renjie Zheng, Mingbo Ma, Baigong Zheng, Liang Huang
Beam search is universally used in full-sentence translation but its application to simultaneous translation remains non-trivial, where output words are committed on the fly.
no code implementations • IJCNLP 2019 • Baigong Zheng, Renjie Zheng, Mingbo Ma, Liang Huang
Simultaneous translation is widely useful but remains challenging.
no code implementations • WS 2019 • Renjie Zheng, Hairong Liu, Mingbo Ma, Baigong Zheng, Liang Huang
To make matters worse, the amount of parallel social media corpora is extremely limited.
no code implementations • ACL 2019 • Baigong Zheng, Renjie Zheng, Mingbo Ma, Liang Huang
Simultaneous translation is widely useful but remains one of the most difficult tasks in NLP.
no code implementations • NAACL 2019 • Mingbo Ma, Renjie Zheng, Liang Huang
Beam search optimization resolves many issues in neural machine translation.
3 code implementations • ACL 2019 • Mingbo Ma, Liang Huang, Hao Xiong, Renjie Zheng, Kaibo Liu, Baigong Zheng, Chuanqiang Zhang, Zhongjun He, Hairong Liu, Xing Li, Hua Wu, Haifeng Wang
Simultaneous translation, which translates sentences before they are finished, is useful in many scenarios but is notoriously difficult due to word-order differences.
no code implementations • ACL 2019 • Hairong Liu, Mingbo Ma, Liang Huang, Hao Xiong, Zhongjun He
Neural machine translation (NMT) is notoriously sensitive to noises, but noises are almost inevitable in practice.
no code implementations • EMNLP 2018 • Wen Zhang, Liang Huang, Yang Feng, Lei Shen, Qun Liu
Although neural machine translation has achieved promising results, it suffers from slow translation speed.
no code implementations • EMNLP 2017 • Liang Huang, Kai Zhao, Mingbo Ma
In neural text generation such as neural machine translation, summarization, and image captioning, beam search is widely used to improve the output text quality.
no code implementations • WS 2018 • Renjie Zheng, Yilin Yang, Mingbo Ma, Liang Huang
This paper describes multimodal machine translation systems developed jointly by Oregon State University and Baidu Research for WMT 2018 Shared Task on multimodal translation.
no code implementations • EMNLP 2018 • Yilin Yang, Liang Huang, Mingbo Ma
Beam search is widely used in neural machine translation, and usually improves translation quality compared to greedy search.
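For reference, beam search keeps the $B$ highest-scoring partial hypotheses at each step, with greedy search being the special case $B=1$. A minimal sketch over a hypothetical `next_token_logprobs(prefix)` scorer:

```python
import heapq

# Plain beam search (sketch): real NMT decoders add length normalization
# and batched scoring, which this illustration omits.
def beam_search(next_token_logprobs, beam_size=4, max_len=50, eos="</s>"):
    beams = [(0.0, [])]                              # (log-probability, token list)
    for _ in range(max_len):
        candidates = []
        for score, prefix in beams:
            if prefix and prefix[-1] == eos:         # keep finished hypotheses as-is
                candidates.append((score, prefix))
                continue
            for tok, lp in next_token_logprobs(prefix).items():
                candidates.append((score + lp, prefix + [tok]))
        beams = heapq.nlargest(beam_size, candidates, key=lambda c: c[0])
        if all(p and p[-1] == eos for _, p in beams):
            break
    return max(beams, key=lambda c: c[0])[1]
```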
no code implementations • EMNLP 2018 • Renjie Zheng, Mingbo Ma, Liang Huang
Neural text generation, including neural machine translation, image captioning, and summarization, has been quite successful recently.
no code implementations • EMNLP 2018 • Jiaji Huang, Yi Li, Wei Ping, Liang Huang
We propose a large margin criterion for training neural language models.
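A generic large-margin objective of this kind (our illustration, not necessarily the paper's formulation) requires the reference word $w^\ast$ to outscore each competing word $w^-$ by a margin $\delta$:

$$\mathcal{L} = \sum_{w^-}\max\Bigl(0,\ \delta - \bigl(\log p_\theta(w^\ast\mid h) - \log p_\theta(w^-\mid h)\bigr)\Bigr),$$

where $h$ is the conditioning context.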
4 code implementations • 6 Aug 2018 • Liang Huang, Suzhi Bi, Ying-Jun Angela Zhang
To tackle this problem, we propose in this paper a Deep Reinforcement learning-based Online Offloading (DROO) framework that implements a deep neural network to generate offloading decisions.
Networking and Internet Architecture
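A heavily simplified sketch of the DNN-based decision step described in the DROO entry above (illustration only, assuming PyTorch; `quantize_to_binary` and `evaluate_rate` are hypothetical stand-ins for the paper's quantization and resource-allocation subroutines):

```python
import torch
import torch.nn as nn

# Sketch: a DNN maps channel gains to relaxed offloading probabilities;
# the probabilities are quantized into a few binary candidate decisions,
# and the highest-rate candidate is kept (and can be replayed for training).
policy = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 10), nn.Sigmoid())

def choose_offloading(channel_gains, quantize_to_binary, evaluate_rate):
    probs = policy(channel_gains)              # relaxed decisions in [0, 1]
    candidates = quantize_to_binary(probs)     # e.g. a handful of 0/1 vectors
    best = max(candidates, key=evaluate_rate)  # pick the best binary decision
    return best, probs
```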
no code implementations • ACL 2018 • Juneki Hong, Liang Huang
However, the minimal span parser of Stern et al. (2017a), which holds the current state-of-the-art accuracy, is a chart parser running in cubic time, $O(n^3)$, which is too slow for longer sentences and for applications beyond sentence boundaries such as end-to-end discourse parsing and joint sentence boundary detection and parsing.
no code implementations • ACL 2016 • Reza Ghaeini, Xiaoli Z. Fern, Liang Huang, Prasad Tadepalli
Traditional event detection methods heavily rely on manually engineered rich features.
no code implementations • WS 2017 • Mingbo Ma, Dapeng Li, Kai Zhao, Liang Huang
This paper describes Oregon State University's submissions to the shared WMT'17 task "multimodal translation task I".
no code implementations • ACL 2017 • Mingbo Ma, Liang Huang, Bing Xiang, Bo-Wen Zhou
Question classification is an important task with wide applications.
no code implementations • 28 Sep 2017 • Mingbo Ma, Kai Zhao, Liang Huang, Bing Xiang, Bo-Wen Zhou
To exploit the potential benefits of their correlations, we propose a jointly trained model that learns the two tasks simultaneously via Long Short-Term Memory (LSTM) networks.
1 code implementation • EMNLP 2017 • Tianze Shi, Liang Huang, Lillian Lee
We first present a minimal feature set for transition-based dependency parsing, continuing a recent trend started by Kiperwasser and Goldberg (2016a) and Cross and Huang (2016a) of using bi-directional LSTM features.
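For context, a transition-based dependency parser builds a tree with shift/reduce actions over a stack and a buffer; the cited work's contribution is the very small bi-LSTM feature set used to score those actions, which this arc-standard sketch leaves abstract as `choose_action`:

```python
# Arc-standard transition system (sketch): SHIFT moves the next buffer word
# onto the stack; LEFT makes the stack top the head of the item below it;
# RIGHT makes the item below the head of the stack top. Assumes choose_action
# only returns actions that are valid in the current configuration.
def parse(words, choose_action):
    stack, buffer, arcs = [], list(range(len(words))), []
    while buffer or len(stack) > 1:
        action = choose_action(stack, buffer)
        if action == "SHIFT" and buffer:
            stack.append(buffer.pop(0))
        elif action == "LEFT" and len(stack) >= 2:
            head = stack[-1]
            dep = stack.pop(-2)
            arcs.append((head, dep))
        elif action == "RIGHT" and len(stack) >= 2:
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs                                # list of (head, dependent) pairs
```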
1 code implementation • EMNLP 2017 • Kai Zhao, Liang Huang
Discourse parsing has long been treated as a stand-alone problem independent from constituency or dependency parsing.
1 code implementation • COLING 2016 • Kai Zhao, Liang Huang, Mingbo Ma
We show that it is beneficial to extend the attention model to tree nodes between premise and hypothesis.
1 code implementation • EMNLP 2016 • James Cross, Liang Huang
Parsing accuracy using efficient greedy transition systems has improved dramatically in recent years thanks to neural networks.
no code implementations • ACL 2016 • James Cross, Liang Huang
Recently, neural network approaches for parsing have largely automated the combination of individual features, but they still rely on (often a large number of) atomic features created from human linguistic intuition, potentially omitting important global context.
1 code implementation • IJCNLP 2015 • Mingbo Ma, Liang Huang, Bing Xiang, Bo-Wen Zhou
In sentence modeling and classification, convolutional neural network approaches have recently achieved state-of-the-art results, but all such efforts process word vectors sequentially and neglect long-distance dependencies.
no code implementations • HLT 2015 • Kai Zhao, Liang Huang
Semantic parsing has made significant progress, but most current semantic parsers are extremely slow (CKY-based) and rather primitive in representation.
Ranked #4 on Semantic Parsing on ATIS