1 code implementation • 4 Aug 2024 • Qinshuo Liu, Zixin Wang, Xi-An Li, Xinyao Ji, Lei Zhang, Lin Liu, Zhonghua Liu
Semiparametric statistics play a pivotal role in a wide range of domains, including missing data, causal inference, and transfer learning.
1 code implementation • 10 Dec 2021 • Xi-An Li, Zhi-Qin John Xu, Lei Zhang
Numerical results show that the SD$^2$NN model is superior to existing models such as MscaleDNN.
no code implementations • 30 Sep 2020 • Xi-An Li, Zhi-Qin John Xu, Lei Zhang
Algorithms based on deep neural networks (DNNs) have attracted increasing attention from the scientific computing community.
Computational Physics • Analysis of PDEs
1 code implementation • NeurIPS 2020 • Xi-An Li, Asa Cooper Stickland, Yuqing Tang, Xiang Kong
As an extension of this framework, we propose a novel method to train one shared Transformer network for multilingual machine translation with different layer selection posteriors for each language pair.
1 code implementation • 28 Sep 2020 • Xi-An Li, Lei Zhang, Li-Yan Wang, Jian Lu
The proposed MRFGAT architecture is tested on ModelNet10 and ModelNet40 datasets, and results show it achieves state-of-the-art performance in shape classification tasks.
6 code implementations • 2 Aug 2020 • Yuqing Tang, Chau Tran, Xi-An Li, Peng-Jen Chen, Naman Goyal, Vishrav Chaudhary, Jiatao Gu, Angela Fan
Recent work demonstrates the potential of multilingual pretraining to create one model that can be used for various tasks in different languages.
no code implementations • ACL 2020 • Arya D. McCarthy, Xi-An Li, Jiatao Gu, Ning Dong
This paper proposes a simple and effective approach to address the problem of posterior collapse in conditional variational autoencoders (CVAEs).
no code implementations • WS 2020 • Kenneth Heafield, Hiroaki Hayashi, Yusuke Oda, Ioannis Konstas, Andrew Finch, Graham Neubig, Xi-An Li, Alexandra Birch
We describe the findings of the Fourth Workshop on Neural Generation and Translation, held in concert with the annual conference of the Association for Computational Linguistics (ACL 2020).
no code implementations • 24 Jun 2020 • Xin Luna Dong, Xiang He, Andrey Kan, Xi-An Li, Yan Liang, Jun Ma, Yifan Ethan Xu, Chenwei Zhang, Tong Zhao, Gabriel Blanco Saldana, Saurabh Deshpande, Alexandre Michetti Manduca, Jay Ren, Surender Pal Singh, Fan Xiao, Haw-Shiuan Chang, Giannis Karamanolakis, Yuning Mao, Yaqing Wang, Christos Faloutsos, Andrew McCallum, Jiawei Han
Can one build a knowledge graph (KG) for all products in the world?
1 code implementation • NeurIPS 2020 • Chau Tran, Yuqing Tang, Xi-An Li, Jiatao Gu
Recent studies have demonstrated the cross-lingual alignment ability of multilingual pretrained language models.
no code implementations • 15 Jun 2020 • Yaqing Wang, Yifan Ethan Xu, Xi-An Li, Xin Luna Dong, Jing Gao
We formalize the problem of validating the textual attribute values of products from a variety of categories as a natural language inference task in the few-shot learning setting, and propose a meta-learning latent variable model to jointly process the signals obtained from product profiles and textual attribute values.
no code implementations • EACL 2021 • Asa Cooper Stickland, Xi-An Li, Marjan Ghazvininejad
For BART we get the best performance by freezing most of the model parameters, and adding extra positional embeddings.
8 code implementations • 22 Jan 2020 • Yinhan Liu, Jiatao Gu, Naman Goyal, Xi-An Li, Sergey Edunov, Marjan Ghazvininejad, Mike Lewis, Luke Zettlemoyer
This paper demonstrates that multilingual denoising pre-training produces significant performance gains across a wide variety of machine translation (MT) tasks.
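The core of denoising pre-training is corrupting input text and training the model to reconstruct the original. As a toy illustration only (the actual mBART noising combines configurable span masking and sentence permutation; `span_mask` and its parameters here are invented for this sketch):

```python
import random

def span_mask(tokens, mask_token="<mask>", mask_ratio=0.35, seed=0):
    """Toy span-masking noise in the spirit of denoising pre-training:
    replace one contiguous span (~mask_ratio of the tokens) with a single
    mask token; the model is trained to reconstruct the original sequence."""
    rng = random.Random(seed)
    span_len = max(1, int(len(tokens) * mask_ratio))
    start = rng.randrange(0, len(tokens) - span_len + 1)
    return tokens[:start] + [mask_token] + tokens[start + span_len:]

noisy = span_mask(["the", "cat", "sat", "on", "the", "mat"])
```

The reconstruction objective needs no parallel data, which is what makes the same pre-trained model reusable across many translation directions.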
no code implementations • 19 Sep 2019 • Arya D. McCarthy, Xi-An Li, Jiatao Gu, Ning Dong
Posterior collapse plagues VAEs for text, especially for conditional text generation with strong autoregressive decoders.
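Posterior collapse can be seen directly in the KL term of the VAE objective: when a strong autoregressive decoder models the text well on its own, the approximate posterior drifts to the prior and the KL term vanishes, so the latent carries no information. A minimal sketch of that diagnostic, assuming a diagonal-Gaussian posterior against a standard-normal prior (function name is ours):

```python
import math

def gaussian_kl(mu, logvar):
    """KL( N(mu, diag(sigma^2)) || N(0, I) ) summed over dimensions.
    A collapsed posterior (mu -> 0, logvar -> 0) drives this to zero."""
    return sum(0.5 * (math.exp(lv) + m * m - 1.0 - lv)
               for m, lv in zip(mu, logvar))

# Collapsed posterior contributes no information:
print(gaussian_kl([0.0, 0.0], [0.0, 0.0]))   # 0.0
# Informative posterior has strictly positive KL:
print(gaussian_kl([1.0, -1.0], [0.0, 0.0]))  # 1.0
```

Monitoring this quantity per dimension during training is a standard way to detect collapse before generation quality is affected.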
2 code implementations • IJCNLP 2019 • Xuezhe Ma, Chunting Zhou, Xi-An Li, Graham Neubig, Eduard Hovy
Most sequence-to-sequence (seq2seq) models are autoregressive; they generate each token by conditioning on previously generated tokens.
Ranked #3 on Machine Translation on WMT2016 English-Romanian
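The autoregressive property described above is easy to see in code: each step conditions on the full prefix generated so far, forcing decoding to be sequential. A minimal sketch (names and the dummy next-token function are hypothetical, not the paper's model):

```python
def greedy_decode(next_token_fn, bos="<s>", eos="</s>", max_len=10):
    """Minimal autoregressive loop: each token is chosen by conditioning
    on everything generated so far, which is exactly the sequential
    dependency that non-autoregressive models try to remove."""
    out = [bos]
    for _ in range(max_len):
        tok = next_token_fn(out)  # conditions on the whole prefix
        out.append(tok)
        if tok == eos:
            break
    return out

# Hypothetical next-token function: emit a fixed phrase, then stop.
phrase = ["hello", "world", "</s>"]
fn = lambda prefix: phrase[len(prefix) - 1]
result = greedy_decode(fn)  # ["<s>", "hello", "world", "</s>"]
```

A non-autoregressive model replaces this loop with a single parallel prediction over all positions, trading some accuracy for decoding speed.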
no code implementations • 23 Jul 2019 • Junyang Gao, Xi-An Li, Yifan Ethan Xu, Bunyamin Sisman, Xin Luna Dong, Jun Yang
To address the problem, this paper proposes an efficient sampling and evaluation framework, which aims to provide quality accuracy evaluation with strong statistical guarantees while minimizing human effort.
Databases
1 code implementation • WS 2019 • Xi-An Li, Paul Michel, Antonios Anastasopoulos, Yonatan Belinkov, Nadir Durrani, Orhan Firat, Philipp Koehn, Graham Neubig, Juan Pino, Hassan Sajjad
We share the findings of the first shared task on improving robustness of Machine Translation (MT).
no code implementations • ICLR 2019 • Paul Michel, Graham Neubig, Xi-An Li, Juan Miguel Pino
Adversarial examples have been shown to be an effective way of assessing the robustness of neural sequence-to-sequence (seq2seq) models, by applying perturbations to a model's input that lead to large degradation in performance.
1 code implementation • NAACL 2019 • Paul Michel, Xi-An Li, Graham Neubig, Juan Miguel Pino
Adversarial examples (perturbations to the input of a model that elicit large changes in the output) have been shown to be an effective way of assessing the robustness of sequence-to-sequence (seq2seq) models.
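The evaluation recipe is: perturb the source, translate both versions, and measure how much the output changes. A crude sketch of that idea, assuming a token-overlap proxy in place of the MT metrics the papers actually use (all names here are invented for illustration):

```python
def char_swap(text, i):
    """Hypothetical toy perturbation: swap adjacent characters at position i."""
    chars = list(text)
    chars[i], chars[i + 1] = chars[i + 1], chars[i]
    return "".join(chars)

def output_overlap(model, src, perturbed):
    """Fraction of output tokens preserved under perturbation -- a crude
    stand-in for the degradation measured with proper MT metrics."""
    ref, hyp = model(src).split(), model(perturbed).split()
    if not ref:
        return 1.0
    return sum(1 for t in hyp if t in ref) / max(len(ref), len(hyp))

# Hypothetical brittle "model": word-level dictionary lookup.
vocab = {"good": "BUENO", "day": "DIA"}
model = lambda s: " ".join(vocab.get(w, w) for w in s.split())
src = "good day"
score = output_overlap(model, src, char_swap(src, 0))  # drops below 1.0
```

A robust model keeps this score close to 1.0 under meaning-preserving perturbations; a large drop signals brittleness.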
2 code implementations • LREC 2018 • Holger Schwenk, Xi-An Li
In addition, we have observed that the class prior distributions differ significantly between the languages.
no code implementations • IJCNLP 2017 • Xi-An Li, Peng Wang, Suixue Wang, Guanyu Jiang, Tianyuan You
Grammatical error diagnosis is an important task in natural language processing.
3 code implementations • ICML 2017 • Sercan O. Arik, Mike Chrzanowski, Adam Coates, Gregory Diamos, Andrew Gibiansky, Yongguo Kang, Xi-An Li, John Miller, Andrew Ng, Jonathan Raiman, Shubho Sengupta, Mohammad Shoeybi
We present Deep Voice, a production-quality text-to-speech system constructed entirely from deep neural networks.