Search Results for author: Zhenhao Li

Found 15 papers, 5 papers with code

NL-Augmenter: A Framework for Task-Sensitive Natural Language Augmentation

2 code implementations 6 Dec 2021 Kaustubh D. Dhole, Varun Gangal, Sebastian Gehrmann, Aadesh Gupta, Zhenhao Li, Saad Mahamood, Abinaya Mahendiran, Simon Mille, Ashish Shrivastava, Samson Tan, Tongshuang Wu, Jascha Sohl-Dickstein, Jinho D. Choi, Eduard Hovy, Ondrej Dusek, Sebastian Ruder, Sajant Anand, Nagender Aneja, Rabin Banjade, Lisa Barthe, Hanna Behnke, Ian Berlot-Attwell, Connor Boyle, Caroline Brun, Marco Antonio Sobrevilla Cabezudo, Samuel Cahyawijaya, Emile Chapuis, Wanxiang Che, Mukund Choudhary, Christian Clauss, Pierre Colombo, Filip Cornell, Gautier Dagan, Mayukh Das, Tanay Dixit, Thomas Dopierre, Paul-Alexis Dray, Suchitra Dubey, Tatiana Ekeinhor, Marco Di Giovanni, Tanya Goyal, Rishabh Gupta, Louanes Hamla, Sang Han, Fabrice Harel-Canada, Antoine Honore, Ishan Jindal, Przemyslaw K. Joniak, Denis Kleyko, Venelin Kovatchev, Kalpesh Krishna, Ashutosh Kumar, Stefan Langer, Seungjae Ryan Lee, Corey James Levinson, Hualou Liang, Kaizhao Liang, Zhexiong Liu, Andrey Lukyanenko, Vukosi Marivate, Gerard de Melo, Simon Meoni, Maxime Meyer, Afnan Mir, Nafise Sadat Moosavi, Niklas Muennighoff, Timothy Sum Hon Mun, Kenton Murray, Marcin Namysl, Maria Obedkova, Priti Oli, Nivranshu Pasricha, Jan Pfister, Richard Plant, Vinay Prabhu, Vasile Pais, Libo Qin, Shahab Raji, Pawan Kumar Rajpoot, Vikas Raunak, Roy Rinberg, Nicolas Roberts, Juan Diego Rodriguez, Claude Roux, Vasconcellos P. H. S., Ananya B. Sai, Robin M. Schmidt, Thomas Scialom, Tshephisho Sefara, Saqib N. Shamsi, Xudong Shen, Haoyue Shi, Yiwen Shi, Anna Shvets, Nick Siegel, Damien Sileo, Jamie Simon, Chandan Singh, Roman Sitelew, Priyank Soni, Taylor Sorensen, William Soto, Aman Srivastava, KV Aditya Srivatsa, Tony Sun, Mukund Varma T, A Tabassum, Fiona Anting Tan, Ryan Teehan, Mo Tiwari, Marie Tolkiehn, Athena Wang, Zijian Wang, Gloria Wang, Zijie J. Wang, Fuxuan Wei, Bryan Wilie, Genta Indra Winata, Xinyi Wu, Witold Wydmański, Tianbao Xie, Usama Yaseen, Michael A. Yee, Jing Zhang, Yue Zhang

Data augmentation is an important component in the robustness evaluation of models in natural language processing (NLP) and in enhancing the diversity of the data they are trained on.

Data Augmentation
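
To make the transformation-style augmentation concrete, here is a minimal sketch in Python of a character-noise perturbation of the kind such a framework can host. The class name ButterFingerNoise and its generate method are assumptions for illustration only, not NL-Augmenter's actual API.

import random
from typing import List

class ButterFingerNoise:
    """Illustrative character-noise augmenter: randomly swaps letters for
    neighbouring keyboard keys to simulate typos. A hypothetical sketch in
    the spirit of NL-Augmenter's sentence-level transformations, not the
    library's actual API."""

    NEIGHBOURS = {
        "a": "qws", "e": "wrd", "i": "uok", "o": "ipl", "u": "yhj",
        "s": "adw", "t": "ryg", "n": "bmh", "r": "etf", "l": "kop",
    }

    def __init__(self, prob: float = 0.05, seed: int = 0):
        self.prob = prob  # per-character perturbation probability
        self.rng = random.Random(seed)

    def generate(self, sentence: str) -> List[str]:
        chars = []
        for ch in sentence:
            key = ch.lower()
            if key in self.NEIGHBOURS and self.rng.random() < self.prob:
                chars.append(self.rng.choice(self.NEIGHBOURS[key]))
            else:
                chars.append(ch)
        return ["".join(chars)]

if __name__ == "__main__":
    augmenter = ButterFingerNoise(prob=0.1)
    print(augmenter.generate("Data augmentation improves robustness to noisy input."))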

Visual Cues and Error Correction for Translation Robustness

1 code implementation Findings (EMNLP) 2021 Zhenhao Li, Marek Rei, Lucia Specia

Neural Machine Translation models are sensitive to noise in the input texts, such as misspelled words and ungrammatical constructions.

Machine Translation Translation

Test-Time Personalization with Meta Prompt for Gaze Estimation

1 code implementation 3 Jan 2024 Huan Liu, Julia Qi, Zhenhao Li, Mohammad Hassanpour, Yang Wang, Konstantinos Plataniotis, Yuanhao Yu

Despite recent remarkable achievements in gaze estimation, efficient and accurate personalization of gaze estimation without labels is a practical problem that is rarely touched on in the literature.

Gaze Estimation

Improving Neural Machine Translation Robustness via Data Augmentation: Beyond Back-Translation

1 code implementation WS 2019 Zhenhao Li, Lucia Specia

Neural Machine Translation (NMT) models have proved strong when translating clean texts, but they are very sensitive to noise in the input.

Data Augmentation Domain Adaptation +3

Exploring Model Consensus to Generate Translation Paraphrases

1 code implementation WS 2020 Zhenhao Li, Marina Fomicheva, Lucia Specia

This paper describes our submission to the 2020 Duolingo Shared Task on Simultaneous Translation And Paraphrase for Language Education (STAPLE).

Machine Translation Translation

Video Restoration Against Yin-Yang Phasing

no code implementations ICCV 2015 Xiaolin Wu, Zhenhao Li, Xiaowei Deng

A common video degradation problem, which is largely untreated in the literature, is what we call Yin-Yang Phasing (YYP).

Tone Mapping Video Restoration

Improving Neural Machine Translation Robustness via Data Augmentation: Beyond Back Translation

no code implementations 7 Oct 2019 Zhenhao Li, Lucia Specia

Neural Machine Translation (NMT) models have proved strong when translating clean texts, but they are very sensitive to noise in the input.

Data Augmentation Domain Adaptation +3

Design and Performance Evaluation of Joint Sensing and Communication Integrated System for 5G MmWave Enabled CAVs

no code implementations 24 Aug 2021 Qixun Zhang, Xinna Wang, Zhenhao Li, Zhiqing Wei

Finally, both a simulation and a hardware testbed are designed, and the results show that the proposed CTRA algorithm improves the radar total mutual information by 26%, while the feasibility of the proposed JSCIS is demonstrated with an acceptable radar ranging accuracy within 0.25 m and a stable data rate of 2.8 Gbps in the 28 GHz mmWave frequency band.

Towards a Better Understanding of Noise in Natural Language Processing

no code implementations RANLP 2021 Khetam Al Sharou, Zhenhao Li, Lucia Specia

In this paper, we propose a definition and taxonomy of various types of non-standard textual content – generally referred to as “noise” – in Natural Language Processing (NLP).

Findings of the WMT 2021 Shared Task on Quality Estimation

no code implementations WMT (EMNLP) 2021 Lucia Specia, Frédéric Blain, Marina Fomicheva, Chrysoula Zerva, Zhenhao Li, Vishrav Chaudhary, André F. T. Martins

We report the results of the WMT 2021 shared task on Quality Estimation, where the challenge is to predict the quality of the output of neural machine translation systems at the word and sentence levels.

Machine Translation Sentence +1

ICL’s Submission to the WMT21 Critical Error Detection Shared Task

no code implementations WMT (EMNLP) 2021 Genze Jiang, Zhenhao Li, Lucia Specia

This paper presents Imperial College London’s submissions to the WMT21 Quality Estimation (QE) Shared Task 3: Critical Error Detection.

Feature Engineering

Ensemble Multi-Relational Graph Neural Networks

no code implementations 24 May 2022 Yuling Wang, Hao Xu, Yanhua Yu, Mengdi Zhang, Zhenhao Li, Yuji Yang, Wei Wu

This EMR optimization objective yields an iterative updating rule, which can be formalized as an ensemble message passing (EnMP) layer over multiple relations.
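
As a rough illustration of message passing over multiple relations (not the paper's EMR-derived EnMP layer), the following PyTorch sketch aggregates neighbour features per relation and combines them with learnable weights; the class name MultiRelationMessagePassing and all parameter names are invented for the example.

import torch
import torch.nn as nn

class MultiRelationMessagePassing(nn.Module):
    """Generic sketch: one linear transform per relation; relation-wise
    messages are combined by softmax-normalised learnable weights. An
    illustrative stand-in, not the EnMP layer derived in the paper."""

    def __init__(self, num_relations: int, dim: int):
        super().__init__()
        self.transforms = nn.ModuleList(
            [nn.Linear(dim, dim) for _ in range(num_relations)]
        )
        self.relation_logits = nn.Parameter(torch.zeros(num_relations))

    def forward(self, x, adjs):
        # x: (num_nodes, dim); adjs[r]: (num_nodes, num_nodes) adjacency of relation r.
        weights = torch.softmax(self.relation_logits, dim=0)
        out = torch.zeros_like(x)
        for w, adj, lin in zip(weights, adjs, self.transforms):
            out = out + w * lin(adj @ x)  # aggregate neighbours, then transform
        return torch.relu(out)

if __name__ == "__main__":
    n, d, num_rel = 5, 8, 3
    x = torch.randn(n, d)
    adjs = [torch.eye(n) for _ in range(num_rel)]  # toy adjacency matrices
    layer = MultiRelationMessagePassing(num_rel, d)
    print(layer(x, adjs).shape)  # torch.Size([5, 8])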

Evaluating Large Language Models with Runtime Behavior of Program Execution

no code implementations 25 Mar 2024 Junkai Chen, Zhiyuan Pan, Xing Hu, Zhenhao Li, Ge Li, Xin Xia

Typically, they focus on predicting a program's input and output, ignoring the evaluation of intermediate behavior during program execution as well as logical consistency (e.g., the model should not give the correct output if its prediction of the execution path is wrong) when performing the reasoning.
