Search Results for author: Hany Hassan

Found 20 papers, 5 papers with code

Building Multilingual Machine Translation Systems That Serve Arbitrary XY Translations

no code implementations · NAACL 2022 · Akiko Eriguchi, Shufang Xie, Tao Qin, Hany Hassan

Multilingual Neural Machine Translation (MNMT) enables one system to translate sentences from multiple source languages to multiple target languages, greatly reducing deployment costs compared with conventional bilingual systems.

Machine Translation · Translation

Language Tokens: Simply Improving Zero-Shot Multi-Aligned Translation in Encoder-Decoder Models

no code implementations · AMTA 2022 · Muhammad N. ElNokrashy, Amr Hendy, Mohamed Maher, Mohamed Afify, Hany Hassan

In a WMT-based setting, we see improvements of 1.3 and 0.4 BLEU points in the zero-shot setting and when training with direct data, respectively, while from-English performance improves by 4.17 and 0.85 BLEU points.

Decoder · Translation
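The language-token idea described in the two entries above can be sketched minimally: a single encoder-decoder model serves arbitrary translation directions by tagging each source sentence with source- and target-language tokens. The token format below is illustrative only, not the papers' exact scheme.

```python
def tag_source(sentence: str, src_lang: str, tgt_lang: str) -> str:
    """Prepend language tokens so one multilingual model can be steered
    toward any source/target pair (token format is hypothetical)."""
    return f"<src:{src_lang}> <tgt:{tgt_lang}> {sentence}"

# The same input format covers arbitrary X-Y directions:
print(tag_source("Hello world", "en", "ar"))  # <src:en> <tgt:ar> Hello world
```

At inference time, zero-shot directions are requested simply by combining token pairs never seen together during training.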

Self-Exploring Language Models: Active Preference Elicitation for Online Alignment

1 code implementation · 29 May 2024 · Shenao Zhang, Donghan Yu, Hiteshi Sharma, ZiYi Yang, Shuohang Wang, Hany Hassan, Zhaoran Wang

Preference optimization, particularly through Reinforcement Learning from Human Feedback (RLHF), has achieved significant success in aligning Large Language Models (LLMs) to adhere to human intentions.

Instruction Following

Deliberate then Generate: Enhanced Prompting Framework for Text Generation

no code implementations · 31 May 2023 · Bei Li, Rui Wang, Junliang Guo, Kaitao Song, Xu Tan, Hany Hassan, Arul Menezes, Tong Xiao, Jiang Bian, Jingbo Zhu

Large language models (LLMs) have shown remarkable success across a wide range of natural language generation tasks, where proper prompt designs make great impacts.

Text Generation

Taming Sparsely Activated Transformer with Stochastic Experts

1 code implementation · ICLR 2022 · Simiao Zuo, Xiaodong Liu, Jian Jiao, Young Jin Kim, Hany Hassan, Ruofei Zhang, Tuo Zhao, Jianfeng Gao

While most ongoing research focuses on improving SAMs by exploring methods of routing inputs to experts, our analysis reveals that such research might not lead to the solution we expect, i.e., the commonly used routing methods based on gating mechanisms do not work better than randomly routing inputs to experts.

Machine Translation · Translation
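The abstract's claim, that learned gating does not beat random assignment, implies a routing layer can be as simple as the following toy sketch. The "experts" here are stand-in functions, not the paper's implementation.

```python
import random

def expert_ffn(x, scale):
    # Toy "expert": stands in for one feed-forward sub-network
    # of a sparsely activated Transformer layer.
    return [scale * v for v in x]

def stochastic_route(x, expert_params, rng):
    # Stochastic routing: pick one expert uniformly at random
    # instead of through a learned gating network.
    chosen = rng.choice(expert_params)
    return expert_ffn(x, chosen)

rng = random.Random(0)                       # seeded for reproducibility
out = stochastic_route([1.0, 2.0], [0.5, 1.0, 2.0], rng)
```

Because no gate is trained, routing adds no parameters; each forward pass simply samples one expert per layer.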

Meta-Learning for Few-Shot NMT Adaptation

no code implementations · WS 2020 · Amr Sharaf, Hany Hassan, Hal Daumé III

We frame the adaptation of NMT systems as a meta-learning problem, where we learn to adapt to new unseen domains based on simulated offline meta-training domain adaptation tasks.

Domain Adaptation · Machine Translation · +3
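The meta-learning framing above can be illustrated with a first-order MAML-style update on a toy one-parameter model. The simulated "domains" are just small (x, y) datasets, and all names and learning rates here are illustrative, not the paper's setup.

```python
def loss(theta, data):
    # Mean squared error of a one-parameter linear model y ~ theta * x.
    return sum((theta * x - y) ** 2 for x, y in data) / len(data)

def grad(theta, data):
    # d(loss)/d(theta) for the model above.
    return sum(2 * x * (theta * x - y) for x, y in data) / len(data)

def maml_step(theta, tasks, inner_lr=0.1, outer_lr=0.05):
    # One meta-training step: adapt to each simulated domain with a single
    # inner gradient step, then move the shared initialization using the
    # post-adaptation gradients (first-order approximation).
    meta_grad = 0.0
    for data in tasks:
        adapted = theta - inner_lr * grad(theta, data)
        meta_grad += grad(adapted, data)
    return theta - outer_lr * meta_grad / len(tasks)

# Two simulated "domains" with different input-output mappings:
tasks = [[(1.0, 2.0), (2.0, 4.0)], [(1.0, 3.0)]]
theta = maml_step(0.0, tasks)
```

The learned initialization is then fine-tuned on a handful of examples from the real unseen domain, mirroring the offline meta-training / few-shot adaptation split described in the abstract.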

From Research to Production and Back: Ludicrously Fast Neural Machine Translation

no code implementations · WS 2019 · Young Jin Kim, Marcin Junczys-Dowmunt, Hany Hassan, Alham Fikri Aji, Kenneth Heafield, Roman Grundkiewicz, Nikolay Bogoychev

Taking our dominating submissions to the previous edition of the shared task as a starting point, we develop improved teacher-student training via multi-agent dual-learning and noisy backward-forward translation for Transformer-based student models.

C++ code · Decoder · +2

Selecting, Planning, and Rewriting: A Modular Approach for Data-to-Document Generation and Translation

no code implementations · WS 2019 · Lesly Miculicich, Marc Marone, Hany Hassan

In this paper, we report our system submissions to all 6 tracks of the WNGT 2019 shared task on Document-Level Generation and Translation.

Language Modelling · Translation

Multi-Source Cross-Lingual Model Transfer: Learning What to Share

1 code implementation · ACL 2019 · Xilun Chen, Ahmed Hassan Awadallah, Hany Hassan, Wei Wang, Claire Cardie

In this work, we focus on the multilingual transfer setting where training data in multiple source languages is leveraged to further boost target language performance.

Cross-Lingual NER · text-classification · +2

Zero-Resource Multilingual Model Transfer: Learning What to Share

no code implementations · 27 Sep 2018 · Xilun Chen, Ahmed Hassan Awadallah, Hany Hassan, Wei Wang, Claire Cardie

In this work, we propose a zero-resource multilingual transfer learning model that can utilize training data in multiple source languages, while not requiring target language training data nor cross-lingual supervision.

Cross-Lingual Transfer · text-classification · +2

Gender Aware Spoken Language Translation Applied to English-Arabic

no code implementations · 26 Feb 2018 · Mostafa Elaraby, Ahmed Y. Tawfik, Mahmoud Khaled, Hany Hassan, Aly Osama

One of the challenges of SLT is translating from a language without gender agreement into one with gender agreement, such as from English to Arabic.

Machine Translation · NMT · +1

Universal Neural Machine Translation for Extremely Low Resource Languages

no code implementations · NAACL 2018 · Jiatao Gu, Hany Hassan, Jacob Devlin, Victor O. K. Li

Our proposed approach utilizes transfer learning to share lexical and sentence-level representations across multiple source languages into one target language.

Machine Translation · Sentence · +2

Synthetic Data for Neural Machine Translation of Spoken-Dialects

no code implementations · IWSLT 2017 · Hany Hassan, Mostafa ElAraby, Ahmed Tawfik

Our approach is language independent and can be used to generate data for any variant of the source language such as slang or spoken dialect or even for a different language that is closely related to the source language.

Machine Translation · Translation
