Search Results for author: Sathish Reddy Indurthi

Found 12 papers, 1 paper with code

Language Model Augmented Monotonic Attention for Simultaneous Translation

no code implementations NAACL 2022 Sathish Reddy Indurthi, Mohd Abbas Zaidi, Beomseok Lee, Nikhil Kumar Lakumarapu, Sangha Kim

The state-of-the-art adaptive policies for Simultaneous Neural Machine Translation (SNMT) use monotonic attention to perform read/write decisions based on the partial source and target sequences.

Language Modeling Language Modelling +5
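As a rough illustration of what such an adaptive read/write policy looks like, here is a minimal sketch in which a sigmoid-transformed energy score decides whether to READ another source token or WRITE a target token. This is a generic monotonic-attention-style loop, not the paper's language-model-augmented method; `energy_fn`, `read_token`, and `write_token` are hypothetical placeholders.

```python
import torch

def simultaneous_decode(energy_fn, read_token, write_token, src_stream,
                        max_len=50, threshold=0.5):
    """Toy read/write loop in the spirit of monotonic-attention SiMT policies.

    energy_fn(src_prefix, tgt_prefix) -> scalar tensor; higher means the model
    is confident enough to write. All names are illustrative, not the paper's API.
    """
    src_prefix, tgt_prefix = [], []
    while len(tgt_prefix) < max_len:
        p_write = torch.sigmoid(energy_fn(src_prefix, tgt_prefix))
        if p_write < threshold and src_stream:
            # not confident yet and source tokens remain: READ one more source token
            src_prefix.append(read_token(src_stream))
        else:
            # confident (or source exhausted): WRITE the next target token
            tok = write_token(src_prefix, tgt_prefix)
            if tok == "</s>":
                break
            tgt_prefix.append(tok)
    return tgt_prefix
```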

Instructional Segment Embedding: Improving LLM Safety with Instruction Hierarchy

no code implementations 9 Oct 2024 Tong Wu, Shujian Zhang, Kaiqiang Song, Silei Xu, Sanqiang Zhao, Ravi Agrawal, Sathish Reddy Indurthi, Chong Xiang, Prateek Mittal, Wenxuan Zhou

Modern LLM architectures treat all inputs equally, failing to distinguish between and prioritize various types of instructions, such as system messages, user prompts, and data.

Instruction Following
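To make the idea of an instructional segment embedding concrete, the sketch below adds a learned per-segment embedding (system / user / data) to the token embeddings, in the style of BERT's segment embeddings. The segment ids, sizes, and integration point are assumptions for illustration, not the paper's exact design.

```python
import torch
import torch.nn as nn

class SegmentAwareEmbedding(nn.Module):
    """Token embedding plus a learned per-segment embedding.

    Hypothetical segment ids: 0 = system message, 1 = user prompt, 2 = data.
    The paper's actual hierarchy and placement may differ.
    """
    def __init__(self, vocab_size, hidden_size, num_segments=3):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, hidden_size)
        self.seg = nn.Embedding(num_segments, hidden_size)

    def forward(self, input_ids, segment_ids):
        return self.tok(input_ids) + self.seg(segment_ids)

# usage: system-prompt tokens get segment id 0, user turns 1, retrieved data 2
emb = SegmentAwareEmbedding(vocab_size=32000, hidden_size=4096)
ids = torch.tensor([[1, 5, 9, 9, 7]])
segs = torch.tensor([[0, 0, 1, 1, 2]])
h = emb(ids, segs)  # shape (1, 5, 4096)
```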

WPO: Enhancing RLHF with Weighted Preference Optimization

1 code implementation 17 Jun 2024 Wenxuan Zhou, Ravi Agrawal, Shujian Zhang, Sathish Reddy Indurthi, Sanqiang Zhao, Kaiqiang Song, Silei Xu, Chenguang Zhu

This method not only addresses the distributional gap problem but also enhances the optimization process without incurring additional costs.

Instruction Following
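The excerpt does not spell out the weighting mechanism, so the following is only a hedged sketch of one plausible form of weighted preference optimization: a DPO-style loss scaled per example by weights derived from the current policy's (detached) likelihood of each preference pair. The function and its weighting rule are assumptions, not the paper's published formulation.

```python
import torch
import torch.nn.functional as F

def weighted_preference_loss(logp_chosen, logp_rejected,
                             ref_logp_chosen, ref_logp_rejected, beta=0.1):
    """Hedged sketch of a weighted preference loss (not the paper's exact method).

    Inputs are per-example summed log-probs of chosen/rejected responses under
    the policy and a frozen reference model.
    """
    # standard DPO-style margin between chosen and rejected responses
    margin = beta * ((logp_chosen - ref_logp_chosen) -
                     (logp_rejected - ref_logp_rejected))
    per_example = -F.logsigmoid(margin)
    # assumed weighting: emphasize pairs the current policy itself finds likely
    weights = torch.softmax((logp_chosen + logp_rejected).detach(), dim=0)
    return (weights * per_example).sum()
```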

Small energy masking for improved neural network training for end-to-end speech recognition

no code implementations 15 Feb 2020 Chanwoo Kim, Kwangyoun Kim, Sathish Reddy Indurthi

More specifically, a time-frequency bin is masked if the filterbank energy in this bin is less than a certain energy threshold.

Speech Recognition
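The masking rule quoted above is simple enough to state directly in code. The sketch below zeroes out time-frequency bins whose filterbank energy is below a scalar threshold; how the paper chooses that threshold is not covered by the excerpt, so it is passed in as an assumed input.

```python
import numpy as np

def small_energy_mask(filterbank, threshold):
    """Zero out time-frequency bins with filterbank energy below `threshold`.

    `filterbank` is a (time, mel_bins) array of energies. Only the masking
    rule from the excerpt is shown; threshold selection is assumed external.
    """
    mask = (filterbank >= threshold).astype(filterbank.dtype)
    return filterbank * mask

# usage: mask a toy 3-frame, 4-bin energy matrix with an assumed threshold
fbank = np.array([[0.2, 1.5, 0.05, 3.0],
                  [0.8, 0.01, 2.2, 0.4],
                  [0.1, 0.9, 1.1, 0.02]])
masked = small_energy_mask(fbank, threshold=0.1)
```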

Look Harder: A Neural Machine Translation Model with Hard Attention

no code implementations ACL 2019 Sathish Reddy Indurthi, Insoo Chung, Sangha Kim

Soft-attention based Neural Machine Translation (NMT) models have achieved promising results on several translation tasks.

Hard Attention Machine Translation +4
