Search Results for author: Matthew Raffel

Found 3 papers, 3 papers with code

Simul-LLM: A Framework for Exploring High-Quality Simultaneous Translation with Large Language Models

1 code implementation • 7 Dec 2023 • Victor Agostinelli, Max Wild, Matthew Raffel, Kazi Ahmed Asif Fuad, Lizhong Chen

Large language models (LLMs) with billions of parameters, pretrained on massive amounts of data, are now capable of performance near or better than the state of the art on a variety of downstream natural language processing tasks.

Tasks: Machine Translation, NMT, +1

Implicit Memory Transformer for Computationally Efficient Simultaneous Speech Translation

1 code implementation • 3 Jul 2023 • Matthew Raffel, Lizhong Chen

Experiments on the MuST-C dataset show that the Implicit Memory Transformer provides a substantial speedup on the encoder forward pass with nearly identical translation quality when compared with the state-of-the-art approach that employs both left context and memory banks.

Tasks: Translation

Shiftable Context: Addressing Training-Inference Context Mismatch in Simultaneous Speech Translation

1 code implementation • 3 Jul 2023 • Matthew Raffel, Drew Penney, Lizhong Chen

Transformer models using segment-based processing have been an effective architecture for simultaneous speech translation.

Tasks: Translation
