Search Results for author: Rigel Swavely

Found 3 papers, 3 papers with code

N-Grammer: Augmenting Transformers with latent n-grams

2 code implementations • 13 Jul 2022 • Aurko Roy, Rohan Anil, Guangda Lai, Benjamin Lee, Jeffrey Zhao, Shuyuan Zhang, Shibo Wang, Ye Zhang, Shen Wu, Rigel Swavely, Tao Yu, Phuong Dao, Christopher Fifty, Zhifeng Chen, Yonghui Wu

Transformer models have recently emerged as one of the foundational models in natural language processing, and as a byproduct, there is significant recent interest and investment in scaling these models.

Common Sense Reasoning • Coreference Resolution • +5

Sequence-to-Sequence Piano Transcription with Transformers

2 code implementations • 19 Jul 2021 • Curtis Hawthorne, Ian Simon, Rigel Swavely, Ethan Manilow, Jesse Engel

Automatic Music Transcription has seen significant progress in recent years by training custom deep neural networks on large datasets.

Information Retrieval • Music Information Retrieval • +2
