Search Results for author: Daniel Rotem

Found 2 papers, 1 paper with code

How Much Does Attention Actually Attend? Questioning the Importance of Attention in Pretrained Transformers

1 code implementation • 7 Nov 2022 • Michael Hassid, Hao Peng, Daniel Rotem, Jungo Kasai, Ivan Montero, Noah A. Smith, Roy Schwartz

Our results motivate research on simpler alternatives to input-dependent attention, as well as on methods for better utilization of this mechanism in the Transformer architecture.
