How Much Does Attention Actually Attend? Questioning the Importance of Attention in Pretrained Transformers

Published in Findings of EMNLP, 2022

Recommended citation: Michael Hassid, Hao Peng, Daniel Rotem, Jungo Kasai, Ivan Montero, Noah A. Smith, and Roy Schwartz. 2022. "How Much Does Attention Actually Attend? Questioning the Importance of Attention in Pretrained Transformers." In Findings of the Association for Computational Linguistics: EMNLP 2022. https://arxiv.org/abs/2211.03495

Direct Link: https://arxiv.org/abs/2211.03495