1 code implementation • 9 May 2022 • Michael Neely, Stefan F. Schouten, Maurits Bleeker, Ana Lucic
The validity of "attention as explanation" has so far been evaluated by computing the rank correlation between attention-based explanations and existing feature attribution explanations using LSTM-based models.
1 code implementation • 7 May 2021 • Michael Neely, Stefan F. Schouten, Maurits J. R. Bleeker, Ana Lucic
By computing the rank correlation between attention weights and feature-additive explanation methods, previous analyses either invalidate or support the role of attention-based explanations as a faithful and plausible measure of salience.
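The comparison both abstracts describe — rank-correlating attention weights with feature-additive attribution scores — can be sketched as follows. All numeric values here are hypothetical placeholders, not outputs of any real model; in an actual study the two score lists would come from a trained model's attention layer and an attribution method such as Integrated Gradients.

```python
from itertools import combinations

def kendall_tau(xs, ys):
    """Kendall's tau-a rank correlation for two score lists without ties."""
    concordant = discordant = 0
    for (x1, y1), (x2, y2) in combinations(zip(xs, ys), 2):
        s = (x1 - x2) * (y1 - y2)  # same sign in both lists => concordant pair
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    return (concordant - discordant) / (concordant + discordant)

# Hypothetical per-token importance scores for a five-token input.
attention = [0.05, 0.40, 0.10, 0.30, 0.15]    # attention weights
attribution = [0.02, 0.35, 0.20, 0.28, 0.15]  # feature-additive scores

print(kendall_tau(attention, attribution))  # → 0.8
```

A high tau would be read as agreement between the two explanation types; the papers above question whether such agreement (or its absence) is a sound basis for validating or invalidating attention as explanation.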
no code implementations • NeurIPS 2018 • Xiaohan Wei, Hao Yu, Qing Ling, Michael Neely
In this paper, we show that by leveraging a local error bound condition on the dual function, the proposed algorithm can achieve a better primal convergence time of $\mathcal{O}\left(\varepsilon^{-2/(2+\beta)}\log_2(\varepsilon^{-1})\right)$, where $\beta\in(0, 1]$ is a local error bound parameter.