11. Attention is not Explanation (2019)

Abstract

Attention is widely used in NLP models today, and attention weights are often presented as explanations of model behavior.

But it is unclear what relationship actually holds between (1) attention weights and (2) model outputs.

The authors perform extensive experiments across a variety of NLP tasks.

\(\rightarrow\) Attention weights do not provide meaningful explanations of model predictions.
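
As a rough illustration of the kind of counterfactual test the paper runs, here is a minimal NumPy sketch: permute the attention weights of a toy attention model and measure the total variation distance (TVD) between the original and perturbed output distributions. All shapes, parameters, and the model itself are hypothetical stand-ins, not taken from the paper; the point is only the procedure (small TVDs under permuted attention would suggest the weights are not faithful explanations).

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical setup: T encoder hidden states (d-dim each), a scoring
# vector for attention, and an output layer over n_classes.
T, d, n_classes = 10, 16, 2
H = rng.normal(size=(T, d))              # encoder states for one input
w_score = rng.normal(size=d)             # attention scoring parameters
W_out = rng.normal(size=(d, n_classes))  # output layer

def predict(alpha):
    """Output distribution given attention weights alpha."""
    context = alpha @ H                  # attention-weighted sum of states
    return softmax(context @ W_out)

alpha = softmax(H @ w_score)             # original attention distribution
y = predict(alpha)

# Counterfactual experiment: shuffle the attention weights and check
# how much the output distribution actually moves.
for _ in range(3):
    alpha_perm = rng.permutation(alpha)
    y_perm = predict(alpha_perm)
    tvd = 0.5 * np.abs(y - y_perm).sum()
    print(f"TVD between original and permuted output: {tvd:.4f}")
```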