[Adapt] [Seminar] Is Attention Interpretable?
黄姗姗
798508656 at qq.com
Tue Sep 24 23:35:08 CST 2019
Hello Adapters,
Attention mechanisms have recently boosted performance on a range of NLP tasks and sit at the core of models such as BERT, GPT, and XLNet. Attention weights are frequently assumed to be a tool for interpreting a model, but the paper 'Is Attention Interpretable?' finds that high attention does not necessarily correspond to high importance for the model's prediction. I will walk through the probing process and the analysis in detail.
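To give a flavour of the erasure-style probing before the talk, here is a minimal sketch (plain NumPy, with a toy classifier and illustrative names and shapes of my own, not the paper's actual models): zero out the single highest attention weight, renormalize the distribution, and check whether the classifier's decision flips.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy setup: 5 token representations, an attention distribution over
    # them, and a linear "classifier" over the attended summary.
    hidden = rng.normal(size=(5, 8))               # token vectors
    scores = rng.normal(size=5)
    attn = np.exp(scores) / np.exp(scores).sum()   # softmax attention
    W = rng.normal(size=(8, 2))                    # 2-class classifier

    def predict(weights):
        """Class decision from an attention-weighted sum of token vectors."""
        summary = weights @ hidden
        return (summary @ W).argmax()

    base = predict(attn)

    # Erase the single highest-attention token by zeroing its weight and
    # renormalizing, then see whether the decision flips.
    erased = attn.copy()
    erased[attn.argmax()] = 0.0
    erased /= erased.sum()

    print("decision flipped:", predict(erased) != base)

If attention weights faithfully reflected importance, erasing the top-weighted token should flip the decision far more often than erasing a random one; the paper reports that this is frequently not the case.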
Time: Wed 4:30pm
Venue: SEIEE 3-414
See you there!
Best Regards,
Shanshan