[Adapt] [Seminar] XLNet: Generalized Autoregressive Pretraining for Language Understanding

Brandon 2275135452 at qq.com
Wed Sep 11 11:17:21 CST 2019


Hello Adapters,


Pretrained language models have achieved success on many NLP tasks. Today I'm going to introduce the paper "XLNet: Generalized Autoregressive Pretraining for Language Understanding", which proposes a new pretrained language model called XLNet. XLNet outperforms BERT on 20 tasks. I will briefly cover the ideas behind XLNet and compare it with BERT.
Time: Wed 4:30pm
Venue: SEIEE 4-414


See you there!



Best Regards,
Bran

